So, I got myself a new laptop.
The main reason was that I wanted a more powerful and – most importantly – slightly bigger portable computer. Up until now I used a cute 11.6″ machine that claimed to be a gaming laptop but worked pretty well as an all-around development kit. The various trials and tribulations I had to overcome to make Ubuntu work reasonably well on this thing (it’s officially “not supported”) significantly increased my skills in tweaking Linux. And sometimes things worked so well that I actually managed to accomplish some work!
Nevertheless, the small size started to irk me quite a bit; the (small) additional mobility just wasn’t worth it. So this time I went for something just slightly bigger, but with a lot more pixels.
And so, I got a 13″ Retina display MacBook Pro. Admittedly, I was a bit reluctant to be a semi-early adopter here, because the way increased resolution works on these screens is a bit confusing. I mean, it’s apparently very natural to almost anyone: things look nicer, end of story. However, for someone who remembers how back in the day the difference between, say, 800×600 and 1024×768 made such a huge impact on UI scaling, the Retina’s quadrupling of pixel count may sound pretty scary.
Just recall that the standard width for many website layouts is still around 960px, which translates to a little more than 1/3 (!) of the Retina display’s width. Does that mean the web comes with big slabs of wasted whitespace and a tiny column of content in between?…
Not really, as it turns out. By default, Retina cheats: the real (millimeter) size of UI elements is still roughly the same as on normal 13″ display running something around 1280×800. For typical GUI applications involving standard components and some text rendering, it’s indeed just making the interface sharper and more vivid. For pixel-perfect apps (such as games with set resolution), it seems the default solution is to stretch them proportionately; things might not look as nice then but they still work well.
Where the Retina display really shines is any serious text “processing”, be it reading websites, writing articles or – of course – programming. The additional level of detail might not be noticeable at first, but the difference becomes apparent when you look again at a screen with lower pixel density. There’s still some way to go before even the smallest noticeable details can be fluidly presented to the sharpest of eyes, but it’s a pretty short way.
I just shudder to think what resolution would be needed to replicate the same sensation on a 27″ or 30″ monitor :)
What about the operating system, though, the glorified OS X?
Besides handling that precious little screen very well – which cannot be said of some other systems – I don’t actually have much to say about it. With the rampant scavenging of UX concepts that goes back-and-forth between today’s platforms, the differences in look & feel of their graphical interfaces are mostly superficial. Whatever it is in the upper-right corner of your desktop – be it half-bitten apple, a rotated square or circle with dots – is unlikely to dictate the shape of your UI experience.
…Once you move the Dock to its proper position on the side, that is.
Under the hood, OS X is just a *nix – some say even more POSIX-y than what Linux currently is. This makes it a viable native choice for most developers, while the rest (i.e. those working with Microsoft products) can be accommodated via outstanding virtualization options. But all this goodness doesn’t come without a few caveats.
Probably the biggest one is a horrendous functionality gap: the lack of a built-in package manager and installer. Life without apt-get really sucks, and the bottom-up effort that coalesced into Homebrew cannot fully make up for it. I was especially appalled when I had to revert to the old google-download-unpack method for installing new programs. Amazingly, the Mac App Store is still mostly useless some two years after its inception.
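For the record, the day-to-day difference is smaller than it sounds once Homebrew is set up – the commands map over almost one-to-one (package names below are just examples):

```shell
# Debian/Ubuntu: the package manager ships with the OS
apt-get update            # refresh package lists
apt-get install git       # fetch and install in one step

# OS X with Homebrew: same workflow, but the tool itself
# is a community add-on you have to install first
brew update               # refresh Homebrew's formulae
brew install git          # build/fetch and install
```

The pain point is everything Homebrew doesn’t cover: GUI applications still mostly come as downloaded .dmg images you drag into /Applications by hand.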
Although I’m readily pointing out various quirks of the OS X platform here, I must say I’m not particularly concerned with them in the long term. I do not intend the Mac to become my primary system of choice, especially for development purposes. Its goal is to serve as a handy portable computer, while simultaneously providing access to the third important platform to address any testing needs.
But that’s all aside of the most important perk: finally being able to visit those trendy coffee shops, of course! ;-)
I think you can try the Fink project (a port of apt-get) instead of apt-get. Fink has over 13k packages, though that is still little in comparison with apt-get.
I thought about using OS X as my primary OS, but your post raised some doubts.
Glad that I could provide some food for thought :)
I only heard about Fink when I had already installed quite a few packages from Homebrew, so I don’t feel like switching. Homebrew seems to come up on the net more often, which gives me more confidence that if I encounter some problems, there will be a community out there to help me fix them.
This aside, I think using OS X as a primary OS is perfectly possible given how ridiculously compatible with everything it is. The fact that I could theoretically have MS Office Word open next to a Unix terminal while waiting for a Steam game to download (coming to Linux, I know) is quite impressive.
By definition you won’t hack Linux kernel modules here, but many things above that level are very doable. For web development any *nix will do, broadly speaking. And for mobile, this is probably the only desktop OS that lets you (non-hackily, at least) target both major ecosystems.