In this post I am going to talk about what I feel are the pros and cons of always having the best of the best versus sticking with the old hardware that you know and love. If you have an opinion one way or the other, please let me know on Twitter, or join the comment threads on Reddit and Hacker News. If you enjoy this post, I would appreciate a quick upvote or a share.
Do you believe that in order to produce great work in large quantities you need great kit?
This isn’t a leading question; I am genuinely interested, and it is something I go back and forth on fairly regularly.
One side of me thinks that in order to be as efficient as you can be, you need the fastest machine you can afford; there is nothing worse than waiting for an application or a process to open when you are in the flow of something. This is at least in part backed up by the likes of Joel Spolsky when he wrote the Joel Test (see point #9).
The other side of me thinks that the bottlenecks to productivity are only superficially to be found on your machine; people *did* get stuff done before the MacBook Pro came out, so what is my excuse? This is similar to a point made in a recent Back to Work podcast episode, where Merlin Mann said that if you are a writer then in theory you should be able to do at least part of your work with a pen and a bit of paper.
I can see the benefits of both, and I suppose what you probably want is the mindset of the latter with the equipment of the former, but there are drawbacks to following either road.
If you are happy coding on a 5 year old netbook that is running 2GB of RAM and rocking a sweet 60GB SATA drive, then you are probably well versed in its quirks, and you have saved a load of money compared to someone who upgrades their laptop every year. That is a good thing. But if you had splashed out on a better machine 3 years ago that shaved 3 minutes off your day, every day, over those three years it would have saved you over 50 potentially billable hours. That is a lot of money you could be leaving on the table.
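The back-of-the-envelope maths above checks out; here is a quick sketch you can run in a terminal to see where the "over 50 hours" figure comes from (assuming, for simplicity, that you are at the machine all 365 days a year):

```shell
#!/bin/sh
# Minutes saved per day by the faster machine (the figure from the post).
minutes_per_day=3
# Days across three years, assuming you use the machine every day.
days=$((365 * 3))
total_minutes=$((minutes_per_day * days))
# Integer division is fine for a rough estimate.
total_hours=$((total_minutes / 60))
echo "${total_hours} potentially billable hours saved"
```

Even counting only working days, the total still comfortably clears 30 hours, so the point stands either way.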
Conversely, if you are always throwing RAM at the problem and have been playing with SSDs since they were released, then you haven’t wasted those 3 minutes a day waiting on something chugging along. But you have maybe failed to appreciate what you have, and perhaps you haven’t thought about smarter ways of working because you have never had that 9 second wait while something reloads.
One part of me wants to resolve this by saying that even the fact that I am aware of the advantages afforded to me by the latest kit means I have an appreciation for what it would be like to use sub-optimal gear. But I think there is a massive difference between hypothesising about something like this and actually living and working with old stuff.
Then of course there is the cloud: why would we need good hardware when we do everything in the cloud these days!
Thanks to Ryan McDonough for creating this image when I couldn’t find the original. Find him on http://twitter.com/ryanmcdonough
Funny picture aside, it is true: there are many things we used to think we needed to process on our own hardware that we now do on someone else’s over the internet. I think the pinnacle of this so far is something like the Chromebook, which doesn’t let you easily code directly on it, but with a plethora of web services that allow you to code through them, that isn’t really an issue.
The next question, then, is: do web developers need high bandwidth? For me there is nothing to think about. Bigger is better, and I think the faster your internet connection, the more productive you can be in this day and age.
We download and upload large files more than we probably realise, and with the number of events and tutorials that are either live streamed or offered as high quality downloads, the bigger your pipe the quicker you can get access to this content. The only potential upside to a slow internet connection is that it would really give you an appreciation that your site needs to be performant when forced through narrow pipes, but we have plenty of tools that let us check that without actually limiting ourselves.
Side note: if you do want to throttle your bandwidth for any reason, I made a quick script that will work on *nix machines and Mac OS X.
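For a flavour of what that kind of throttling looks like, here is a hypothetical sketch using `tc` on Linux (this is not my script, just an illustration, and it assumes an interface named `eth0`; you will need root):

```shell
#!/bin/sh
# Cap the outbound rate on eth0 to roughly 256 kbit/s using a token
# bucket filter (hypothetical values; tune rate/burst/latency to taste).
sudo tc qdisc add dev eth0 root tbf rate 256kbit burst 32kbit latency 400ms

# ...browse your site at narrow-pipe speeds, then remove the limit:
sudo tc qdisc del dev eth0 root
```

On Mac OS X of this era the equivalent trick is usually done with `ipfw` pipes rather than `tc`.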
One best-of-both-worlds approach would be to have the best gear money can buy but be ultra strict about spotting failings in your current workflow: any time you have to type something over and over again, or any time you need to complete the same series of steps, log it and attempt to eliminate it.
An example of this occurred to me just this morning. I use the terminal a lot in web development, and I am working on one main project right now which is located in my Sites directory. Every morning I would turn on my machine, open the terminal, and type cd ~/Sites/ExamTime. It is only a couple of keystrokes, but I have better things to be doing than moving into a directory every morning (or every time I close a terminal window by mistake). A quick visit to the settings panel later, and my terminal does those keystrokes for me; I will never need to worry about that again.
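If you would rather not dig through Terminal’s settings panel, the same effect can be had from the shell itself; a minimal sketch, assuming bash and using the project path from above as an example:

```shell
#!/bin/sh
# Make every new shell start in the project directory by appending a
# cd to the profile (assumes bash; ~/Sites/ExamTime is just my example).
echo 'cd ~/Sites/ExamTime' >> ~/.bash_profile

# Or keep the default start directory and define a two-letter shortcut:
echo 'alias et="cd ~/Sites/ExamTime"' >> ~/.bash_profile
```

The alias version is less intrusive if you work on more than one project, since it doesn’t hijack every new terminal window.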
To wrap up my admittedly rather loose thinking on the subject, I suppose I should share my setup. Before I do, I would love to hear your thoughts on the subject: Twitter, Reddit, Hacker News.
I hack on an early 2011 MacBook Pro with 16GB of RAM and an i5 processor; it has a 255GB SSD and is running Mountain Lion.
I write on a 2012 Samsung Chromebook with 2GB of RAM and a 16GB SSD; it is on the dev channel of ChromeOS.
I have no plans to upgrade either of these machines.