an argument for old hardware

Nov 15, 2020

Throughout a lot of this writing, you can substitute "bad" for old. I use quotation marks around "bad" because hardware isn't inherently bad or anything; it's simply often slower relative to newer hardware. Knowing this, I offer a forewarning: in pursuit of better programming skills, don't downgrade if you can avoid it. Spending money to acquire older hardware is, I feel, a situational choice that's most often rewarded by specific projects, not everyday ones. The stance I take on hardware consumerism is that you should try to use your hardware until it is simply unusable. With that said, I'll try to expand on these ideas a bit further.

Many times I find myself gawking at new hardware releases, with awe and wonder at the new "efficiencies" they offer. So much so that at any time of year I could probably quote a reasonably accurate figure for the average percentage improvement the latest generation brought over the previous one. This isn't an entirely bad thing for me; I work across several disciplines in the Information Technology world, so I may occasionally need to bring newer hardware to clients. Though that reality is diminishing rapidly as cloud hosting services become more cost-effective. With minor exceptions, I find a lot of focus has been put on consistently upgrading hardware. The interval seems to have shrunk to around two years, or roughly two cycles of hardware releases. While this isn't nearly as bad as related technology, like smartphones, it is becoming eerily similar. The improvements each release are often touted as "revolutionary" or "game changing", yet in real-world usage they are usually just iterative. I often find myself thinking, "well, it would make this thing faster" or "it might save me time in that area". Though the cost-to-time benefit is almost never as great as the monkey side of my brain makes it out to be.

A lot of this only works up to a certain extent, right? There are always exceptions to most things in life; the main point here is more of an "if it ain't broke, don't replace it" sort of thing. Another thing to consider, especially if you're still new to programming, is that worse hardware can force you to be more responsible with what you write. I remember the nights spent frustrated with my code because it was "slow", on my dual core Phenom II 955 with 2GB of RAM and an integrated Radeon HD 4250, which took a sizable chunk of that RAM as well. Back then I didn't write in C, or any relatively plain languages either. I was focused on learning Java and using the good ol' IDE, Eclipse. Neither of these lends itself to super fast usage, even with somewhat older versions. I'm not claiming to be a great programmer because once upon a time I had bad hardware, but I do believe the experience has an inherent value for future endeavours.

To sum it all up with a nice little bow tie: think about what hardware you have now, what software you pair it with, and how you can squeeze more juice out of it. I know of a few friends running 9-year-old CPUs and still keeping pace with development. Future generations might be revolutionary, but for now, can you get by with what you already have, or with a cheaper (maybe used) version? I know for myself, after having done some very rough and inaccurate tests, that I use on average 5-10% of my CPU, and even less of my GPU, from day to day.
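If you want to run a similarly rough test yourself, the little Python sketch below is one way to get a ballpark number. It's a minimal sketch, not necessarily how I measured mine, and it assumes the third-party psutil library is installed; any system monitor will tell you much the same thing.

    # Rough CPU utilization sampler: a minimal sketch, assuming the
    # third-party psutil library is installed (pip install psutil).
    import psutil

    SAMPLES = 60          # number of samples to take
    INTERVAL_SECONDS = 1  # seconds between samples

    readings = []
    for _ in range(SAMPLES):
        # cpu_percent() blocks for the interval, then returns overall CPU usage in percent
        readings.append(psutil.cpu_percent(interval=INTERVAL_SECONDS))

    average = sum(readings) / len(readings)
    print(f"Average CPU utilization over {SAMPLES * INTERVAL_SECONDS}s: {average:.1f}%")

Left running for a minute or two during normal use, it gives a crude but honest picture of how much of the machine actually gets exercised.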

If you managed to find some meaning in my ramblings here, let me know. I don't have any usage stats or any form of tracking on this website, mostly out of trying to be a courteous internet surfer. Take care!