Hardware Longevity: The 5+ Year Old Computer

posted by on 26th August 2015, at 3:57pm

Computer technology has reached a point where most of our interactions simply work without much fuss. In the 1990s and earlier, it was not uncommon for a computer of the era to struggle with day-to-day tasks. These problems arose for two reasons: first, new applications might not meet the system requirements, and second, hardware in the early days of computing wasn’t all that powerful. Today, most applications are efficient enough and we have an abundance of powerful hardware. In the last 5–8 years this has led to a drastic increase in hardware longevity.

My primary gaming PC was built in 2012. I upgraded the GPU (to an NVIDIA GTX 670) in 2013, since the build originally inherited an aging but still adequate GPU. The machine features an Intel Core i5 3570K CPU with 8GB of memory (RAM). I have a solid state drive and a couple of spinning disks for volume storage. Aside from the degradation that happens over time in Windows (much less prevalent on 7 and 8 than on previous versions), this computer still runs all my games and performs day-to-day tasks adequately. Why? The vast majority of games out there are GPU dependent (shiny graphics), and an Intel Core i5 is great for anything aside from heavy media encoding. I do not see any reason for a hardware upgrade in the near future, short of a catastrophic failure.

Now you may be asking yourself, why does this happen? Why don’t we need an upgrade every 2 years? Microprocessor (CPU) design has reached a point where changes are incremental and slower. Intel is currently facing a challenge in shrinking its manufacturing process: it had been sitting at 22nm lithography since 2012, and only in 2015 did we see the move to 14nm. What this basically means is that each individual transistor on a chip is manufactured at that scale. As the scale gets smaller, we see either an increase in power efficiency or more transistors added, resulting in more compute power. The downside, however, is that once the process goes below roughly 11nm, weird things start happening (quantum effects such as electrons tunneling where they shouldn’t). I would love to go into depth explaining the weirdness and possible measures to correct it, but the takeaway point for today is that microprocessor evolution has slowed down. Each new iteration brings roughly a 10%–15% improvement in power efficiency/performance for a die shrink (22nm -> 14nm) and about a 20%–25% improvement when a new architecture is released.
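Incremental doesn’t mean stagnant, though: those per-generation gains still compound. A quick back-of-the-envelope calculation (using the midpoints of the ranges above; the exact percentages are illustrative, not benchmarks):

```python
# Rough illustration of how incremental CPU gains compound over generations.
# Midpoint figures from the text: ~12.5% for a die shrink, ~22.5% for a new
# architecture. These are illustrative numbers, not measured benchmarks.

def compound_gain(steps):
    """Multiply per-generation improvement factors together."""
    perf = 1.0
    for gain in steps:
        perf *= 1.0 + gain
    return perf

# Two tick-tock cycles: die shrink, new architecture, repeated.
generations = [0.125, 0.225, 0.125, 0.225]
total = compound_gain(generations)
print(f"Relative performance after 4 generations: {total:.2f}x")  # ~1.90x
```

So even at this slower pace, four generations still roughly doubles performance — it just takes years rather than months, which is exactly why a well-built machine stays relevant longer.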

Aside from demanding tasks such as gaming and encoding (models, media, etc.), our computers are fast enough. A whole renaissance happened between 2005 and 2010 in which web browsers became more efficient and responsive, letting us use the web with very few hiccups. This is largely due to an emphasis on efficiency for mobile users and the hardware advancements mentioned previously. Development kits on Windows and OS X give developers easy ways of writing efficient code that uses very few resources. And the resources that are used most heavily (memory and CPU cycles are the big ones) are plentiful. Developers have also started to embrace technologies that let them use all 4 cores inside that powerful CPU sitting on your desk, which once again has the side effect of reducing apparent wait time.
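The multi-core pattern is simpler than it sounds: instead of one loop churning through work, a batch of items is handed to a pool of workers. A minimal sketch in Python (an assumed example, not any particular framework; for CPU-bound Python work you would typically swap in `ProcessPoolExecutor`, since threads share one interpreter lock):

```python
# Minimal sketch of the "use all the cores" pattern: spread a batch of
# independent work items across a pool of workers instead of a single loop.
from concurrent.futures import ThreadPoolExecutor

def process(item):
    # Stand-in for real per-item work (resizing an image, parsing a file, ...)
    return item * item

items = list(range(8))

with ThreadPoolExecutor(max_workers=4) as pool:  # one worker per core
    results = list(pool.map(process, items))

print(results)  # squares of 0..7, computed by up to 4 workers concurrently
```

The user-visible effect is exactly what’s described above: the same amount of work finishes sooner, so the wait simply isn’t noticed.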

As you have probably gathered by now, there’s no reason a computer in 2015 should feel slow. If your machine is feeling sluggish, there are three main things you can do to speed it up:

1. Ensure you have enough memory (RAM).
Windows 7, 8, and 10 will happily run with 2GB of memory, but 4GB is what you should actually be aiming for. 8GB is even better if you consider yourself a power user or just want room to grow. RAM is one of the main factors controlling how fast your computer feels: if it is fully utilized, the system begins storing temporary data on the hard disk, which takes much longer to access than reading directly from RAM.
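The gap between the two is easy to feel but also easy to measure. This toy sketch (my own illustration, not a proper benchmark — the operating system’s caching will flatter the disk numbers) passes over the same 10MB of data once in memory and once via a file round-trip:

```python
# Toy demonstration of why running out of RAM hurts: touch the same data
# once held in memory and once via a file on disk. Absolute numbers vary
# wildly by machine and OS caching; the point is the relative gap, which is
# what paging to disk exposes you to.
import os
import tempfile
import time

data = os.urandom(10 * 1024 * 1024)  # 10 MB of bytes held in RAM

# Time a sampled pass over the in-memory data.
start = time.perf_counter()
checksum_ram = sum(data[::4096])
ram_time = time.perf_counter() - start

# Write the data out, then time reading it back from disk.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(data)
    path = f.name
start = time.perf_counter()
with open(path, "rb") as f:
    checksum_disk = sum(f.read()[::4096])
disk_time = time.perf_counter() - start
os.remove(path)

print(f"RAM pass: {ram_time:.4f}s, disk pass (incl. read): {disk_time:.4f}s")
```

When Windows starts paging, every one of those slow round-trips happens behind your back — which is why a RAM-starved machine feels like it’s wading through mud.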

2. Reduce startup bloat.
Startup, of applications or of the computer itself, is what ultimately has the largest impact on our perception of performance. By opening the task manager (right-click the taskbar) and going to the Startup tab, you can view what is actually starting with your computer. Every so often it’s a good idea to review this list and remove items that you no longer require. Warning: be careful, and don’t disable everything.
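If you’re curious where some of those entries live, one of the classic locations is the registry’s "Run" keys. A hedged sketch for the curious (standard library only; `winreg` exists solely on Windows, so elsewhere the script simply says so — and this only views entries, it doesn’t disable anything, which is still best done from the Task Manager as described above):

```python
# Sketch: list the classic "Run" registry startup entries on Windows, one of
# the sources the Task Manager's Startup tab draws from. View-only.
def list_run_entries():
    try:
        import winreg  # Windows-only standard library module
    except ImportError:
        return None  # not running on Windows
    entries = {}
    key_path = r"Software\Microsoft\Windows\CurrentVersion\Run"
    for root in (winreg.HKEY_CURRENT_USER, winreg.HKEY_LOCAL_MACHINE):
        try:
            with winreg.OpenKey(root, key_path) as key:
                # QueryInfoKey()[1] is the number of values under the key.
                for i in range(winreg.QueryInfoKey(key)[1]):
                    name, command, _type = winreg.EnumValue(key, i)
                    entries[name] = command
        except OSError:
            continue  # key absent or inaccessible
    return entries

entries = list_run_entries()
if entries is None:
    print("winreg unavailable (not Windows)")
else:
    for name, command in entries.items():
        print(f"{name}: {command}")
```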

3. Get a SSD (Solid State Drive).
Hard drives are great because they allow for terabytes of storage, but they’re slow even in the best-case scenario. Solid state drives are much faster at reading and writing data, which is why they’re the preferred option for housing your operating system and frequently used applications. A solid state drive has the benefit of making almost any application appear to open instantly. If you don’t have one, I would highly recommend grabbing one. It doesn’t have to be large, since it’s only meant to house the operating system and critical applications; anywhere from 120GB to 250GB works fine.

With these 3 solutions it is possible to have a computer that is a few years old running like it’s brand new. While there are still cases that require more CPU power (the odd game, or media encoding), in the modern world the CPU is often almost too powerful, so we look to other areas of the system for optimization. I truly believe we are entering a world where a 5+ year lifespan for a computer is not hard to imagine.

This article is filed under Tech. You can follow any responses to this entry through the RSS 2.0 feed. You can discuss this article on our forums.