Predicting the future of any technology domain has always been difficult. It's not that we're held back by limited imagination; the pace of innovation is so rapid that any reasonable prediction ends up being far too conservative. Now that Artificial Intelligence has become widely accessible, almost every company with a decent R&D division is using it to build reliable data models and accelerate research. So take these predictions with a pinch of salt. If you're reading this 20 years after it was written and they all seem way too conservative, you know why.
Is the PC really dying?
No, the personal computer is not going to die anytime soon, in any form factor. From the traditional desktop to the new slim ultraportables, the value a personal computer offers should keep it relevant for the next 20 years. There has, however, been a strong shift towards smaller and sleeker form factors, and manufacturers are happy to oblige since these allow them to charge a premium. Older, bulkier systems no longer give manufacturers the margins they once did, but since they still form the bulk of the PC business, they'll be around until shrinking market share makes them unviable. Even slim computers aren't going to outlast the traditional desktop by a huge margin; consider them a stage in the PC's evolution towards becoming commoditised. It's this path of evolution that we're curious about, and here's what we think is most likely to happen with PCs.
Let's start with the most visible components of the PC. There's no questioning that the display and keyboard are what keep the PC from shrinking further. Going below the 13-inch form factor is asking to be ignored by customers, as we can surmise from the shrinking tablet market. Tablets aren't popular, and manufacturers aren't going to place bets on a dying market, so the 13/15/17-inch screens will live on. The panel technology itself, however, should see interesting developments.
Foldable screens should become more popular, since they allow computers to be squeezed into even smaller dimensions. Imagine a notebook that fits in your pocket and unfolds twice before you operate it, provided the elastomers that allow the "bending" can survive repeated flexing. The other interesting development we expect is haptic displays going mainstream, though they will remain a novelty until haptic feedback can deliver a practical, long-lasting feature. Another possibility is using projection technology to provide both keyboard and display, allowing the form factor to shrink even further. However, that becomes impractical for folks on the move.
This one is a no-brainer. The traditional hard drive will continue to survive, with even greater densities becoming possible. Only recently, Western Digital unveiled its Microwave-Assisted Magnetic Recording (MAMR) technology, which should push areal density to 4 Tb/sq. inch, making 40 TB hard drives a possibility by 2025. Hard drive technology tends to evolve at a slower pace, and manufacturers have resorted to scenario-specific SKUs, so it's unlikely we'll see anything remarkable beyond that.
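As a sanity check on those numbers, drive capacity is essentially areal density multiplied by recordable platter area. Here's a minimal back-of-the-envelope sketch; the platter dimensions, platter count and overhead factor are illustrative assumptions on our part, not Western Digital's specifications:

```python
# Rough capacity estimate: areal density x usable platter area x surfaces.
# Every physical parameter below is an illustrative assumption, not WD's spec.
import math

AREAL_DENSITY_TBPSI = 4.0    # Tb per sq. inch, WD's stated MAMR target
OUTER_RADIUS_IN = 1.75       # 3.5" platter outer radius (assumed)
INNER_RADIUS_IN = 0.75       # inner radius lost to the spindle (assumed)
PLATTERS = 9                 # helium-filled drives already stack this many (assumed)
SURFACES = 2 * PLATTERS      # both sides of each platter are recorded
EFFICIENCY = 0.6             # servo/ECC/zoning overhead, a rough assumption

area_sq_in = math.pi * (OUTER_RADIUS_IN**2 - INNER_RADIUS_IN**2)
capacity_tb = AREAL_DENSITY_TBPSI * area_sq_in * SURFACES * EFFICIENCY / 8

print(f"Usable area per surface: {area_sq_in:.1f} sq. in")
print(f"Estimated drive capacity: {capacity_tb:.0f} TB")
```

With these toy numbers the estimate lands around 42 TB, the same ballpark as the 40 TB projection, which is all a sketch like this can show.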
On the other hand, we have SSDs to look forward to. NVMe was a significant leap over existing technologies, and we should see similar leaps whenever semiconductor process nodes evolve. Moore's Law is on the verge of ending as we approach the fundamental limits of transistor technology, but there's still significant scope for development; increasing density will simply take more time. This means SSDs will mature and gain reliability akin to that of hard drives. Already, technologies like Optane have provided an increment in memory endurance. As these technologies mature, hard drives will be relegated to enterprise use, the only caveat being cost parity. For all we know, if SSDs ever equal the performance and endurance of RAM, they just might replace RAM altogether.
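That cost-parity caveat can be made concrete with a toy crossover model: if SSD prices per gigabyte fall faster than HDD prices, parity arrives after some number of years. The starting prices and decline rates below are assumptions for illustration, not market data:

```python
# Toy cost-per-GB crossover between HDDs and SSDs.
# Starting prices and annual decline rates are assumed, not sourced.
hdd_usd_per_gb = 0.03   # assumed HDD price today
ssd_usd_per_gb = 0.25   # assumed SSD price today
hdd_decline = 0.10      # assumed annual HDD price decline
ssd_decline = 0.30      # assumed annual SSD price decline (faster)

year = 0
while ssd_usd_per_gb > hdd_usd_per_gb:
    hdd_usd_per_gb *= 1 - hdd_decline
    ssd_usd_per_gb *= 1 - ssd_decline
    year += 1

print(f"Under these assumptions, SSDs reach HDD cost parity in ~{year} years")
```

With these made-up rates parity takes roughly nine years; change the assumptions and the date moves, but the shape of the argument stays the same.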
The easy thing to say would be that CPUs are going to get faster, smarter and more power efficient. However, NVIDIA CEO Jen-Hsun Huang's recent statement at the GPU Technology Conference should have Intel and AMD worried. He stated that GPUs will replace CPUs and that the days of Moore's Law are numbered. Think about it: GPUs have, for some time, been taking over functions traditionally performed by CPUs. CPUs have seen only incremental performance gains with each generation, while GPUs have been scaling by leaps and bounds. So if you were to offload CPU tasks onto the GPU, then even with the computational overhead considered, the GPU should be able to outperform the CPU. Adding credibility to Huang's statement, an NVIDIA whitepaper that was accidentally made public outlined a Multi-Chip-Module (MCM) GPU. Since Moore's Law also affects GPUs, NVIDIA is exploring package-level integration of multiple GPUs to build a bigger GPU module.
Since CPUs are general purpose computational devices, having different modules within the MCM GPU focused on different computational aspects essentially makes it a CPU. It seems fairly plausible on a 20-year roadmap to see the CPU make way for the GPU.
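The offload argument above boils down to a simple inequality: the GPU wins whenever its transfer overhead plus accelerated compute time comes in under the plain CPU time. A minimal sketch with made-up timings (all the numbers here are illustrative assumptions):

```python
# Toy model of CPU-to-GPU offload: the GPU path pays a fixed transfer
# overhead, then runs the work `gpu_speedup` times faster than the CPU.
# All timings below are illustrative assumptions.

def offload_wins(cpu_time_ms: float, gpu_speedup: float, overhead_ms: float) -> bool:
    """True if offloading (overhead + accelerated compute) beats the CPU."""
    gpu_time_ms = overhead_ms + cpu_time_ms / gpu_speedup
    return gpu_time_ms < cpu_time_ms

# A long-running task absorbs the transfer overhead easily...
print(offload_wins(cpu_time_ms=100.0, gpu_speedup=20.0, overhead_ms=5.0))  # True
# ...while a short task is dominated by it.
print(offload_wins(cpu_time_ms=2.0, gpu_speedup=20.0, overhead_ms=5.0))   # False
```

This is why the GPU-replaces-CPU claim is plausible for heavy workloads but not automatic: for small, latency-sensitive tasks the fixed overhead still favours the CPU.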
Computers might just get a little too personal. By personal, we mean both near-permanent fixtures embedded into your person and swappable gadgets placed on it. Cybernetic implants aren't unheard of, but the current generation is hardly made up of full-fledged computers. Twenty years from now, we should see human beings using multiple cybernetic implants to compensate for human inadequacies, with a central node to control them all.
Most of these implants will start off as prosthetics, and it will be some time before cosmetic applications take off. So we aren't going to be transformed into the Borg collective 20 years from now, but having a cybernetic implant inside you will be really common. Some academics predict that we might have the electronic equivalent of stem cells in the future, which could be injected into your body to help repair impaired motor functions or simply enhance your brain. Honestly, we'd argue that 20 years feels a bit too short for such a tremendous leap in technology.
Human Interface Devices (HIDs) are probably the devices most of us are looking forward to. The massive hype around Pranav Mistry's SixthSense when it was first unveiled, and the current VR boom, are a testament to how interested people are in HIDs. Current-generation HIDs include interactive projections and controller-free tracking, such as the Microsoft Kinect. However, these technologies aren't anywhere close to the accuracy of the traditional mouse. This is where accurate eye-tracking comes into the picture. Eye-tracking already exists, though, and if you've ever tried it, you'll realise how bland the experience can be: you simply stare at a screen and wait for visual feedback to tell you that your input has been received, i.e. it's a unidirectional system.
Bidirectional HIDs are the future. Imagine a system that can provide haptic feedback for non-contact input. In a similar vein, think of emotion tracking, where human emotions are considered before input is received; it's almost as if computers are gaining good bedside manners. Throw a bit of VR/AR/MR into the mix and systems like those seen in Minority Report become a reality.
PC as a Service (PCaaS)
It goes without saying that the PC will get commoditised over the next 20 years, leading to the PC being offered as a service. It's quite plausible that average-performing PCs, the size of miniature credit cards, will be sold at kiosks for next to nothing. And with cloud storage becoming cheaper by the day, you could do away with the PC altogether and simply carry an ID that brings all your data to your fingertips no matter where you are. If you need extra compute power, you plug in another compute module and you're good to go, or you move to a higher payment plan and get the extra processing power in a matter of seconds.
Despite all these conveniences, one thing won't die off so easily in the next 20 years: the enthusiast machine. Heavy compute tasks like video editing, AI and ML will still require a large-format machine. Long live the #PCMasterRace.