The Future of PCs and Home Computing

Back in 2005, noted British Telecommunications "futurologist" Ian Pearson hypothesized that computing power would be so great, and our ability to tap into it so advanced, that by 2050 we could effectively use the technology to store and access human consciousness. "So when you die," Pearson so eloquently understated, "it’s not a major career problem." Supporting his now-celebrated speculation on future pseudo-immortality, Pearson also concluded that a "conscious computer with superhuman levels of intelligence" could be readied as early as 2020.

How close will Pearson come to the truth? What will computers look like in 2050? What will they be capable of? Well, let’s just say that this is the same guy who envisaged in 1999 that our pets would be robotic and our contact lenses would project HUD-like displays onto the retina – à la The Terminator – by 2010. To be fair, the latter concept may not be that far off, and certainly Pearson’s a bright fellow who’s been proven correct enough times that his forecasts can’t be dismissed as mere pap.

We do know this much: In 1965, Intel co-founder Gordon Moore predicted that the number of transistors on an integrated circuit – and, by extension, its processing power – would double every 24 months. His prophecy has become known as Moore’s Law, and it hasn’t been wrong yet. In fact, it looks like it won’t be wrong for quite a few more years.


Chip Speed and Processing Power

So what’s the big deal about Moore’s Law? It’s simple – computing speed, power, and miniaturization are the secret behind virtually all the major technological advancements we’ve seen so far, and will see in the near future. Just look how far we’ve come in the last few decades. Twenty years ago, the finest desktop computer CPUs featured perhaps 100,000 transistors and chugged along at 33MHz. Today, high-end quad-core CPUs scream along at 3GHz and brandish in excess of 800 million transistors. Indeed, some of today’s transistors are so small that millions could fit on the head of a pin.
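
In fact, those numbers have outpaced even Moore’s cadence. Here’s a quick back-of-envelope calculation in Python – the transistor counts are the rough figures quoted above; the variable names and arithmetic are our own:

```python
import math

# Back-of-envelope only, using the rough figures quoted above.
transistors_then = 100_000       # high-end desktop CPU of twenty years ago
transistors_now = 800_000_000    # today's high-end quad-core CPU
months_elapsed = 20 * 12

doublings = math.log2(transistors_now / transistors_then)             # ~13
print(f"one doubling every {months_elapsed / doublings:.0f} months")  # -> ~19
```

Roughly thirteen doublings in twenty years works out to one every 18 or 19 months – comfortably ahead of Moore’s 24-month schedule.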

In 2050, however, we will have long exhausted current design and manufacturing techniques and concepts, which up until now have involved "etching" multi-layered silicon wafers with ultraviolet light in a process called photolithography. There are several highly sophisticated reasons behind this, but suffice it to say that leading chipmakers such as Intel, already working in ridiculously sub-microscopic environments, will within the next two decades come up against a number of undeniable limitations. The current process and the current materials used in that process – and the accepted laws of physics – won’t support continued miniaturization and energy efficiency as we reach molecular levels, where electrons begin to leak, or "tunnel," straight through ever-thinner insulating layers.


New PC and Computing Technologies

That’s forcing scientists to look at new technologies. The bad news is that we’re not really sure right now which technology will win out. In the foreseeable future, recently discovered materials such as graphene may be used instead of silicon to form chip wafers. Graphene, essentially a single atom-thick layer of the very same graphite used in pencils, conducts electrons much faster than silicon. In the more distant future, radical ideas such as optical computing, which uses photons of light in lieu of electrons and transistors, might be the ticket.

But by 2050, we may well be in the realm of quantum computing. This is a world best understood by the proverbial rocket scientist, though the general theory involves harnessing quantum mechanical phenomena (the same stuff that prevents us from continuing to miniaturize today’s silicon-based transistors) to do good rather than evil. Instead of utilizing "bits," which can be either on or off, like a light switch, quantum computing utilizes qubits (quantum bits), which can be on, off, or both at once – a state known as superposition.
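
To make that "on, off, or both" idea a little more concrete, here’s a toy classical simulation in Python – emphatically not real quantum hardware, and the measure function is our own illustration – of how a superposed qubit collapses to 0 or 1 when read:

```python
import random
from math import sqrt

# A qubit's state is a pair of amplitudes (alpha, beta); measurement
# collapses it to 0 or 1 with probabilities |alpha|^2 and |beta|^2.
alpha, beta = 1 / sqrt(2), 1 / sqrt(2)   # equal superposition: "both at once"

def measure(a, b):
    return 0 if random.random() < abs(a) ** 2 else 1

trials = [measure(alpha, beta) for _ in range(10_000)]
print(f"read '1' in {sum(trials) / len(trials):.0%} of measurements")  # ~50%
```

The "both" state only reveals itself statistically: any single measurement still returns a plain 0 or 1.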

Because quantum computing takes place at the atomic level, and because each qubit is capable of handling multiple computations simultaneously, a quantum-based computer of the future could very well make today’s desktop look like an abacus. The only major holdup – and it’s a gargantuan one – is in the development of a means of controlling and stabilizing all those qubits. If we can manage to do that, and we likely will before 2050 rolls around, the possibilities and the potential are staggering.

D-Wave Quantum Computing Processor


SSD, Flash and High-Tech Storage Devices

And what of storage devices? We’ve seen just recently that traditional hard drives with spinning platters are not the ideal we once thought. They’re simply too fragile, too noisy, too slow, and too big to be a reliable part of our increasingly demanding and often mobile lifestyles. Instead, the next few years look to be the territory of flash memory and SSDs (solid-state drives). Indeed, 1TB (1,000 gigabyte) SSDs are already available, and 2TB drives are just around the corner.

Meanwhile, the future of large-scale storage may well lie in something called quantum holography. By definition, holography is a method of recreating a three-dimensional image of an object via patterns of light produced by a split laser beam. In "holographic storage," data is imprinted onto an input device called a spatial light modulator, and two laser beams – an object beam and a reference beam – intersect at a predetermined location to write the data, with the reference beam alone reading it back. By changing either the angle between the object and reference beams or the laser wavelength, multiple sets of data can be stored at the exact same coordinates.
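
That multiplexing trick is where the capacity comes from. The numbers in the sketch below are purely hypothetical – invented for illustration, not drawn from any shipping product – but the arithmetic shows how quickly stacked "pages" add up:

```python
# Hypothetical numbers, for illustration only: holographic multiplexing
# stores many full "pages" of data at the same physical spot by varying
# the reference-beam angle or wavelength.
bits_per_page = 1_000_000   # one spatial-light-modulator page (assumed)
pages_per_spot = 1_000      # distinct angles/wavelengths per spot (assumed)
spots = 10_000              # addressable locations in the medium (assumed)

total_bits = bits_per_page * pages_per_spot * spots
print(f"{total_bits / 8 / 1_000_000_000_000:.2f} TB")  # -> 1.25 TB in this toy
```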
 
Add "quantum" to the equation and you’re getting really, really small. And indeed, just this January, a team of Stanford physicists were able to permanently store 35 bits of information in the quantum space surrounding a single electron. This is just the beginning of an emerging technology that one day soon may be capable of storing "petabytes" (1,000,000 gigabytes) of data.

The real question may be whether we need or even want that much personal storage – or all that power we spoke of earlier – in the year 2050. Certainly, if we want to personally store a few thousand HD movies, we’ll need all the storage we can get our hands on. But why even bother if the Internet "cloud" and "cloud computing" exist in the form many futurists agree they will?


Online Networked, Mobile and Cloud Computing

As we become increasingly mobile, the advantages of – and, some say, the need for – full-blown access to digital information and entertainment, whether it’s our own or material in the public domain, grow just as quickly. Imagine a world where virtually everything – from cars, to airplanes, to "smart homes," to the public terminals we’ll likely see springing up all around us – is, essentially, networked together. Imagine a world without thumb drives and portable hard drives, where all you need to do is connect and do what you need to do. Need to create a document? Do it online. Access your latest vacation pics? Do it online. Play games, listen to music from your own personalized library, edit one of your videos, peruse personal health records, or turn on your porch lights while you’re away from home? Again, do it online, potentially through some form of subscription service, no matter where you are.

Sure, the viability of such a future depends very much on whether the infrastructure can handle it, but that’s why a mega-corporation like Microsoft, which for decades has made money from the fact that we’re all a bunch of partially connected yet generally individual pods, is now dropping billions on infrastructure upgrades and "data centers." And let’s not forget Windows "Azure," Microsoft’s new cloud-based application platform, designed to compete head-to-head with Google Apps.

Control freaks might not take to this whole cloud idea initially, but there are benefits apart from those discussed above. Cloud computing decreases the need for an ultra-powerful home-based system, and will likely reduce maintenance issues. And it should be less costly and less troublesome than purchasing all that add-on software and high-end, high-energy hardware.

Still, those of us who really use our PCs – gamers, animators, 3D graphic designers, and the like – won’t give up our hot rods for some time to come, even if, by 2050, that means risking incarceration at the hands of the Green Police.

What certainly will change is the way we interact with our computers. Granted, today’s wireless keyboards and cordless optical mice are extremely easy to use, even by futuristic standards – but the truth is that you could theoretically get a lot more done, and enjoy much more freedom, if you weren’t constrained to smacking keys and sliding a mouse back and forth across your desk.


Voice Recognition and Artificial Intelligence

Take voice recognition, for instance. The technology has been around for many years already, but it’s never been proficient enough to catch on in the mainstream. The problems are many. Proper punctuation is difficult to infer. Speech patterns and accents differ from one person to the next. Many English words have double meanings. The solution to current voice recognition hassles involves not only better future software, but also serious computing horsepower upon which to run it. All of that will be here long before 2050, and that’s one of the reasons so many big-time deals have been flying about lately between voice recognition researchers, developers, and industry giants such as Google.
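
That "double meanings" problem is easy to demonstrate. The toy Python sketch below – built around an invented two-entry bigram table, nothing like production recognition software – shows the basic idea of letting context pick between homophones:

```python
# A toy sketch: score candidate spellings by how plausible they are
# after the previous word. The scores here are invented for illustration.
BIGRAM_SCORE = {
    ("the", "right"): 0.9, ("the", "write"): 0.1,
    ("to", "write"): 0.8,  ("to", "right"): 0.2,
}

def pick_spelling(previous_word, homophones):
    # Choose whichever spelling is most plausible in context.
    return max(homophones, key=lambda w: BIGRAM_SCORE.get((previous_word, w), 0.0))

print(pick_spelling("the", ["right", "write"]))  # -> right
print(pick_spelling("to", ["right", "write"]))   # -> write
```

Real recognizers do essentially this at massive scale, scoring huge numbers of candidate word sequences against statistical models of the language – hence the appetite for computing horsepower.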

And you can expect your computer device to talk back, too – with reasoned, thoughtful statements, if Intel’s chief technology officer, Justin Rattner, can be trusted. Rattner prophesied at an August 2008 Intel forum that the lines between human and machine intelligence will begin to blur by the middle of the next decade, potentially reaching "singularity" (techno-babble for the era when artificial intelligence reaches the level of human intelligence) by 2050.

At the same forum, Rattner also discussed Intel’s recent research into artificially intelligent, grain-of-sand-sized robots dubbed "catoms." Though the idea might seem far-fetched right now (as did the concept of personal computers in 1960), millions of catoms could one day be manipulated by electromagnetic forces to clump together in virtually any way we see fit. The real kicker? Catoms are also shape-shifters. According to Intel, a cell phone made of catoms may, by 2050, be able to morph into a keyboard-shaped device for text messaging.


Touchscreens and Motion-Sensitive Devices

And if catoms don’t reach their potential, perhaps we can look toward Nintendo’s Wii gaming system, or recent smartphones such as the iPhone, to get an inkling of what may be in store for future computer interaction. If a cell phone can offer multi-touch operations (wherein applications are controlled with several fingers) and functions based on how you tilt or move the device in space, a similarly capable computer interface can’t be far away. Microsoft clearly believes in multi-touch technology: it has built support for it into its next operating system, Windows 7.

Apple iPhone
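
For a taste of the math underneath those gestures, here’s a minimal pinch-to-zoom sketch in Python. The coordinates are hypothetical screen positions rather than any real touch API, but the distance-ratio idea is the genuine core of the gesture:

```python
import math

# Pinch-to-zoom: the scale factor is the ratio of the current distance
# between two fingertips to their starting distance.
def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pinch_scale(start_a, start_b, now_a, now_b):
    # >1.0 means the fingers spread apart (zoom in); <1.0 means pinch out.
    return distance(now_a, now_b) / distance(start_a, start_b)

# Fingers start ~141 px apart and spread to ~226 px: roughly a 1.6x zoom.
print(f"{pinch_scale((100, 100), (200, 200), (80, 80), (240, 240)):.2f}")
```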

But perhaps the ultimate solution to computer device interfacing doesn’t involve hands or the voice at all. Maybe it simply involves your brain.

The idea is nothing new. Researchers have, for many years, experimented with monkeys, implanting electrodes into their brains and watching as the primates perform simple tasks without physical input. But while that may all be well and good for those of us who’ll consent to such an incredibly invasive procedure, the real magic will happen when we’re able to monitor brain functions without going inside the skull.

A team at Microsoft Research – with input from several universities – has been dealing with these very issues for some time now, reporting highly accurate results from subjects wearing non-invasive electroencephalography (EEG) caps and sensor-packed armbands that measure muscle activity. To which we say: Thank you, monkeys, for handling the really icky part.

Monkey Controlling Bionic Arm


LCD, OLED and 3D Displays

As for displays, the distant future isn’t quite so clear. Certainly, we know that LCD technology has done wonders for our eyes, our desktop space, and our energy consumption when compared to old-school CRTs. Yet even now, LCD appears ultimately doomed, destined to be ousted by something called OLED (Organic Light Emitting Diode).

In the works for many years, but only now beginning to appear on the market, OLED has several advantages over LCD, including the capacity for much thinner screens (so thin, in fact, that some can be rolled up and taken with you), far greater energy efficiency (OLED panels require no backlighting), and brighter, more vivid images. Expect a full roster of OLED computer displays in the coming years.

But by 2050, we will have moved past OLED and potentially into the world of interactive 3D displays. 3D displays are intriguing because they project images into open space, much like Leia’s holographic message in the original Star Wars movie. Moreover, some forms of the technology will likely support interactivity. The technology is sketchy at present, and there is some question of it being a quirky sidestep, but it certainly has enough proponents and developers. Some approaches involve 3D glasses, and therefore should be written off immediately, and most sport such curious names as volumetric, stereogram, and parallax.


Is the Desktop PC Dead?

All this speculation, and we still haven’t touched upon what is arguably the most important question of all: Will the venerable, traditional PC be dead and dormant by the year 2050?

Given all that we’ve looked at above, chances are it will. Indeed, chances are that people in 2050 will look back at the big, clunky cases and spider’s web of cables of today’s towers and mini-towers and chuckle in much the same way we now look back in wonderment at those monstrous AM floor radios of the mid-20th century.   

But what will replace it? Will we, as some futurists predict, become a nation of handheld computer users? Probably, though some say even today’s handhelds will look archaic next to the super-thin wearable computers we may have at our disposal by 2050. Yet there will always be a need for something slightly more substantial – if not for keyboards, which may be quite dispensable long before that, then for the big, beautiful (but energy-efficient) displays we’ll always crave.


Tomorrow’s Connected Home

Imagine this scenario. You arrive home with your always-connected, quantum-powered portable computing device attached to your body or clothing. Your ultra-thin 40-inch display – or perhaps your fully realized 3D display – recognizes your approach, automatically switches on, says hello, and listens for your verbal instructions. Alternately, it may wait for you to slip on some form of brainwave-measuring headset so you need merely think your instructions.

In either case, you advise your PCD that you want to continue working on the presentation you began earlier in the day. Hundreds of times more powerful than today’s fastest desktop, it obliges by retrieving the presentation from a remote, online storage center and sending it, wirelessly of course, to your display. The entire operation takes a second or two, and you’re soon yapping away comfortably, completing your presentation merely by speaking in plain English. If your display is of the 3D variety, you may reach out occasionally to manually manipulate portions of it.

When you’re finished, your PCD recognizes you’ve made a critical mistake along the way. It is, after all, capable of independent "thought" and has been trained to understand the way you work. It alerts you, helps you rethink your error, and congratulates you when you’ve perfected your presentation.

Feeling celebratory, you grab a cold one from the refrigerator and prepare for a romantic evening with your significant other – a robotic consort. Yep, we’ll likely have those too when 2050 rolls around, though that’s an entirely different subject we’ll save for another tale…

Christopher Nickson
Former Digital Trends Contributor
Chris Nickson is a journalist who's written extensively about music and related fields. He's the author of more than 30…