
The Future of PCs and Home Computing

Back in 2005, noted British Telecom futurologist Ian Pearson hypothesized that computing power would be so great, and our ability to tap into it so advanced, that by 2050 we could effectively use the technology to store and access human consciousness. "So when you die," Pearson so eloquently understated, "it’s not a major career problem." Supporting his now-celebrated speculation on future pseudo-immortality, Pearson also concluded that a "conscious computer with superhuman levels of intelligence" could be readied as early as 2020.

How close will Pearson come to the truth? What will computers look like in 2050? What will they be capable of? Well, let’s just say that this is the same guy who envisaged in 1999 that our pets would be robotic and our contact lenses would project HUD-like displays onto the retina – a la The Terminator – by 2010. To be fair, the latter concept may not be that far off, and certainly Pearson’s a bright fellow who’s been proven correct often enough that his forecasts can’t be dismissed as mere pap.


We do know this much: In 1965, Intel co-founder Gordon Moore predicted that the number of transistors on an integrated circuit – and, by extension, its processing power – would double roughly every 24 months. His prediction has become known as Moore’s Law, and it hasn’t been wrong yet. In fact, it looks like it won’t be wrong for quite a few more years.


Chip Speed and Processing Power

So what’s the big deal about Moore’s Law? It’s simple – computing speed, power, and miniaturization are the secret behind virtually all the major technological advancements we’ve seen so far, and will see in the near future. Just look how far we’ve come in the last few decades. Twenty years ago, the finest desktop computer CPUs featured perhaps 100,000 transistors and chugged along at 33MHz. Today, high-end quad-core CPUs scream along at 3GHz and brandish in excess of 800 million transistors. Indeed, some of today’s transistors are so small that millions could fit on the head of a pin.
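
To put that compounding in perspective, here’s a quick back-of-the-envelope calculation in Python – a sketch only, using the rough figures quoted above rather than precise historical counts – that works out how often transistor counts would need to double to go from about 100,000 to more than 800 million in twenty years:

```python
import math

# Rough figures from the text: ~100,000 transistors in a high-end CPU
# twenty years ago versus more than 800 million today.
transistors_then = 100_000
transistors_now = 800_000_000
years_elapsed = 20

doublings = math.log2(transistors_now / transistors_then)  # ~13 doublings
implied_doubling_time = years_elapsed / doublings          # ~1.5 years

print(f"Doublings over {years_elapsed} years: {doublings:.1f}")
print(f"Implied doubling time: {implied_doubling_time * 12:.1f} months")
```

By those numbers, transistor counts have doubled roughly every 18 to 19 months – a little faster, even, than the 24-month pace Moore described.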

In 2050, however, we will have long exhausted current design and manufacturing techniques and concepts, which up until now have involved "etching" multi-layered silicon wafers with ultraviolet light in a process called photolithography. The reasons are highly technical, but suffice it to say that leading chipmakers such as Intel, already working at ridiculously sub-microscopic scales, will within the next two decades come up against a number of undeniable limitations. The current process and the current materials used in that process – and the accepted laws of physics – won’t support continued miniaturization and energy efficiency as features approach the molecular scale.


New PC and Computing Technologies

That’s forcing scientists to look at new technologies. The bad news is that we’re not really sure right now which technology will win out. In the foreseeable future, recently discovered materials such as graphene may be used instead of silicon to form the wafers from which chips are made. Graphene, essentially a single layer of the very same graphite used in pencils, conducts electricity much faster than silicon. In the more distant future, radical ideas such as optical computing, which uses light – photons – in lieu of electrons and transistors, might be the ticket.

But by 2050, we may well be in the realm of quantum computing. This is a world best understood by the proverbial rocket scientist, though the general theory involves the harnessing of quantum mechanical phenomena (the same stuff that prevents us from continuing to miniaturize today’s silicon-based transistors) to do good rather than evil. Instead of utilizing "bits," which can either be on or off, like a light switch, quantum computing utilizes qubits (quantum bits), which can be on, off, or both.
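
For readers who like to see the idea in concrete terms, here is a minimal sketch – using Python and NumPy purely as a toy illustration of the standard textbook picture, not any real quantum hardware – of how a qubit differs from a bit: its state is a pair of amplitudes, and "both on and off" simply means neither amplitude is zero.

```python
import numpy as np

# A classical bit is either 0 or 1. A qubit is described by two complex
# amplitudes (a, b) with |a|^2 + |b|^2 = 1; |a|^2 and |b|^2 are the
# probabilities of reading 0 or 1 when the qubit is measured.
zero = np.array([1, 0], dtype=complex)      # definitely "off"
one = np.array([0, 1], dtype=complex)       # definitely "on"
superposition = (zero + one) / np.sqrt(2)   # equal parts "off" and "on"

probabilities = np.abs(superposition) ** 2
print(probabilities)   # [0.5 0.5] -- a 50/50 chance of either outcome
```

Measure that superposed qubit and you get a plain 0 or 1 with equal probability – the "both" state collapses to a single answer the moment you look.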

Because quantum computing takes place at the atomic level, and because each qubit is capable of handling multiple computations simultaneously, a quantum-based computer of the future could very well make today’s desktop look like an abacus. The only major holdup – and it’s a gargantuan one – is in the development of a means of controlling and stabilizing all those qubits. If we can manage to do that, and we likely will before 2050 rolls around, the possibilities and the potential are staggering.

D-Wave Quantum Computing Processor


SSD, Flash and High-Tech Storage Devices

And what of storage devices? We’ve seen just recently that traditional hard drives with spinning platters are not the ideal we once thought. They’re simply too fragile, too noisy, too slow, and too big to be a reliable part of our increasingly demanding and often mobile lifestyles. Instead, the next few years look to be the territory of flash memory-based and SSD (solid-state drive) devices. Indeed, 1TB (1,000-gigabyte) SSDs are already available, and 2TB drives are just around the corner.

Meanwhile, the future of large-scale storage may well lie in something called quantum holography. By definition, holography is a method of recreating a three-dimensional image of an object via patterns of light produced by a split laser beam. In "holographic storage," data is imprinted onto an input device called a spatial light modulator, with two laser beams intersecting at a predetermined location to read the data. By changing either the angle between the object and reference beams or the laser wavelength, multiple sets of data can be stored at the exact same coordinates.
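
As a conceptual toy only – real holographic drives are not programmed this way – that multiplexing trick can be pictured as an addressing scheme in which the reference-beam angle and laser wavelength together select which page of data is read from a single physical spot:

```python
# Toy model of angle/wavelength multiplexing: several data pages share the
# same physical location, and the (angle, wavelength) pair picks the page.
class HolographicSpot:
    def __init__(self):
        self.pages = {}

    def write(self, angle_deg, wavelength_nm, data):
        # Each (angle, wavelength) combination addresses a distinct page.
        self.pages[(angle_deg, wavelength_nm)] = data

    def read(self, angle_deg, wavelength_nm):
        return self.pages.get((angle_deg, wavelength_nm))

spot = HolographicSpot()
spot.write(30.0, 532, b"vacation photos")
spot.write(32.5, 532, b"tax records")   # same spot, different beam angle
print(spot.read(32.5, 532))             # b'tax records'
```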
 
Add "quantum" to the equation and you’re getting really, really small. And indeed, just this January, a team of Stanford physicists were able to permanently store 35 bits of information in the quantum space surrounding a single electron. This is just the beginning of an emerging technology that one day soon may be capable of storing "petabytes" (1,000,000 gigabytes) of data.

The real question may be whether we need or even want that much personal storage – or all that power we spoke of earlier – in the year 2050. Certainly, if we want to personally store a few thousand HD movies, we’ll need all the storage we can get our hands on. But why even bother if the Internet "cloud" and "cloud computing" exist in the form many futurists agree they will?


Online Networked, Mobile and Cloud Computing

As we become increasingly mobile, the advantages of – and, some say, the need for – full-blown access to digital information and entertainment, whether it’s our own or material in the public domain, become just as vital. Imagine a world where virtually everything – from cars, to airplanes, to "smart homes," to the public terminals we’ll likely see springing up all around us – is, essentially, networked together. Imagine a world without thumb drives and portable hard drives, where all you need to do is connect and do what you need to do. Need to create a document? Do it online. Access your latest vacation pics? Do it online. Play games, listen to music from your own personalized library, edit one of your videos, peruse personal health records, or turn on your porch lights while you’re away from home? Again, do it online, potentially through some form of subscription service, no matter where you are.

Sure, the viability of such a future depends very much on whether the infrastructure can handle it, but that’s why mega-corporations like Microsoft – which for decades has made money from the fact that we’re all a bunch of partially connected yet generally individual pods – are now dropping billions on infrastructure upgrades and "data centers." And let’s not forget Windows "Azure," Microsoft’s new cloud-based application platform, designed to compete head-to-head with Google’s Google Apps.

Control freaks might not take to this whole cloud idea initially, but there are benefits apart from those discussed above. Cloud computing decreases the need for an ultra-powerful home-based system, and will likely reduce maintenance issues. And it should be less costly and less troubling than purchasing all that add-on software and high-end, high-energy hardware.

Still, those of us who really use our PCs – gamers, animators, 3D graphic designers, and the like – won’t give up our hot rods for some time to come, even if it means risking incarceration at the hands of the Green Police, which it may by 2050.

What certainly will change is the way we interact with our computers. Granted, today’s wireless keyboards and cordless optical mice are extremely easy to use even by futuristic standards – but the truth is that you could theoretically get a lot more done, and enjoy much more freedom, if you weren’t confined to smacking keys and sliding mice back and forth across your desktop.


Voice Recognition and Artificial Intelligence

Take voice recognition, for instance. The technology has been around for many years already, but it’s never been proficient enough to catch hold in the mainstream. The problems are many. Proper punctuation is difficult to decipher. Speech patterns and accents differ from one person to the next. Many English words have double meanings. The solution to current voice recognition hassles involves not only better software, but also serious computing horsepower upon which to run it. All of that will be here long before 2050, and that’s one of the reasons so many big-time deals have been flying about lately between voice recognition researchers, developers, and industry giants such as Google.
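
As a toy illustration of the software side of that problem – the word pairs and counts below are invented for the example, and no real recognizer is anywhere near this simple – one common approach is to score competing transcriptions of the same sounds against how plausible their word sequences are, so that "two" wins out over "to" and "too" in a time-of-day context:

```python
# Toy example: resolving the homophones "to"/"two"/"too" by scoring each
# candidate transcript against how common its adjacent word pairs are.
bigram_counts = {
    ("meet", "at"): 90, ("at", "two"): 60, ("at", "to"): 2, ("at", "too"): 1,
    ("two", "o'clock"): 70, ("to", "o'clock"): 1, ("too", "o'clock"): 1,
}

def score(words):
    # Sum the counts for each adjacent pair; a higher total means the
    # sequence looks more like ordinary usage.
    return sum(bigram_counts.get(pair, 0) for pair in zip(words, words[1:]))

candidates = [
    "meet at two o'clock".split(),
    "meet at to o'clock".split(),
    "meet at too o'clock".split(),
]
print(" ".join(max(candidates, key=score)))  # meet at two o'clock
```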

And you can expect your computing device to talk back, too – with reasoned, thoughtful statements, if Intel’s chief technology officer, Justin Rattner, can be trusted. Rattner prophesied at an August 2008 Intel forum that the lines between human and machine intelligence will begin to blur by the middle of the next decade, and potentially reach "singularity" (techno-babble for the era when artificial intelligence reaches the level of human intelligence) by 2050.

At the same forum, Rattner also discussed Intel’s recent research into artificially intelligent, grain-of-sand-sized robots dubbed "catoms." Though the idea might seem far-fetched right now (as did the concept of personal computers in 1960), millions of catoms could one day be manipulated by electromagnetic forces to clump together in virtually any way we see fit. The real kicker? Catoms are also shape-shifters. According to Intel, a cell phone made up of catoms may, by 2050, be able to morph into a keyboard-shaped device for text messaging.


Touchscreens and Motion-Sensitive Devices

And if catoms don’t reach their potential, perhaps we can look toward Nintendo’s Wii gaming system, or recent smart phones such as the iPhone, to get an inkling of what may be in store for future computer interaction. If a cell phone can offer multi-touch operation (wherein applications are controlled with several fingers) and functions based on how you tilt or move the device in space, a similarly capable computer interface can’t be too far away. Microsoft obviously believes in multi-touch strongly enough to have built support for it into its next operating system, Windows 7.

Apple iPhone
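
As a rough sketch of the arithmetic behind one such gesture – illustrative only, and not Apple’s or Microsoft’s actual code – a pinch-to-zoom handler boils down to comparing the distance between two fingertips before and after they move:

```python
import math

# Minimal sketch of a two-finger "pinch" gesture: track the distance between
# two touch points and treat the ratio of new distance to old distance as a
# zoom factor.
def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def pinch_zoom(prev_touches, curr_touches):
    before = distance(*prev_touches)
    after = distance(*curr_touches)
    return after / before   # >1 means spread (zoom in), <1 means pinch (zoom out)

print(pinch_zoom([(100, 100), (200, 200)], [(80, 80), (220, 220)]))  # ~1.4
```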

But perhaps the ultimate solution to computer device interfacing doesn’t involve hands or the voice at all. Maybe it simply involves your brain.

The idea is nothing new. Researchers have, for many years, experimented with monkeys, implanting electrodes into their brains and watching as the primates perform simple tasks without physical input. But while that may all be well and good for those of us who’ll consent to such an incredibly invasive procedure, the real magic will happen when we’re able to monitor brain functions without going inside the skull.

A team at Microsoft Research – with input from several universities – has been working on these very issues for some time now, reporting highly accurate results from volunteers wearing non-invasive electroencephalography (EEG) caps and sensor-packed armbands that measure muscle activity. To which we say: Thank you, monkeys, for handling the really icky part.

Monkey Controlling Bionic Arm


LCD, OLED and 3D Displays

As for displays, the distant future isn’t quite so clear. Certainly, we know that LCD technology has done wonders for our eyes, our desktop space, and our energy consumption when compared to old school CRTs. Yet even now, LCD appears ultimately doomed, ousted by something called OLED (Organic Light Emitting Diode).

In the works for many years, but only now beginning to appear on the market, OLED has several advantages over LCD, including the capacity for much thinner screens (so thin, in fact, that some can be rolled up and taken with you), far greater energy efficiency (no backlighting is required), and brighter, more vivid images. Expect a full roster of OLED computer displays in the coming years.

But by 2050, we will have moved past OLED and potentially into the world of interactive 3D displays. 3D displays are intriguing because they project images into open space, much like Leia’s holographic message in the original Star Wars movie. Moreover, some forms of the technology will likely support interactivity. The technology is sketchy at present, and there is some question as to whether it’s a quirky sidestep, but there certainly are enough proponents and developers. Some approaches involve 3D glasses, and therefore should be written off immediately, and most sport curious names such as volumetric, stereogram, and parallax.


Is the Desktop PC Dead?

All this speculation, and we still haven’t touched upon what is arguably the most important question of all: Will the venerable, traditional PC be dead and dormant by the year 2050?

Given all that we’ve looked at above, chances are it will. Indeed, chances are that people in 2050 will look back at the big, clunky cases and spider’s web of cables of today’s towers and mini-towers and chuckle in much the same way we now look back in wonderment at those monstrous AM floor radios of the mid-20th century.   

But what will replace it? Will we, as some futurists predict, become a nation of handheld computer users? Probably, though some say even today’s handhelds will look archaic next to the super-thin wearable computers we may have at our disposal by 2050. Yet there will always be a need for something slightly more substantial – if not for keyboards, which may be quite dispensable long before that, then for the big, beautiful (but energy-efficient) displays we’ll always crave.


Tomorrow’s Connected Home

Imagine this scenario. You arrive home with your always-connected, quantum-powered portable computing device attached to your body or clothing. Your ultra-thin 40-inch display – or perhaps your fully realized 3D display – recognizes your approach, automatically switches on, says hello, and listens for your verbal instructions. Alternately, it may wait for you to slip on some form of brainwave-measuring headset so you need merely think your instructions.

In either case, you advise your PCD that you want to continue working on the presentation you’d begun earlier in the day. Hundreds of times more powerful than today’s fastest desktop, it obliges by retrieving the presentation from a remote, online storage center and sending it, wirelessly of course, to your display. The entire operation takes no more than a second or two, and you’re soon yapping away comfortably, completing your presentation merely by speaking in plain English. If your display is of the 3D variety, you may reach out occasionally to manipulate portions of it by hand.

When you’re finished, your PCD recognizes you’ve made a critical mistake along the way. It is, after all, capable of independent "thought" and has been trained to understand the way you work. It alerts you, helps you rethink your error, and congratulates you when you’ve perfected your presentation.

Feeling celebratory, you grab a cold one from the refrigerator and prepare for a romantic evening with your significant other – a robotic consort. Yep, we’ll likely have those too when 2050 rolls around, though that’s an entirely different subject we’ll save for another tale…

Christopher Nickson
Former Digital Trends Contributor
Chris Nickson is a journalist who's written extensively about music and related fields. He's the author of more than 30…