
Computers can’t keep shrinking, but they’ll keep getting better. Here’s how

Why are modern computers so much better than old ones? One big reason is the enormous advance in microprocessor power over the past several decades. Roughly every two years, the number of transistors that can be squeezed onto an integrated circuit doubles.


This trend was first spotted in 1965 by Intel co-founder Gordon Moore, and is popularly referred to as “Moore’s Law.” The results have propelled technology forward and transformed it into a trillion-dollar industry, in which unimaginably powerful chips can be found in everything from home computers to autonomous cars to smart household devices.

But Moore’s Law may not be able to go on indefinitely. The high tech industry might love its talk of exponential growth and a digitally-driven “end of scarcity,” but there are physical limits to the ability to continually shrink the size of components on a chip.

What is Moore’s Law?

Moore’s Law is an observation made by Intel co-founder Gordon Moore in 1965. It states that roughly every two years, the number of transistors that can be squeezed onto an integrated circuit doubles.

Already, the billions of transistors on the latest chips are invisible to the human eye. If Moore’s Law were to continue through 2050, engineers would have to build transistors from components smaller than a single atom of hydrogen. It’s also increasingly expensive for companies to keep up: building fabrication plants for new chips costs billions of dollars.
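To see why that 2050 projection gets absurd, the compounding is easy to sketch. Here is a back-of-the-envelope model in Python, taking Intel's 4004 from 1971 (roughly 2,300 transistors) as a baseline with a two-year doubling period (both are my illustrative assumptions, not figures from this article):

```python
def projected_transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project transistor counts under a Moore's Law-style doubling.

    Baseline: Intel's 4004 (1971, ~2,300 transistors), an assumption
    chosen for illustration, with a two-year doubling period.
    """
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Compounding runs away quickly: 50 years is 25 doublings,
# multiplying the baseline by about 33 million.
for year in (1971, 1991, 2011, 2021):
    print(year, f"{projected_transistors(year):,.0f}")
```

Extend the loop to 2050 and the model demands components far smaller than atoms, which is the heart of the problem.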

As a result of these factors, many people predict that Moore’s Law will peter out sometime in the early 2020s, when chips feature components only around 5 nanometers in size. What happens after that? Does technological progress grind to a halt, as though we were stuck today using the same Windows 95 PC we owned a couple of decades ago?

Not really. Here are seven reasons why the end of Moore’s Law won’t mean the end of computing progress as we know it.

Moore’s Law won’t end ‘just like that’

Imagine the disaster that would befall us if, tomorrow, the laws of thermodynamics or Newton’s three laws of motion ceased to function. Moore’s Law, despite its name, isn’t a universal law of this kind. Instead, it’s an observable trend, like the fact that Michael Bay tends to release a new Transformers movie in the summer — except, you know, good.

Two Intel 8080 chips from the 1970s (top-left), the Intel 486 and Pentium from 1989 and 1992 (top-right), the Dual-Core Xeon Processor 5100 from 2006, and the i7 8th Generation from 2017.

Why do we bring this up? Because Moore’s Law isn’t going to end like someone turning off gravity. Just because transistor counts no longer double every couple of years doesn’t mean that progress will come to a complete stop. It just means improvements will arrive a little more slowly.

Picture it like oil. We’ve already extracted the easy-to-reach stuff near the surface; now we need technologies like fracking to get at the tougher-to-reach resources.

Better algorithms and software

Think of those NFL or NBA stars who make so much money that they don’t have to worry about making their existing savings last longer. That’s a slightly messy, but still pertinent, metaphor for the relationship between Moore’s Law and software.

Squeezing more performance out of the same chips will become a much higher priority.

While there’s beautifully coded software out there, programmers often haven’t had to worry much about streamlining their code, because they could count on next year’s processors running it faster anyway. If Moore’s Law no longer delivers those gains, however, that approach can’t be relied upon.

Squeezing more software performance out of the same chips will therefore become a much higher priority. For speed and efficiency, that means creating better algorithms. Beyond speed, hopefully it will also mean more elegant software, with a greater focus on user experience, look and feel, and quality.

Even if Moore’s Law were to end tomorrow, optimizing today’s software would still provide years, if not decades, of growth — even without hardware improvements.
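The kind of algorithmic win this section describes is easy to make concrete. A toy sketch in Python (function and variable names are mine, purely for illustration): finding a duplicate by comparing every pair of items is quadratic, while remembering what you've seen in a set is linear, and no new hardware is required.

```python
def has_duplicate_slow(items):
    """O(n^2): compare every pair. Fine on fast hardware, painful at scale."""
    return any(items[i] == items[j]
               for i in range(len(items))
               for j in range(i + 1, len(items)))

def has_duplicate_fast(items):
    """O(n): track what we've already seen. A better algorithm beats a faster chip."""
    seen = set()
    for item in items:
        if item in seen:
            return True
        seen.add(item)
    return False
```

On a million items, the first version does roughly half a trillion comparisons; the second does a million set lookups. That gap dwarfs any single generation of processor improvement.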

More specialized chips

With that said, one way for chip designers to overcome the slowing down of advances in general-purpose chips is to make ever more specialized processors instead. Graphics processing units (GPUs) are just one example of this. Custom specialized processors can also be used for neural networks, computer vision for self-driving cars, voice recognition, and Internet of Things devices.

As Moore’s Law slows, chipmakers will ramp up production of specialized chips. GPUs, for example, are already a driving force for computer vision in autonomous cars and vehicle-to-infrastructure networks.

These special designs can boast a range of improvements, such as greater levels of performance per watt. Companies jumping on this custom bandwagon include market leader Intel, Google, Wave Computing, Nvidia, IBM, and more.

Just like better programming, the slowdown in manufacturing advances compels chip designers to be more thoughtful when it comes to dreaming up new architectural breakthroughs.
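One way to get a feel for why data-parallel chips like GPUs pay off is the style of code they favor: the same operation applied across a whole array at once, rather than element by element. A rough CPU-side sketch using NumPy (the pixel-brightening task and all names here are hypothetical, not from the article):

```python
import numpy as np

def brighten_loop(pixels, amount):
    """Scalar style: one value at a time, the general-purpose CPU pattern."""
    out = []
    for p in pixels:
        out.append(min(p + amount, 255))
    return out

def brighten_vec(pixels, amount):
    """Vectorized style: one operation over the whole array at once,
    the access pattern GPUs and other data-parallel chips are built for."""
    return np.minimum(np.asarray(pixels) + amount, 255)
```

NumPy only hints at the idea on a CPU; a GPU runs the same uniform operation across thousands of elements simultaneously, which is why it wins so decisively on workloads shaped like this.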

It’s no longer just about the chips

Moore’s Law was born in the mid-1960s, a quarter-century before computer scientist Tim Berners-Lee invented the World Wide Web. While the theory has held true ever since then, there’s also less need to rely on localized processing in an age of connected devices. Sure, a lot of the functions on your PC, tablet, or smartphone are processed on the device itself, but a growing number aren’t.

With cloud computing, a lot of the heavy lifting can be carried out elsewhere.

Cloud computing means that much of the heavy lifting for big computational problems can be carried out in large data centers, using massively parallel systems with many, many times the transistors of a single computer. That’s especially true for A.I.-intensive tasks, such as the smart assistants we use on our devices.

By having this processing carried out elsewhere, with the answer delivered back to your local machine once it’s calculated, devices can keep getting smarter without needing a processor upgrade every couple of years.
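The divide-the-work-and-aggregate pattern behind those data centers can be sketched in a few lines. In this sketch, local threads stand in for remote machines; the function names and the squaring workload are my own illustrative choices:

```python
from concurrent.futures import ThreadPoolExecutor

def heavy_compute(chunk):
    """Stand-in for work a data center would perform remotely."""
    return sum(x * x for x in chunk)

def offload(data, workers=4):
    """Split the job into chunks, farm the pieces out, aggregate the answers.

    In a real cloud setup each chunk would travel over the network to a
    remote machine; here a thread pool plays that role.
    """
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(heavy_compute, chunks))
```

The client device only assembles the final answer, which is why a modest phone can front for a smart assistant backed by racks of servers.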

New materials and configurations

Silicon Valley earned its name for a reason, but researchers are busy investigating future chips that could be made of materials other than silicon.

For example, Intel is doing some impressive work with transistors built upward in 3D structures rather than lying flat, as a way to pack more transistors into the same area. Other materials, such as compounds of elements from the third and fifth columns of the periodic table (so-called III-V semiconductors), could take over from silicon because electrons move through them more easily.

Right now, it’s not clear whether these substances will be scalable or affordable, but given the combined expertise of the tech industry’s finest — and the incentive that will go along with it — the next semiconductor material could be out there waiting.

Quantum computing

Quantum computing is probably the most “out there” idea on this list. It’s also the second most exciting. Quantum computers are, right now, an experimental and very expensive technology. They are a different animal from the binary digital electronic computers we know, which are based on transistors.

IBM Research

Instead of encoding data into bits that are either 0 or 1, quantum computing deals in quantum bits (qubits), which can be 0, 1, or a superposition of both at once. Long story short? These superpositions could make quantum computers much faster and more efficient than today’s mainstream machines.
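For a single qubit, the superposition idea is concrete enough to simulate classically. A minimal sketch in Python, assuming the standard state-vector picture: a qubit is a pair of amplitudes for 0 and 1, measurement probabilities are the squared magnitudes, and a Hadamard gate turns a definite 0 into an equal superposition.

```python
import math

def measure_probs(alpha, beta):
    """Probabilities of reading 0 or 1 are the squared amplitude magnitudes."""
    return abs(alpha) ** 2, abs(beta) ** 2

def hadamard(alpha, beta):
    """The Hadamard gate mixes the two amplitudes equally."""
    s = 1 / math.sqrt(2)
    return s * (alpha + beta), s * (alpha - beta)

alpha, beta = hadamard(1, 0)          # start in a definite 0 state
p0, p1 = measure_probs(alpha, beta)   # roughly 0.5 each: a 50/50 superposition
```

Simulating one qubit is trivial; the catch is that n qubits need 2^n amplitudes, which is exactly why real quantum hardware could outrun classical machines.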

Making quantum computers carries plenty of challenges (they need to be kept incredibly cold for one thing). However, if engineers can crack this problem we may be able to trigger enormous progress at a pace so rapid it would make Gordon Moore’s head spin.

Stuff we can’t think of yet

Very few people would have predicted smartphones back in the 1980s. The idea that Google would become the giant that it is or that an e-commerce website like Amazon would be on track to become the first $1 trillion company would have sounded crazy at the start of the 1990s.

The point is that, when it comes to the future of computing, we’re not going to claim to know exactly what’s around the corner. Yes, right now quantum computing looks like the big long-term computing hope post-Moore’s Law, but chances are that in a few decades computers will look entirely different from the ones we use today.

Whether it’s new configurations of machines, chips made out of entirely new materials, or new types of subatomic research that open up new ways of packing transistors onto chips, we believe the future of computing — with all the ingenuity it involves — will be A-okay.

Luke Dormehl
Former Digital Trends Contributor
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…