
IBM takes one small step for quantum mechanics, one giant leap for computing

Image Credit: IBM
IBM has announced that it has surmounted one of the biggest hurdles on the road toward creating the world’s first truly usable quantum computer.

A number of analysts have predicted that the jump from traditional computing to quantum chips could be on par with the revolution we saw when the world moved from vacuum tubes to integrated circuits back in the early sixties.


The reason for this increased power is that quantum computers can, in principle, work through vastly more calculations at once than traditional CPUs: instead of a transistor sitting in one of just two states, on or off, a quantum bit can be in both at the same time.


How is that possible? Well, while the specifics of the mechanism involve a bit more math than I could sit through in college, at its essence the computer is taking advantage of a quantum phenomenon known as “superposition,” wherein a quantum system can exist in multiple states at once rather than settling into just one.

In short, this means that, at least in theory, a string of quantum bits (or “qubits”) can represent and work through exponentially more states at once than the same number of ordinary bits. This has made the race to create the world’s first true quantum computer something of a Holy Grail quest for big chip makers, who have found themselves inching closer to maxing out Moore’s Law as 22-nanometer transistors shrink to 14nm, and 14nm tries to make the jump to 10nm.
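To make that scaling concrete, here’s a minimal sketch in plain Python (using NumPy for illustration, not any of IBM’s actual hardware or tooling) of how the description of a quantum register grows: n qubits in an equal superposition are tracked by 2^n complex amplitudes, whereas n ordinary bits hold just one of those values at a time.

```python
# Illustrative sketch only: n qubits are described by 2**n complex amplitudes,
# and a simple gate (a Hadamard on each qubit) puts all of those basis states
# into play at once. This is a toy state-vector picture, not real hardware.
import numpy as np

def uniform_superposition(n_qubits: int) -> np.ndarray:
    """State vector of n qubits after a Hadamard on each qubit:
    an equal superposition over all 2**n basis states."""
    dim = 2 ** n_qubits
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

for n in (1, 2, 8, 16):
    state = uniform_superposition(n)
    print(f"{n:>2} qubits -> state vector with {state.size} amplitudes")
# 1 qubit needs 2 amplitudes, 2 need 4, 8 need 256, 16 need 65,536:
# the description grows exponentially with every qubit added.
```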

So far we’ve seen just one company pull out in front of the herd with its own entry, D-Wave, which first debuted all the way back in 2013. Unfortunately for futurists, the D-Wave machine is more a proof of concept that quantum computing is possible than a system that’s meaningfully faster than what we have to work with today.

IBM has found a way for quantum computers to check their own errors, overcoming quantum decoherence.

Now, though, according to a statement released by IBM Research, it seems Big Blue may have found a way around one of the biggest hurdles in quantum computing by tackling a problem known as “quantum decoherence.”

Decoherence is a stumbling block that quantum computers run into when there’s too much “noise” surrounding a chip, either from heat, radiation, or internal defects. The systems that support quantum chips are incredibly sensitive pieces of machinery, and even the slightest bit of interference can make it impossible to know whether or not the computer was able to successfully figure out that two plus two equals four.

IBM was able to address this by upping the number of qubits laid out on a lattice grid to four instead of two, letting the computer catch these errors by running checks against itself and automatically correcting for any discrepancy in the results.

In layman’s terms, this means that researchers can accurately track the quantum state of a qubit without altering the result through the act of observation alone.
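For a rough sense of how extra check measurements can reveal an error without reading the data directly, here’s a toy classical analogue in Python. To be clear, this is not IBM’s four-qubit protocol (real quantum codes must also handle phase-flip errors, which have no classical counterpart); it just shows the parity-check idea in its simplest form.

```python
# Toy classical analogue (not IBM's actual quantum protocol): store a logical
# bit redundantly and measure only parities. The parities (the "syndrome")
# reveal whether a flip happened, and where, without reading the data bits.
import random

def encode(bit: int) -> list[int]:
    """Three-bit repetition code: the logical bit is copied across three data bits."""
    return [bit, bit, bit]

def syndrome(data: list[int]) -> tuple[int, int]:
    """Two parity checks; (0, 0) means no single-bit error was detected."""
    return (data[0] ^ data[1], data[1] ^ data[2])

def correct(data: list[int]) -> list[int]:
    """Use the syndrome to locate and undo a single bit flip."""
    flip_at = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(data))
    if flip_at is not None:
        data[flip_at] ^= 1
    return data

codeword = encode(1)
codeword[random.randrange(3)] ^= 1      # noise flips one bit at random
print("syndrome:", syndrome(codeword))  # non-zero, so an error was detected
print("recovered:", correct(codeword))  # back to [1, 1, 1]
```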

“Quantum computing could be potentially transformative, enabling us to solve problems that are impossible or impractical to solve today,” said Arvind Krishna, senior vice president and director of IBM Research, in a statement.

While that may not sound huge, it’s still a big step in the right direction for IBM. The company believes the quantum revolution could be a potential savior for the supercomputing industry, a segment that’s projected to be hit hardest by the looming slowdown of Moore’s Law.

Other possible applications up for grabs include solving complex physics problems beyond our current understanding, testing drug combinations by the billions at a time, and creating unbreakable encryption through the use of quantum cryptography.
