
IBM’s silicon photonics technology could propel data centers into the future

IBM announced on May 12 that its research lab has designed and tested a fully integrated, wavelength-multiplexed silicon photonics chip, which will allow 100 Gb/s optical transceivers to be manufactured.

The technology will enable data centers to deliver higher data rates and greater bandwidth in the future, which in turn could expand the performance potential of cloud computing and Big Data applications.


Silicon photonics technology allows silicon chips to transmit data using pulses of light rather than electrical signals, enabling rapid data transfer even over long distances.


“Making silicon photonics technology ready for widespread commercial use will help the semiconductor industry keep pace with ever-growing demands in computing power driven by Big Data and cloud services,” said Arvind Krishna, senior vice president and director of IBM Research.

IT systems and cloud platforms typically process large volumes of Big Data, and that isn’t always a fast process. To be efficient, these systems need data to move quickly between all of their components. Through silicon photonics, response times can be reduced and data delivered faster.

“Just as fiber optics revolutionized the telecommunications industry by speeding up the flow of data — bringing enormous benefits to consumers — we’re excited about the potential of replacing electric signals with pulses of light,” Krishna continued.

IBM’s silicon photonics chip uses four different colors of light traveling within a single optical fiber. It can transfer the equivalent of six million images in one second and download an entire high-definition movie in two seconds. The technology also allows various optical components to be integrated side by side on a single chip.
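As a rough sanity check of those figures, here is a minimal back-of-envelope sketch. The 25 Gb/s per wavelength and the roughly 25 GB size of an HD movie are illustrative assumptions, not numbers taken from IBM’s announcement.

```python
# Back-of-envelope check of the throughput claims above.
# Assumptions (not from IBM's announcement): four wavelength channels at
# 25 Gb/s each, and an HD movie of roughly 25 GB (Blu-ray scale).

CHANNELS = 4                    # colors of light multiplexed on one fiber
GBPS_PER_CHANNEL = 25           # assumed per-wavelength data rate

aggregate_gbps = CHANNELS * GBPS_PER_CHANNEL
print(f"Aggregate link rate: {aggregate_gbps} Gb/s")        # 100 Gb/s

movie_gigabytes = 25            # assumed size of an HD movie
movie_gigabits = movie_gigabytes * 8
print(f"Time to move a {movie_gigabytes} GB movie: "
      f"{movie_gigabits / aggregate_gbps:.0f} s")           # ~2 s
```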

In the future, this technology may allow computer hardware to communicate faster and more efficiently. Data centers might also be able to cut space and energy costs while increasing performance for customers. For now, though, it’s just a first step toward light-speed computing.

Krystle Vermes