
AMD’s multi-chiplet GPU design might finally come true

RX 7900 XTX installed in a test bench.
Jacob Roach / Digital Trends

An interesting AMD patent has just surfaced, and although it was filed a while back, finding it now is all the more exciting because this tech might be closer to appearing in future graphics cards. The patent describes a multi-chiplet GPU with three separate dies, which is something that could both improve performance and cut back on production costs.

In the patent, AMD describes a GPU that's partitioned into multiple dies, which it calls GPU chiplets. These chiplets can either function together as a single GPU or operate as multiple independent GPUs in what AMD calls the "second mode." The GPU has three modes in total, the first of which makes all the chiplets work together as a single, unified GPU. This lets them share resources and, as Tom's Hardware notes, allows the front-end die to handle command scheduling for all the shader engine dies, much like a regular, non-chiplet GPU would.


The second mode is where it gets interesting. In this mode, every chiplet counts as an independent GPU. Each one handles its own task scheduling within its shader engines and doesn't interfere with the other chiplets. Finally, the third mode is a mix of the two, where some chiplets operate as standalone GPUs while others are grouped to function together as one.
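The three operating modes described above can be sketched in a few lines of code. This is purely an illustration of the partitioning concept from the patent, not AMD's actual implementation; the mode names, die names, and function are all hypothetical.

```python
from enum import Enum

class PartitionMode(Enum):
    UNIFIED = 1      # mode 1: all chiplets act as one GPU, front-end die schedules for all
    INDEPENDENT = 2  # mode 2: each chiplet is its own GPU with its own task scheduling
    HYBRID = 3       # mode 3: some chiplets grouped together, others standalone

def enumerate_gpus(chiplets, mode, groups=None):
    """Return the logical GPUs exposed to software for a given mode (illustrative only)."""
    if mode is PartitionMode.UNIFIED:
        return [tuple(chiplets)]                 # one GPU containing every die
    if mode is PartitionMode.INDEPENDENT:
        return [(c,) for c in chiplets]          # one GPU per die
    return [tuple(g) for g in (groups or [])]    # HYBRID: caller supplies the grouping

chiplets = ["front_end_die", "shader_die_0", "shader_die_1"]
print(enumerate_gpus(chiplets, PartitionMode.UNIFIED))      # one logical GPU
print(enumerate_gpus(chiplets, PartitionMode.INDEPENDENT))  # three logical GPUs
```

The point of the flexibility is the same one the patent makes: the same silicon can be exposed as one big GPU or several smaller ones depending on the workload.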

The design of an AMD multi-chiplet GPU.
AMD

As mentioned, this patent is not new: it was filed on December 8, 2022, just after AMD released the RX 7900 XTX and the RX 7900 XT. Although leakers have predicted for at least a generation or two that AMD might go down the multi-chiplet route, this architecture is currently only used in AMD's data center GPUs. AMD has already dipped its toes into similar tech with RDNA 3, though, with a design that paired a graphics compute die (GCD) with multiple memory cache dies (MCDs) for the memory interface.


There are tangible benefits to switching to this type of architecture, as per the patent: "By dividing the GPU into multiple GPU chiplets, the processing system flexibly and cost-effectively configures an amount of active GPU physical resources based on an operating mode." If it turns out to be cheaper to produce these GPUs than to keep making increasingly large monolithic dies, we might start seeing this design outside the data center and in the GPUs we all use in our own computers.

Early leaks about RDNA 4 graphics cards teased AMD going with a full multi-chiplet design, and it’s easy to imagine that the final result could have resembled what we see in the patent. However, with the news that AMD is sticking to midrange graphics cards in this next generation, any hope of a multi-chiplet GPU seems lost for now. Perhaps we’ll see this design come to life in RDNA 5.

Monica J. White
Monica is a computing writer at Digital Trends, focusing on PC hardware. Since joining the team in 2021, Monica has written…