

Nvidia’s next GPUs will be designed partially by AI

During its GTC 2022 conference, Nvidia talked about using artificial intelligence and machine learning to make future graphics cards better than ever.

As the company prioritizes AI and machine learning (ML), some of these advancements are expected to find their way into the upcoming next-gen Ada Lovelace GPUs.


Nvidia’s big plans for AI and ML in next-gen graphics cards were shared by Bill Dally, the company’s chief scientist and senior vice president of research. He talked about Nvidia’s research and development teams, how they utilize AI and machine learning (ML), and what this means for next-gen GPUs.


In short, using these technologies can only mean good things for Nvidia graphics cards. Dally discussed four major areas of GPU design and the ways in which AI and ML can drastically speed up the design process.


The goal is an increase in both speed and efficiency: in one of Dally's examples, AI and ML cut a standard GPU design task from three hours down to just three seconds.


This is allegedly possible by optimizing four processes that are normally slow and highly detailed.

This refers to monitoring and mapping power voltage drops, anticipating errors through parasitic prediction, automating standard cell migration, and addressing various routing challenges. Using artificial intelligence and machine learning can help optimize all of these processes, resulting in major gains in the end product.

Mapping potential drops in voltage helps Nvidia track the power flow of next-gen graphics cards. According to Dally, switching from standard tools to specialized AI tools speeds this task up drastically, with the new tech completing it in mere seconds.

Dally said that using AI and ML for mapping voltage drops can push accuracy as high as 94% while also tremendously increasing the speed at which these tasks are performed.
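The pattern Dally describes is a surrogate model: run a slow, accurate simulator once to generate training examples, fit a fast learned predictor on them, then query the predictor in milliseconds instead of re-running the simulation. Nvidia's actual tools are not public, so everything below — the linear model, the synthetic power data, the function names — is an invented, heavily simplified sketch of that workflow:

```python
# Hypothetical sketch of a learned surrogate for voltage-drop estimation.
# The "simulator" and its linear physics are invented for illustration only.
import numpy as np

rng = np.random.default_rng(0)

n_regions = 8                              # chip regions whose power draw we sample
true_w = rng.uniform(0.1, 1.0, n_regions)  # hidden per-region sensitivity

def slow_simulator(power_map):
    """Stand-in for an hours-long power-integrity simulation."""
    return power_map @ true_w

# Run the slow simulator once to collect training data...
X_train = rng.uniform(0.0, 2.0, (200, n_regions))
y_train = slow_simulator(X_train)

# ...then fit a fast surrogate (plain least squares here).
w_hat, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

def fast_predictor(power_map):
    """Millisecond-scale estimate used in place of the slow simulation."""
    return power_map @ w_hat

x_new = rng.uniform(0.0, 2.0, n_regions)
drop_estimate = fast_predictor(x_new)
```

The three-hours-to-three-seconds speedup comes from paying the simulation cost up front, during training, rather than on every design iteration.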

Nvidia's slide on automated cell migration. (Image: Nvidia)

Data flow in new chips is an important factor in how well a new graphics card performs. Therefore, Nvidia uses graph neural networks (GNNs) to identify possible issues in data flow and address them quickly.
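A GNN works by letting each node in a graph update its feature vector from its neighbours' features, so structural problems in a circuit graph can be learned rather than hand-analyzed. The toy graph, the weight matrix, and the single message-passing step below are all hypothetical — just the core mechanism, not Nvidia's tooling:

```python
# Minimal sketch of one GNN message-passing step over an invented
# 4-node "dataflow graph" (an edge means data moves between two units).
import numpy as np

A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 1],
              [0, 1, 1, 0]], dtype=float)   # adjacency matrix

H = np.eye(4)                               # initial one-hot node features
rng = np.random.default_rng(0)
W = rng.normal(0.0, 0.5, (4, 4))            # stand-in for a learned weight matrix

def message_pass(A, H, W):
    deg = A.sum(axis=1, keepdims=True)
    agg = (A @ H) / deg                     # mean of each node's neighbour features
    return np.tanh(agg @ W)                 # learned transform + nonlinearity

H1 = message_pass(A, H, W)                  # updated node embeddings
```

Stacking several such steps lets information propagate across the graph, which is what allows a trained model to flag, say, a congested link several hops away.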

Parasitic prediction through the use of AI is another area where Nvidia sees improvements, noting increased accuracy, with simulation error rates dropping below 10%.
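Parasitic prediction means estimating stray capacitance and resistance before the layout exists, so engineers don't have to wait for full extraction. A hypothetical sketch of the idea — the layout features, the "true" capacitance formula, and the noise level are all invented, but the below-10% relative-error check mirrors the metric the article cites:

```python
# Invented example: predict a net's parasitic capacitance from simple
# layout features (wire length, neighbour count), then measure error.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic nets: columns are [wire_length_um, n_adjacent_wires].
feats = rng.uniform(1.0, 10.0, (500, 2))

# Hypothetical "extracted" capacitance (fF): physics stand-in plus noise.
cap_true = (0.2 * feats[:, 0]
            + 0.05 * feats[:, 0] * feats[:, 1]
            + rng.normal(0.0, 0.05, 500))

# Fit a least-squares predictor on engineered features.
X = np.column_stack([feats[:, 0], feats[:, 0] * feats[:, 1]])
w, *_ = np.linalg.lstsq(X, cap_true, rcond=None)

rel_err = np.abs(X @ w - cap_true) / cap_true
print(f"mean relative error: {rel_err.mean():.1%}")
```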

Nvidia has also managed to automate the process of migrating the chip's standard cells, eliminating a lot of manual work and speeding up the whole task. According to Dally, 92% of the cell library was migrated by the tool with no errors.
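The flow the article describes — port each cell automatically, validate it, and send failures back to engineers — can be summarized in a few lines. The cells, the `auto_migrate` tool, and the "design-rule check" here are all invented; only the overall shape of the process is from the article:

```python
# Toy sketch of an automated standard-cell migration pass with fallback
# to manual work. All cell data and checks are hypothetical.
def auto_migrate(cell):
    """Pretend migration: shrink geometry for the new node; odd widths
    stand in for cells that fail a design-rule check."""
    if cell["width"] % 2:
        return None                       # tool gives up on this cell
    return {**cell, "width": cell["width"] // 2}

library = [{"name": f"cell{i}", "width": i} for i in range(100)]

migrated = [m for m in (auto_migrate(c) for c in library) if m is not None]
manual_queue = [c for c in library if auto_migrate(c) is None]

print(f"auto-migrated {len(migrated)} of {len(library)} cells")
```

In Nvidia's case the tool handled 92% of the library error-free, leaving only the remainder for hand-tuning.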

The company is planning to focus on AI and machine learning going forward, dedicating five of its laboratories to researching and designing new solutions in those segments. Dally hinted that we may see the first results of these new developments in Nvidia’s new 7nm and 5nm designs, which include the upcoming Ada Lovelace GPUs. This was first reported by Wccftech.

It’s no secret that the next generation of graphics cards, often referred to as RTX 4000, will be intensely powerful (with power requirements to match). Using AI and machine learning to further the development of these GPUs implies that we may soon have a real powerhouse on our hands.

Monica J. White