
Nvidia researchers use artificial intelligence to upgrade your game’s graphics

At the Neural Information Processing Systems (NeurIPS) conference in Montreal, Canada, Nvidia researchers demonstrated that they could use the power of artificial intelligence to render synthetic yet realistic scenes, complete with detail and texture. The researchers say the work is still in its early stages, and it is unclear when the technology will reach consumers, but there is big potential for Nvidia’s artificial intelligence-driven rendering in the gaming space.

“This work is about a new way of rendering computer graphics using neural networks,” Nvidia’s Vice President of Applied Deep Learning Bryan Catanzaro said in a conference call. Essentially, the researchers wanted to know how A.I. could be applied to make computer graphics better, and their answer is to apply machine learning to real-world videos and use what the model learns to render new graphics.


“We’ve built a system that takes high-level representations of the physical world — basically taking a video sketch and convert that into a rendered scene,” Catanzaro said. “The model understands high-level information of objects in the real world, and then elaborate those to add texture and lighting information. The goal is to be able to synthesize new scenes with graphics.”
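As a rough illustration of the idea Catanzaro describes, here is a minimal PyTorch sketch of a conditional generator that takes a semantic “video sketch” (a per-pixel label map saying where the road, cars, and trees are) and elaborates it into an RGB frame. The class name, layer sizes, and label count are hypothetical stand-ins for illustration, not Nvidia’s actual model, which is far larger and also enforces consistency between frames.

```python
# Minimal sketch (not Nvidia's model): map a semantic label map to an RGB frame.
import torch
import torch.nn as nn

NUM_CLASSES = 8  # hypothetical label count (road, car, tree, building, ...)

class SketchToFrame(nn.Module):
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        # Encoder: compress the label "sketch" into a feature representation.
        self.encoder = nn.Sequential(
            nn.Conv2d(num_classes, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(),
        )
        # Decoder: "elaborate" texture and lighting back up to an RGB image.
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 3, 4, stride=2, padding=1), nn.Tanh(),
        )

    def forward(self, label_map: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(label_map))

# One-hot label map for a single 256x256 toy "video sketch" frame.
sketch = torch.zeros(1, NUM_CLASSES, 256, 256)
sketch[:, 0] = 1.0                    # everything labeled class 0 in this toy frame
frame = SketchToFrame()(sketch)       # -> (1, 3, 256, 256) RGB frame in [-1, 1]
print(frame.shape)
```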


Machine learning is used to analyze existing videos, with Nvidia applying computer vision techniques to label objects and their properties. This means the A.I. can recognize an urban cityscape and understand which objects are trees, cars, or buildings, for example.
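As a hedged sketch of that labeling step, the snippet below runs an off-the-shelf semantic segmentation network from torchvision over a frame to produce per-pixel class labels. The choice of DeepLabV3 is an assumption for illustration only; the article does not say which computer vision models Nvidia used internally.

```python
# Label each pixel of a frame with an off-the-shelf segmentation model
# (assumption: torchvision's DeepLabV3, not Nvidia's internal tooling).
import torch
from torchvision.models.segmentation import deeplabv3_resnet50

model = deeplabv3_resnet50(weights="DEFAULT").eval()

frame = torch.rand(1, 3, 512, 512)            # stand-in for a real video frame
with torch.no_grad():
    logits = model(frame)["out"]              # (1, num_classes, 512, 512)
label_map = logits.argmax(dim=1)              # per-pixel class IDs
print(label_map.shape, label_map.unique())    # which object classes were found
```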


This technology is derived from existing research, like that from the University of California, Berkeley, according to Nvidia. The company has shown off other A.I.-based rendering techniques in the past, including one that would remove noise from an image.

Researchers were able to achieve real-time rendering on a GPU with Tensor Cores; for the conference, Nvidia demonstrated the technology on its Titan V card. “Though you can do this on any processor, the real-time aspect does require a lot of A.I. throughput,” Catanzaro explained, noting that a Tensor Core GPU is important.
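To make the throughput point concrete, here is a minimal sketch of how one might compare rendering speed with and without FP16 autocast, which is what engages the Tensor Cores on cards like the Titan V. The tiny stand-in network, frame size, and iteration count are hypothetical; a real model of this kind would be far heavier.

```python
# Rough frames-per-second comparison, FP32 vs. FP16 autocast (Tensor Cores).
import time
import torch
import torch.nn as nn

def frames_per_second(model: nn.Module, sketch: torch.Tensor,
                      use_fp16: bool, iters: int = 50) -> float:
    """Time repeated forward passes and return frames per second."""
    with torch.no_grad(), torch.autocast("cuda", enabled=use_fp16):
        torch.cuda.synchronize()
        start = time.time()
        for _ in range(iters):
            model(sketch)
        torch.cuda.synchronize()
    return iters / (time.time() - start)

if torch.cuda.is_available():
    # Tiny stand-in generator; Nvidia's network would be far larger.
    model = nn.Sequential(
        nn.Conv2d(8, 64, 3, padding=1), nn.ReLU(),
        nn.Conv2d(64, 3, 3, padding=1),
    ).cuda().eval()
    sketch = torch.zeros(1, 8, 256, 256, device="cuda")
    print(f"FP32: {frames_per_second(model, sketch, False):.1f} fps")
    print(f"FP16: {frames_per_second(model, sketch, True):.1f} fps")
```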

By making it possible to create content and add it to virtual worlds, this research could greatly benefit the gaming market. Developers, for example, could “remaster” old titles by re-rendering them with high-definition visuals, or they could add new levels to existing games with little effort.

As a basic example of how this could work, a player could take a video of themselves, upload it to a game, and the renderer could create a highly personalized avatar for use in that game. Nvidia has already open-sourced the code from its research, but Catanzaro cautions that this early work is better suited for computer scientists than game developers.

In a separate demo, Nvidia showed that it could analyze dance moves, like those from the popular music video Gangnam Style, and transfer them to another person using the same computer vision techniques. “We analyze the pose of the person, and that becomes our sketch,” Catanzaro said. “And the model renders the target person given that sketch.”
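As a rough sketch of the first half of that pipeline, the snippet below extracts a person’s pose keypoints from a frame using torchvision’s off-the-shelf Keypoint R-CNN; that stick-figure “sketch” is what a generator would then be asked to render as the target person. The specific detector is an assumption for illustration, not the tooling Nvidia used.

```python
# Extract a person's pose keypoints, i.e. the "sketch" for motion transfer
# (assumption: torchvision's Keypoint R-CNN as the pose detector).
import torch
from torchvision.models.detection import keypointrcnn_resnet50_fpn

detector = keypointrcnn_resnet50_fpn(weights="DEFAULT").eval()

frame = torch.rand(3, 480, 640)                # stand-in for a video frame
with torch.no_grad():
    detections = detector([frame])[0]          # one result dict per input image

if len(detections["keypoints"]) > 0:
    pose = detections["keypoints"][0]          # (17, 3): x, y, visibility
    print("pose sketch keypoints:", pose[:, :2])
```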

Nvidia cautions that this is still an early-stage technology, but developers could also train a model on real-world imagery and use it to re-render an existing game so its graphics look better. Because the technology requires the computer to analyze objects that are known in the real world, Catanzaro cautions that it won’t work on fantastical content, like rendering Santa’s elves. In theory, he admits, you could train the computer to render elves, but you would need to create physical elves and capture images of them for the computer to learn from.

Much like the ray tracing technology introduced on Nvidia’s recent consumer RTX series graphics cards, the company said, this A.I.-based rendering is intended to work in a hybrid way and isn’t meant to replace traditional rendering techniques. Instead, A.I.-based rendering is meant to coexist with, and be used alongside, traditional graphics rendering engines.

Right now, it’s unclear when this technology will hit the gaming market. It could take as little as a couple of years, Catanzaro optimistically speculated. Combined with ray tracing, A.I.-based scene rendering could deliver high-quality visuals in games rendered in real time.
