
Nvidia researchers use artificial intelligence to upgrade your game’s graphics

At the Neural Information Processing Systems (NeurIPS) conference in Montreal, Canada, Nvidia researchers demonstrated that they could use artificial intelligence to render synthetic yet realistic scenes, complete with detail and texture. The researchers say the work is still in its early stages, and it is unclear when the technology will reach consumers, but Nvidia's A.I.-driven rendering has big potential in the gaming space.

“This work is about a new way of rendering computer graphics using neural networks,” Nvidia’s Vice President of Applied Deep Learning Bryan Catanzaro said in a conference call. In essence, the researchers wanted to know how A.I. could make computer graphics better, and their answer was to train machine learning models on real-world video and use them to render new graphics.


“We’ve built a system that takes high-level representations of the physical world — basically taking a video sketch and converting that into a rendered scene,” Catanzaro said. “The model understands high-level information about objects in the real world, and then elaborates on those to add texture and lighting information. The goal is to be able to synthesize new scenes with graphics.”
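The pipeline Catanzaro describes is, at its core, an image-to-image translation network: a label-map "sketch" goes in, and a textured, lit frame comes out. Below is a minimal, hypothetical sketch of that idea in PyTorch, a toy pix2pix-style encoder-decoder, not Nvidia's actual model, which is far larger and enforces temporal consistency across video frames.

```python
# Minimal, illustrative sketch: a tiny pix2pix-style generator that
# maps a semantic label map (the "video sketch") to an RGB frame.
# Nvidia's production model is far larger and adds temporal consistency.
import torch
import torch.nn as nn

class SketchToFrame(nn.Module):
    def __init__(self, num_labels=35):  # 35 classes, Cityscapes-style
        super().__init__()
        self.net = nn.Sequential(
            # Encoder: compress the one-hot label map
            nn.Conv2d(num_labels, 64, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            # Decoder: expand back into a 3-channel image
            nn.ConvTranspose2d(128, 64, kernel_size=4, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(64, 3, kernel_size=4, stride=2, padding=1),
            nn.Tanh(),  # RGB values in [-1, 1]
        )

    def forward(self, label_map):
        return self.net(label_map)

# One-hot label map for a single 256x512 frame -> rendered frame
labels = torch.zeros(1, 35, 256, 512)
frame = SketchToFrame()(labels)
print(frame.shape)  # torch.Size([1, 3, 256, 512])
```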

Image: Nvidia

Machine learning is used to analyze existing videos, with Nvidia applying computer vision techniques to label the objects in each frame and their properties. This means the A.I. can look at an urban cityscape and understand which objects are trees, cars, or buildings, for example.
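To make that labeling step concrete, here is a hedged sketch using an off-the-shelf segmentation model from torchvision. This is a stand-in for whatever internal pipeline Nvidia actually uses, and the input file name is hypothetical.

```python
# Hedged sketch: producing a per-pixel label map with an off-the-shelf
# torchvision segmentation model (a stand-in for Nvidia's own pipeline).
import torch
from torchvision import models, transforms
from PIL import Image

model = models.segmentation.deeplabv3_resnet50(weights="DEFAULT").eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

img = Image.open("street_scene.jpg").convert("RGB")  # hypothetical input frame
batch = preprocess(img).unsqueeze(0)

with torch.no_grad():
    scores = model(batch)["out"]   # per-pixel class scores
label_map = scores.argmax(dim=1)   # integer class per pixel (e.g. 7 = car)
```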


This technology is derived from existing research, like that from the University of California, Berkeley, according to Nvidia. The company has shown off other A.I.-based rendering techniques in the past, including one that would remove noise from an image.

The researchers achieved real-time rendering on a Tensor Core GPU; for the conference, Nvidia demonstrated the technology on its Titan V card. “Though you can do this on any processor, the real-time aspect does require a lot of A.I. throughput,” Catanzaro explained, noting that Tensor Cores are key to supplying it.
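That throughput largely means fast half-precision matrix math, which Tensor Cores accelerate. A hedged illustration using PyTorch's automatic mixed precision, with a stand-in convolutional model rather than Nvidia's actual renderer:

```python
# Hedged sketch: running a conv-heavy model in FP16 via automatic mixed
# precision, the kind of "A.I. throughput" Tensor Core GPUs accelerate.
# Requires a CUDA-capable GPU.
import torch
import torch.nn as nn

# Stand-in for the rendering network; any convolution-heavy model benefits
model = nn.Sequential(
    nn.Conv2d(35, 64, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 3, kernel_size=3, padding=1),
).cuda().eval()

labels = torch.zeros(1, 35, 1024, 2048, device="cuda")  # full-res label map

with torch.no_grad(), torch.autocast(device_type="cuda", dtype=torch.float16):
    frame = model(labels)  # convolutions execute in FP16 on Tensor Cores
print(frame.dtype)  # torch.float16
```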

The gaming market stands to benefit greatly from this research, which could be used to create content and add it to virtual worlds. Developers, for example, could “remaster” old games by re-rendering them with high-definition visuals, or add new levels to existing games with little effort.

As a basic example of how this would work, users could take a video of themselves, upload it to a game, and the renderer could create a highly personalized avatar for use in that game. Nvidia has already open-sourced the code from its research, but Catanzaro cautions that this early work is better suited to computer scientists than to game developers.

In a separate demo, Nvidia showed that it could analyze dance moves — like those from the popular music video Gangnam Style — and transfer them to another person using the same computer vision techniques. “We analyze the pose of the person, and that becomes our sketch,” Catanzaro said. “And the model renders the target person given that sketch.”
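In that demo, the "sketch" is a pose skeleton rather than a label map. A hedged sketch of extracting one with torchvision's Keypoint R-CNN, a stand-in for whatever pose estimator the demo used; the file name is hypothetical:

```python
# Hedged sketch: extracting a pose "sketch" (COCO keypoints) from a video
# frame with torchvision's Keypoint R-CNN, a stand-in pose estimator.
import torch
from torchvision import models
from torchvision.transforms.functional import to_tensor
from PIL import Image

detector = models.detection.keypointrcnn_resnet50_fpn(weights="DEFAULT").eval()

frame = Image.open("dancer_frame.jpg").convert("RGB")  # hypothetical frame

with torch.no_grad():
    preds = detector([to_tensor(frame)])[0]

# 17 keypoints (nose, shoulders, elbows, ...) per detected person; these
# skeletons are the sketch a renderer would fill in with the target
# person's appearance.
keypoints = preds["keypoints"]  # shape: [num_people, 17, 3] = (x, y, visibility)
```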

Nvidia cautions that this is still early-stage technology, but one application is upgrading the graphics of an existing game: train a model on real-world imagery, then use it to re-render old titles so they look better. Because the technique relies on analyzing objects the computer has seen in the real world, Catanzaro cautions that it won’t work on fantastical content, like rendering Santa’s elves. In theory, he admits, you could train the computer to render elves, but you would need physical elves to photograph for the computer to learn from.

Like the ray tracing technology introduced on Nvidia’s recent consumer RTX series graphics cards, the company said, this A.I.-based rendering works in a hybrid fashion and isn’t meant to replace traditional rendering techniques. Instead, it is designed to coexist with, and complement, traditional graphics rendering engines.

Right now, it’s unclear when this technology will hit the gaming market; Catanzaro optimistically speculated it could take as little as a couple of years. Combined with ray tracing, A.I.-based scene rendering could deliver high-quality visuals in titles rendered in real time.

Chuong Nguyen