

How Intel could use AI to tackle a massive issue in PC gaming

Intel is making a big push into the future of graphics. The company is presenting seven new research papers at Siggraph 2023, an annual graphics conference, one of which tries to address VRAM limitations in modern GPUs with neural rendering.

The paper aims to make real-time path tracing possible with neural rendering. No, Intel isn’t introducing a DLSS 3 rival, but it is looking to leverage AI to render complex scenes. Intel says the “limited amount of onboard memory [on GPUs] can limit practical rendering of complex scenes.” Its answer is a neural level-of-detail representation of objects, which it says achieves compression rates of 70% to 95% compared to “classic source representations, while also improving quality over previous work.”

Ellie holds a gun in The Last of Us Part I.

It sounds similar to Nvidia’s Neural Texture Compression, which Nvidia also introduced through a paper submitted to Siggraph. Intel’s paper, however, looks to tackle complex 3D objects, such as vegetation and hair. It’s applied as a level of detail (LoD) technique, allowing objects to look more realistic from farther away. As we’ve seen recently in games like Redfall, VRAM limitations can cause even close objects to show up with muddy textures and little detail as you pass them.


In addition to this technique, Intel is also introducing an efficient path-tracing algorithm that it says, in the future, will make complex path-tracing possible on mid-range GPUs and even integrated graphics.


Path tracing is essentially the hard way of doing ray tracing, and we’ve already seen it used to great effect in games like Cyberpunk 2077 and Portal RTX. As impressive as path tracing is, though, it’s extremely demanding. You’d need a flagship GPU like the RTX 4080 or RTX 4090 just to run these games at higher resolutions, and that’s with Nvidia’s DLSS Frame Generation trickery enabled.
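To see why path tracing is so demanding: it estimates each pixel’s color by averaging the contributions of many randomly sampled light paths (Monte Carlo integration), and the noise only shrinks with more samples. Here’s a toy sketch of that estimator shape in Python, using a stand-in integrand instead of a real scene (all names are illustrative, not from Intel’s paper):

```python
import random

def estimate_pixel(trace_path, samples: int = 64) -> float:
    """Monte Carlo estimate for one pixel: average many random path samples."""
    total = 0.0
    for _ in range(samples):
        # In a real renderer, trace_path would bounce a ray through the scene;
        # here it just consumes two random numbers as a stand-in.
        total += trace_path(random.random(), random.random())
    return total / samples

# Toy "scene": the integrand x * y over the unit square averages to 0.25,
# so the estimate should converge toward that as samples grow.
random.seed(0)
approx = estimate_pixel(lambda x, y: x * y, samples=100_000)
print(round(approx, 2))  # close to 0.25
```

The cost is the point: halving the noise requires roughly four times as many samples, which is why real-time path tracing currently needs flagship hardware plus upscaling.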

Intel’s paper is introducing a way to make that process more efficient. It’s doing so by introducing a new algorithm that is “simpler than the state-of-the-art and leads to faster performance,” according to Intel. The company is building upon the GGX mathematical function, which Intel says is “used in every CGI movie and video game.” The algorithm reduces this mathematical distribution to a hemispherical mirror that is “extremely simple to simulate on a computer.”

Screenshot of full ray tracing in Cyberpunk 2077. Nvidia

The idea behind GGX is that surfaces are made up of microfacets that reflect and transmit light in different directions. This is expensive to calculate, so Intel’s algorithm essentially reduces the GGX distribution to a simple-to-calculate slope based on the angle of the camera, making real-time rendering possible.

Based on Intel’s internal benchmarks, the algorithm leads to upwards of a 7.5% speedup in rendering path-traced scenes. That may seem like a minor bump, but Intel seems confident that more efficient algorithms could make all the difference. In a blog post, the company says it will demonstrate at Siggraph how real-time path tracing can be “practical even on mid-range and integrated GPUs in the future.”

As for when that future arrives, it’s tough to say. Keep in mind this is a research paper right now, so it might be some time before we see this algorithm widely deployed in games. It would certainly do Intel some favors. Although the company’s Arc graphics cards have become excellent over the past several months, Intel is still focused on mid-range GPUs and integrated graphics, where path tracing isn’t currently possible.

We don’t expect you’ll see these techniques in action anytime soon, though. The good news is that new techniques to push visual quality and performance in real-time rendering keep arriving, which means they should, eventually, show up in games.

Jacob Roach
Lead Reporter, PC Hardware
Jacob Roach is the lead reporter for PC hardware at Digital Trends. In addition to covering the latest PC components, from…