

How Intel could use AI to tackle a massive issue in PC gaming

Intel is making a big push into the future of graphics. The company is presenting seven new research papers at Siggraph 2023, an annual graphics conference, one of which tries to address VRAM limitations in modern GPUs with neural rendering.

The paper aims to make real-time path tracing possible with neural rendering. No, Intel isn’t introducing a DLSS 3 rival, but it is looking to leverage AI to render complex scenes. Intel says the “limited amount of onboard memory [on GPUs] can limit practical rendering of complex scenes.” To get around that, Intel is introducing a neural level-of-detail representation of objects, which it says can achieve compression rates of 70% to 95% compared to “classic source representations, while also improving quality over previous work.”

Ellie holds a gun in The Last of Us Part I.

It’s not dissimilar to Nvidia’s Neural Texture Compression, which Nvidia also introduced through a paper submitted to Siggraph. Intel’s paper, however, looks to tackle complex 3D objects, such as vegetation and hair. It’s applied as a level of detail (LoD) technique for objects, allowing them to look more realistic from farther away. As we’ve seen recently from games like Redfall, VRAM limitations can cause even close objects to show up with muddy textures and little detail as you pass them.
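To make the idea a bit more concrete, here’s a minimal, hypothetical sketch of what a neural level-of-detail lookup could look like. None of the names, network sizes, or structure here come from Intel’s paper; it only illustrates the general principle of storing an asset as a small network whose weights act as a compressed representation, queried per sample at a chosen detail level.

```python
# Hypothetical sketch of a "neural level of detail" lookup.
# Nothing here comes from Intel's paper -- names and structure are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# A tiny MLP standing in for the learned representation of one asset.
# In a real system these weights would be trained offline to reproduce the
# asset's appearance; here they are random placeholders.
W1 = rng.standard_normal((4, 32)) * 0.1   # input: (x, y, z, lod_level)
W2 = rng.standard_normal((32, 32)) * 0.1
W3 = rng.standard_normal((32, 3)) * 0.1   # output: RGB (or any shading attribute)

def query_neural_lod(position, lod_level):
    """Evaluate the compressed representation at a point and detail level."""
    x = np.concatenate([position, [lod_level]])
    h = np.maximum(W1.T @ x, 0.0)          # ReLU hidden layers
    h = np.maximum(W2.T @ h, 0.0)
    return W3.T @ h                        # decoded appearance sample

# The renderer picks a coarser lod_level as the object recedes, so distant
# vegetation or hair costs far less memory than the full-resolution asset.
print(query_neural_lod(np.array([0.1, 0.5, 2.0]), lod_level=3.0))
```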


In addition to this technique, Intel is also introducing an efficient path-tracing algorithm that it says, in the future, will make complex path-tracing possible on mid-range GPUs and even integrated graphics.


Path tracing is essentially the hard way of doing ray tracing, and we’ve already seen it used to great effect in games like Cyberpunk 2077 and Portal RTX. For as impressive as path tracing is, though, it’s extremely demanding. You’d need a flagship GPU like the RTX 4080 or RTX 4090 to even run these games at higher resolutions, and that’s with Nvidia’s tricky DLSS Frame Generation enabled.
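For a sense of why it’s so heavy, here’s a minimal, runnable toy path tracer in Python: a single diffuse sphere lit by a flat sky. It’s a textbook-style sketch, not Intel’s algorithm or anything from a shipping renderer, but it shows the basic cost structure: many random paths per pixel, each with multiple bounces.

```python
# Toy path tracer: one diffuse sphere under a flat "sky". Purely illustrative,
# not physically rigorous and not Intel's method.
import numpy as np

rng = np.random.default_rng(1)
SAMPLES_PER_PIXEL = 64       # real renderers use anywhere from 1 (plus denoising) to thousands
MAX_BOUNCES = 4
SPHERE_CENTER = np.array([0.0, 0.0, -3.0])
SPHERE_RADIUS = 1.0
ALBEDO = 0.7

def intersect_sphere(origin, direction):
    """Return hit distance along the ray, or None if it misses the sphere."""
    oc = origin - SPHERE_CENTER
    b = np.dot(oc, direction)
    c = np.dot(oc, oc) - SPHERE_RADIUS ** 2
    disc = b * b - c
    if disc < 0:
        return None
    t = -b - np.sqrt(disc)
    return t if t > 1e-4 else None

def random_hemisphere(normal):
    """Uniform random bounce direction around the normal (toy, not importance-sampled)."""
    v = rng.standard_normal(3)
    v /= np.linalg.norm(v)
    return v if np.dot(v, normal) > 0 else -v

def trace_path(origin, direction, depth=0):
    if depth >= MAX_BOUNCES:
        return 0.0
    t = intersect_sphere(origin, direction)
    if t is None:
        return 1.0  # flat "sky" light when the path escapes the scene
    hit = origin + t * direction
    normal = (hit - SPHERE_CENTER) / SPHERE_RADIUS
    bounce = random_hemisphere(normal)
    return ALBEDO * trace_path(hit, bounce, depth + 1)

# One pixel's worth of work: dozens of multi-bounce paths, then an average.
# Multiply by millions of pixels and 60+ frames per second to see the problem.
samples = [trace_path(np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, -1.0]))
           for _ in range(SAMPLES_PER_PIXEL)]
print(sum(samples) / SAMPLES_PER_PIXEL)
```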

Intel’s paper introduces a way to make that process more efficient. It does so with a new algorithm that is “simpler than the state-of-the-art and leads to faster performance,” according to Intel. The company is building on GGX, a microfacet distribution that Intel says is “used in every CGI movie and video game.” The algorithm reduces this mathematical distribution to a hemispherical mirror that is “extremely simple to simulate on a computer.”

Screenshot of full ray tracing in Cyberpunk 2077. Image: Nvidia

The idea behind GGX is that surfaces are made up of microfacets that reflect and transmit light in different directions. This is expensive to calculate, so Intel’s algorithm essentially reduces the GGX distribution to a simple-to-calculate slope based on the angle of the camera, making real-time rendering possible.
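For reference, the standard GGX normal distribution function that all of this builds on is short enough to write out. This is the well-known published formula, not Intel’s new simplification, which the blog post only describes at a high level.

```python
# The standard GGX (Trowbridge-Reitz) normal distribution function.
# This is the widely used microfacet distribution the article mentions,
# not Intel's simplified algorithm.
import math

def ggx_ndf(cos_nh: float, roughness: float) -> float:
    """Density of microfacet normals at angle acos(cos_nh) from the surface normal.

    cos_nh: cosine between the macro surface normal n and the half-vector h.
    roughness: perceptual roughness in (0, 1]; alpha = roughness ** 2 is a common convention.
    """
    alpha = roughness ** 2
    a2 = alpha ** 2
    denom = (cos_nh ** 2) * (a2 - 1.0) + 1.0
    return a2 / (math.pi * denom ** 2)

# Rough surfaces spread reflections broadly; smooth surfaces focus them into a
# tight highlight -- which is part of what makes the distribution expensive to
# evaluate and sample well in a path tracer.
for roughness in (0.1, 0.5, 0.9):
    print(roughness, ggx_ndf(cos_nh=0.95, roughness=roughness))
```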

Based on Intel’s internal benchmarks, the new algorithm leads to upwards of a 7.5% speedup in rendering path-traced scenes. That may seem like a minor bump, but Intel seems confident that more efficient algorithms could make all the difference. In a blog post, the company says it will demonstrate at Siggraph how real-time path tracing can be “practical even on mid-range and integrated GPUs in the future.”

As for when that future arrives, it’s tough to say. Keep in mind this is a research paper right now, so it might be some time before we see this algorithm widely deployed in games. It would certainly do Intel some favors, though. Although the company’s Arc graphics cards have become excellent over the past several months, Intel is still focused on mid-range GPUs and integrated graphics, where path tracing isn’t currently practical.

We don’t expect you’ll see these techniques in action any time soon, though. The good news is that we’re seeing new techniques to push visual quality and performance in real-time rendering, which means these techniques should, eventually, show up in games.

Jacob Roach
Lead Reporter, PC Hardware