
Nvidia’s DLSS 4 isn’t what you think it is. Let’s debunk the myths


Nvidia stole the show at CES 2025 with the announcement of the RTX 5090, and despite plenty of discourse about the card’s $2,000 price tag, it ushers in a lot of new technology. Chief among the additions is DLSS 4, which brings multi-frame generation to Nvidia’s GPUs, promising up to a 4X performance boost in more than 75 games as soon as the new RTX 50-series GPUs hit the streets.

I’ve seen way too much misunderstanding about how DLSS 4 actually works, though. Between misleading comments from Nvidia’s CEO and a radical redesign to how DLSS works, it’s no wonder there’s been misinformation floating around about the new tech, what it’s capable of, and, critically, what limitations it has.

So, let’s set the record straight, at least as much as I can before Nvidia’s new graphics cards are here and we all experience what DLSS 4 has to offer first-hand.


No, it doesn’t ‘predict the future’

Nvidia CEO Jensen Huang.
Nvidia

One of the main issues about properly understanding how DLSS 4 works comes from a comment Nvidia CEO Jensen Huang made during a Q&A. Jarred Walton over at Tom’s Hardware asked Huang about how DLSS 4 works on a technical level, and Huang categorically denied that DLSS 4 uses frame interpolation. He said that DLSS 4 “predicts the future,” instead of “interpolating the past.” That’s a buzzy quote, for sure. Too bad it’s incorrect.

Huang has waxed poetic about DLSS Frame Generation in the past, and although this type of framing works for explaining a technology like DLSS 4 to a mainstream audience, it also leads to some misunderstandings about how it actually works. Following this quote, several readers reached out to tell me that I was misunderstanding how DLSS 4 works. As it turns out, I wasn’t, but I understand why there’s so much confusion.

DLSS 4’s multi-frame generation uses a technique called frame interpolation. This is the same technique that we saw in DLSS 3, and it’s the same technique you’ll find in other frame generation tools like Lossless Scaling and AMD’s FSR 3. Frame interpolation works like this: Your graphics card renders two frames, and then an algorithm steps in to calculate the difference between those frames. Then, it “generates” a frame to go in between, guessing what the interstitial frame would look like based on the difference between the two frames that were rendered.
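As a toy illustration of that idea (not Nvidia’s actual algorithm, which relies on motion vectors and an AI model), here’s a Python sketch that blends two rendered “frames” to produce the in-between frames, with one inserted frame standing in for DLSS 3 and three for DLSS 4:

```python
# Toy frame interpolation: given two rendered frames, estimate the
# frames that go in between. Real frame generation uses motion vectors
# and a neural network; a plain per-pixel blend is enough to show the
# "interstitial frame" concept.

def interpolate_frames(frame_a, frame_b, t):
    """Blend two frames; t=0.5 gives the midpoint frame."""
    return [(1 - t) * a + t * b for a, b in zip(frame_a, frame_b)]

def multi_frame_generation(frame_a, frame_b, n_generated):
    """Insert n_generated frames between two rendered frames,
    the way DLSS 4's multi-frame generation inserts up to three."""
    steps = [i / (n_generated + 1) for i in range(1, n_generated + 1)]
    return [interpolate_frames(frame_a, frame_b, t) for t in steps]

rendered_a = [0.0, 10.0, 20.0]   # toy "pixel" values of frame N
rendered_b = [4.0, 14.0, 24.0]   # toy "pixel" values of frame N+1

# One in-between frame (DLSS 3 style)
print(multi_frame_generation(rendered_a, rendered_b, 1))
# Three in-between frames (DLSS 4 style)
print(multi_frame_generation(rendered_a, rendered_b, 3))
```

The key point the sketch captures is that both rendered frames must already exist before any in-between frame can be produced.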

A chart showing motion through Nvidia's DLSS 3.
Nvidia

And DLSS 4 uses frame interpolation. There has been some early research on new techniques to generate frames — in particular, research from Intel about frame extrapolation — but it’s still early days for that technology. There are some details I can’t share quite yet, but I’ve confirmed with multiple sources now that DLSS 4 is, in fact, using frame interpolation. It makes sense, too. These types of rendering tools don’t just pop up out of nowhere, and there’s almost always a long lineage of research papers before any new rendering technique is turned into a marketable product like DLSS 4.

That doesn’t take away from what DLSS 4 is capable of. It might be using the same technique as DLSS 3 for creating new frames, but that shouldn’t distract you from what DLSS 4 can actually do.

Latency isn’t the issue you think it is

The latency widget in Special K.
Jacob Roach / Digital Trends

I understand why Nvidia doesn’t want to comment much on DLSS 4’s use of frame interpolation: frame interpolation introduces latency. You need to render two frames and then perform the interpolation before the first frame in the sequence is displayed, so when using any frame interpolation tool, you’re essentially playing on a slight delay. The assumption I’ve seen is that these extra frames linearly increase latency, which isn’t the case.

The Verge voiced concern, saying it wanted to “see how the new frame generation tech affects latency,” while TechSpot declared that “users are concerned that multi-frame rendering could compound the [latency] problem.” It’s a natural counter to the multiplied “fake” frames that DLSS 4 can spit out: if generating one frame causes a latency problem, surely generating three of them would cause a bigger one. But that’s not how it works.

This is why it’s so important to understand that DLSS 4 uses frame interpolation. The idea of playing on a delay isn’t any different between DLSS 3 generating one extra frame and DLSS 4 generating three — the process still involves rendering two frames and comparing the difference between them. Whether one, two, or three extra frames go in between the two that were rendered, the latency added by the frame interpolation process is largely the same.

Let me illustrate this. Let’s say you’re playing a game at 60 frames per second (fps). That means there’s 16.6 milliseconds between each frame you see. With DLSS 3, your frame rate would double to 120 fps, but your latency isn’t halved to 8.3ms. The game looks smoother, but there’s still 16.6ms between each rendered frame. With DLSS 4, you’ll be able to go up to 240 fps, quadrupling your frame rate, but once again, the latency doesn’t drop to 4.2ms. It’s still that same 16.6ms.

This is a very reductive look at PC latency — there’s overhead for DLSS Frame Generation to run, plus the latency added by your monitor and mouse — but it’s useful for understanding that the core latency doesn’t linearly increase when adding more frames to the frame interpolation process. The time between each rendered frame doesn’t change. The latency you experience is still largely the result of your base frame rate before DLSS Frame Generation and the overhead that the tool has.

DLSS 4 on Nvidia RTX 5080 First Look: Super Res + Multi Frame-Gen on Cyberpunk 2077 RT Overdrive!

You don’t have to just take my word for that. Digital Foundry has tested DLSS 4, including latency, and found exactly what I just described. “It seems to me that the majority of the extra latency still comes from buffering that extra frame, but adding further intermediate frames comes with a relatively minimal increase in latency,” wrote Digital Foundry’s Richard Leadbetter. The small amount of additional latency simply comes from DLSS calculating more frames in between the two that have been rendered, so the bulk of the latency increase with DLSS 4 isn’t much different from DLSS 3.

The latency issue with DLSS 4 is largely the same as it is with DLSS 3. If you’re playing at a low base frame rate, there’s a disconnect between the responsiveness you’re experiencing and the smoothness you’re seeing. That disconnect will be more significant with DLSS 4, but that doesn’t suddenly mean there’s a massive increase in latency as a result. That’s why Nvidia’s impressive new Reflex 2 isn’t required for DLSS 4; just like DLSS 3, developers only need to implement the first version of Reflex for DLSS 4 to work.

A completely new model

A showcase of how DLSS 4 works.
Nvidia

Clarifying how DLSS 4 works might lead you to believe it’s more of the same, but that’s not the case. DLSS 4 is a very significant departure from DLSS 3, and that’s because it uses a completely different AI model. Or, I should say, AI models. As Nvidia details, DLSS 4 runs five separate AI models for each rendered frame when using Super Resolution, Ray Reconstruction, and Multi-Frame Generation, all of which need to execute in a matter of milliseconds.

Because of what DLSS 4 entails, Nvidia did away with its previous Convolutional Neural Network, or CNN, and it’s now using a vision transformer model. There are two big changes with a transformer model. First is something called “self-attention.” The model can track the importance of different pixels over multiple frames. Being self-referential in this way should allow the new model to focus more on problematic areas, such as thin details with Super Resolution that may show shimmering.
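To give a rough sense of what self-attention means, here’s a minimal, illustrative Python sketch. DLSS 4’s actual model isn’t public, and a real vision transformer uses learned query/key/value projections, multiple heads, and scaling; this stripped-down version only shows the core mechanism of each element weighting every other element:

```python
# Minimal self-attention sketch. Each "token" (think: a pixel's
# feature vector) scores itself against every other token, turns the
# scores into weights with a softmax, and outputs a weighted sum.
# This is how attention lets important details influence the result.
import math

def softmax(xs):
    m = max(xs)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def self_attention(tokens):
    """tokens: list of equal-length feature vectors. For simplicity,
    the query, key, and value projections are the identity."""
    out = []
    for q in tokens:
        # score q against every key via dot product
        scores = [sum(qi * ki for qi, ki in zip(q, k)) for k in tokens]
        weights = softmax(scores)
        # output = attention-weighted sum of the value vectors
        dim = len(q)
        out.append([sum(w * v[d] for w, v in zip(weights, tokens))
                    for d in range(dim)])
    return out

print(self_attention([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]))
```

Because every token attends to every other, adding parameters scales the model up naturally — one reason transformer models scale better than the CNN approach, as described below.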

DLSS Ray Reconstruction with New Transformer Model | Alan Wake 2

Transformer models are also more scalable, allowing Nvidia to add far more parameters to DLSS than with the previous CNN approach. According to the company, the new transformer model has double the parameters, in fact.

DLSS Super Resolution with New Transformer Model | Horizon Forbidden West

As you can see in the videos above, Nvidia claims this new model has better stability and preservation of fine details compared to the previous CNN approach. These improvements aren’t exclusive to RTX 50-series GPUs, either. All RTX graphics cards will be able to leverage the new transformer model in DLSS 4 games, at least for the features that are supported by each generation.

I’ve seen DLSS 4 in action a couple of times, but the real test for the feature will be when Nvidia’s next-gen GPUs launch. Then, I’ll be able to evaluate how the feature works across several games and scenarios to see how it holds up. Regardless, there are a lot of changes with the feature, and according to what Nvidia has shared so far, those changes work to make DLSS even better.

Jacob Roach
Lead Reporter, PC Hardware
Jacob Roach is the lead reporter for PC hardware at Digital Trends. In addition to covering the latest PC components, from…