
GDC 2024 in review: Path tracing, upscaling, and CPU-killing tech

A banner with GDC on it outside a conference center.
Jacob Roach / Digital Trends
This story is part of Jacob Roach's ReSpec series, covering the world of PC gaming and hardware.

ReSpec is a bit different this week. I’ve spent the week in sunny San Francisco at the Game Developers Conference (GDC), running from meeting to meeting and trying to find a moment of time to write a few words.

Instead of a normal column, we decided to post a sampling of entries from the newly launched ReSpec newsletter covering what I saw at GDC this week. If you want this same newsletter delivered to your inbox each week, sign up now and get in on the exclusive content.


Path tracing is a lie

A screenshot from Alan Wake 2, showing a dark environment.
Jacob Roach / Digital Trends

Maybe “lie” is too strong a word, but path tracing is pretty tricky when it comes to its implementation in games. I sat in on sessions for path tracing in both Cyberpunk 2077 and Alan Wake 2 at GDC, and both described a common technique for getting path tracing to run in real time, at a playable frame rate: ReSTIR (Reservoir-based Spatiotemporal Importance Resampling) direct illumination.


First, how path tracing works: For each pixel, we trace a ray from the camera out into the scene. It collides with something and bounces, and it keeps bouncing around the scene until it either escapes into the ether or ends at a light source. Developers want the paths that end at a light source, particularly for calculating shadows.
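The bounce loop above can be sketched in a few lines of Python. This is a toy model, not a real renderer: `scene_hit` stands in for actual ray-scene intersection, and all the names here are hypothetical.

```python
import random

MAX_BOUNCES = 8  # real renderers also cap path length

def trace_path(scene_hit, rng):
    """Follow one ray until it escapes, runs too long, or lands on a light.

    scene_hit(rng) returns "light", "surface", or "miss" for each bounce,
    standing in for a real ray-scene intersection test.
    """
    for bounce in range(MAX_BOUNCES):
        hit = scene_hit(rng)
        if hit == "miss":
            return None          # ray left the scene: contributes nothing
        if hit == "light":
            return bounce + 1    # useful path: it ended at a light source
        # otherwise it hit a surface; pick a new direction and keep bouncing
    return None                  # path got too long; terminate it

def estimate_pixel(scene_hit, samples=1000, seed=0):
    """Average many random paths for one pixel -- the brute-force cost that
    kept path tracing offline: most paths never reach a light."""
    rng = random.Random(seed)
    useful = sum(1 for _ in range(samples)
                 if trace_path(scene_hit, rng) is not None)
    return useful / samples

def toy_scene(rng):
    """Fake scene: each bounce has a 10% chance of hitting a light,
    10% of escaping, and 80% of bouncing again."""
    r = rng.random()
    if r < 0.1:
        return "light"
    if r < 0.2:
        return "miss"
    return "surface"
```

Even in this toy, roughly half the traced paths are thrown away, which is why the naive approach needs so many samples per pixel.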

The problem in any sort of real-time context is that this process is extremely expensive. Calculating all of those rays and all of those bounces, even though only a small fraction of them end up being useful, hogs a ton of resources. That’s why path tracing was an offline technique for so long: you have to calculate a huge number of possible paths and average them.

That’s not the case for Alan Wake 2 and Cyberpunk 2077. For direct lights, ReSTIR works by weighting the light sources in a scene and only sampling a selection of them. Then, those samples are shared temporally (across frames) and spatially (with nearby pixels). In a game like Alan Wake 2, certain lights are weighted more heavily, such as the blue and red “cinematic” lights you see in a train station.
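The data structure at the heart of this is a weighted reservoir: instead of shading against every light, each pixel keeps one candidate light, chosen proportionally to its weight. Here's a minimal Python sketch of that idea; the light names and weights are made up for illustration, and the real algorithm does far more (temporal and spatial reuse, visibility checks).

```python
import random

class Reservoir:
    """Minimal weighted reservoir -- a sketch of the sampling core, not the
    production ReSTIR algorithm."""
    def __init__(self):
        self.sample = None
        self.w_sum = 0.0

    def update(self, candidate, weight, rng):
        self.w_sum += weight
        # Keep this candidate with probability weight / w_sum; over a full
        # stream this picks each candidate proportionally to its weight.
        if self.w_sum > 0 and rng.random() < weight / self.w_sum:
            self.sample = candidate

def pick_light(lights, rng):
    """Sample one light proportionally to its importance weight."""
    r = Reservoir()
    for name, weight in lights:
        r.update(name, weight, rng)
    return r.sample

# Hypothetical weights: "cinematic" lights weighted heavier, as in the
# Alan Wake 2 train station example.
lights = [("blue_cinematic", 5.0), ("red_cinematic", 5.0), ("ceiling", 1.0)]
rng = random.Random(42)
counts = {}
for _ in range(10_000):
    s = pick_light(lights, rng)
    counts[s] = counts.get(s, 0) + 1
```

Run enough pixels through this and the heavily weighted cinematic lights get sampled about five times as often as the ceiling light, which is exactly the point: the expensive work goes where it matters visually.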

The result is an image that comes together much faster, at least fast enough that you can play a game at a reasonable frame rate with a healthy dash of upscaling and frame generation.

It’s an interesting tidbit, and something that will hopefully become more common now that the developers of Alan Wake 2 and Cyberpunk 2077 are sharing their work.

Microsoft is putting its foot down on upscaling

Microsoft presenters on stage at a GDC session.
Jacob Roach / Digital Trends

At GDC, Microsoft finally talked more about DirectSR, and even managed to wrangle developers from AMD and Nvidia to sit on the same panel. Together, even! DirectSR isn’t a way to end the upscaling wars, as we originally thought, but it provides a unified framework for developers to add multiple upscaling features to their games.

A big part of that is input. When interfacing with DirectSR, there’s a standardized set of inputs that developers provide to the application programming interface (API). It can then pass those inputs along to upscalers that are built in, such as AMD’s FSR 2, or to variants that require specific hardware, such as Nvidia’s DLSS.
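The shape of the idea is easy to sketch: one standardized input set, many interchangeable backends. To be clear, this is not the real DirectSR API (which is C++/DirectX and wasn't public at the time); every name below is hypothetical, just to show the "write the inputs once" pattern.

```python
from dataclasses import dataclass

@dataclass
class UpscaleInputs:
    """The kinds of per-frame data upscalers typically need (illustrative)."""
    color: str            # low-resolution color buffer
    depth: str            # depth buffer
    motion_vectors: str   # per-pixel motion
    jitter: tuple         # camera jitter offset for this frame
    exposure: float

def upscale(inputs: UpscaleInputs, backend: str) -> str:
    """Hand the same standardized inputs to whichever upscaler is selected.
    The game code doesn't change when the backend does."""
    backends = {
        "fsr2": lambda i: f"FSR2({i.color})",   # built-in style variant
        "dlss": lambda i: f"DLSS({i.color})",   # needs Nvidia hardware
        "xess": lambda i: f"XeSS({i.color})",   # Intel's upscaler
    }
    return backends[backend](inputs)

frame = UpscaleInputs(color="frame0", depth="d0", motion_vectors="mv0",
                      jitter=(0.25, -0.25), exposure=1.0)
```

The payoff is in the last line of the article's point: the developer fills out one structure, and swapping `"fsr2"` for `"dlss"` is a one-word change rather than a second integration.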

It’s not dissimilar from Nvidia’s own Streamline framework, which was built to accomplish something similar before AMD decided not to play ball. It seems Microsoft, as the neutral third party in this battle, was the one that could bring everyone together.

I’m still not sure how this will actually look in games. DirectSR isn’t even available to developers yet. It’s possible nothing changes for end users, and we still see multiple upscaling options in graphics menus. Maybe Microsoft will update Windows to include a universal upscaling option depending on the hardware you have. It’s not clear, but DirectSR should still make it easier for developers to implement every flavor of upscaling in their games across DLSS, FSR, and Intel’s XeSS.

One upside that didn’t come through originally was how this system works with updates. Nvidia, AMD, and Intel constantly release new versions of their upscaling tech that make minor improvements to image quality or slightly tweak how the upscaling works. With DirectSR, developers won’t need to add all of these updates to their games — they’ll just work across the API.

This is all thumbs-up for me. Upscaling has been a major point of contention, especially for big releases like Starfield and Resident Evil 4 that only supported one upscaler at launch. The only downside is frame generation. That doesn’t appear to be in the cards for DirectSR right now, so there’s still going to be plenty of back and forth among the major graphics brands.

Death to your CPU? Not exactly

Intel's 14900K CPU socketed in a motherboard.
Jacob Roach / Digital Trends

One of the most exciting announcements out of GDC this year was Work Graphs. I talked about them last week in the newsletter, but I got a closer look at Work Graphs during Microsoft’s DirectX State of the Union. The idea behind them is to reduce the strain on your CPU by letting your GPU direct its own work.

There’s a little more nuance to the conversation. This gives your GPU more power to decide what to do, similar to when programmable shaders were first introduced to graphics cards. A Work Graph is composed of nodes, and those nodes can spawn further nodes for your GPU to work on rather than waiting for work from the CPU. Microsoft described it as a compute shader that can launch another compute shader.
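That "nodes spawning nodes" loop is easy to model with a work queue. This toy Python sketch is not HLSL or the D3D12 Work Graphs API; it just shows the shape of the idea, with the queue standing in for the GPU's scheduler and no CPU in the loop.

```python
from collections import deque

def run_work_graph(root):
    """Drain a self-expanding work queue.

    root is a (work_fn, payload) node; work_fn(payload) does the node's work
    and returns a list of child nodes to enqueue. The 'GPU' keeps running
    work it generated itself, with no round trip back to the 'CPU'.
    """
    queue = deque([root])
    completed = 0
    while queue:
        work_fn, payload = queue.popleft()
        children = work_fn(payload)   # a node's work can emit more nodes
        queue.extend(children)
        completed += 1
    return completed

def split(n):
    """Example node: split a workload until the pieces are small enough --
    like a compute shader launching more compute shaders."""
    if n <= 1:
        return []                     # leaf node: nothing more to spawn
    half = n // 2
    return [(split, half), (split, n - half)]
```

Kicking this off with `run_work_graph((split, 8))` executes 15 nodes, 14 of which were spawned on the "GPU" side rather than submitted one by one from the CPU. In the real feature, that's what removes the global sync points Microsoft described.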

The obvious advantage, and one PC gamers picked up on quickly, was GPU utilization. Microsoft explained that the current system requires a global sync point between the GPU and CPU. That often means that the GPU, being a highly parallel device, is without work for brief amounts of time while waiting for a sync to happen.

What I didn’t expect was how it would impact memory. With current DirectX 12 programming, Microsoft explained, you would use the ExecuteIndirect command, which requires you to hold several buffers. With Work Graphs, you don’t need to hold those buffers, as the GPU can start its own work and keep generating work for itself.

Robert Martin from AMD demonstrated how big of a deal this is with a scene that required 3.3GB of memory. With Work Graphs, the memory usage was just 113MB, and it came with a small performance boost. As the presentation described, “memory footprint scales with the size of the GPU, not the size of a workload.”

Work Graphs are invisible to end users, but it really is the next frontier for graphics programming, and as AMD, Nvidia, and Microsoft all said, it’s something that developers have been working toward for years. Less memory usage and better performance sounds good to me. We’ll just have to wait until Work Graphs make a splash across real games.

Jacob Roach
Lead Reporter, PC Hardware
Jacob Roach is the lead reporter for PC hardware at Digital Trends. In addition to covering the latest PC components, from…