
After a decade, Nvidia is fixing the worst part of G-Sync

Alan Wake 2 on the Alienware 27 QD-OLED gaming monitor.
Jacob Roach / Digital Trends

It only took 11 years. Nvidia is finally doing away with its proprietary G-Sync module that’s been the bane of the best gaming monitors for several years. Although Nvidia eventually started offering G-Sync branding with any variable refresh rate (VRR) tech, display brands have still needed to shell out for a dedicated G-Sync module for proper certification — and pass along that cost to customers. Going forward, the G-Sync module is no more.

Nvidia’s G-Sync tech isn’t going anywhere, however. The company announced that it has partnered with MediaTek to add every G-Sync feature to MediaTek scalers. If you’re unfamiliar, a scaler is a small chip inside your monitor that handles resolution scaling, along with a bunch of other processing including the on-screen display (OSD), color adjustments, HDR, and input selection. The G-Sync module itself was a scaler. Since you’ll rarely find a gaming monitor without a scaler of its own, those features are being rolled into the chip already built into the display.

Nvidia hasn’t said which MediaTek chips it’s adding G-Sync features to, nor whether there are any restrictions on using them. Hopefully, the change means that you’ll get the full suite of G-Sync features without even thinking about it. MediaTek’s chips are by far the most popular on the market, which should mean you get the full G-Sync range in most gaming displays. The recent Alienware 32 QD-OLED, for example, uses the MediaTek MT9810, which is what enables its Dolby Vision HDR support.

G-Sync features coming to MediaTek scalers.
Nvidia

Although we don’t know which scalers will come with G-Sync features, Nvidia confirmed that the full range of G-Sync features is arriving. Here’s the list:

  • Variable refresh rate
  • Variable overdrive
  • 12-bit color
  • Ultra Low Motion Blur (ULMB)
  • Low latency HDR
  • Nvidia Reflex analyzer
  • Nvidia Pulsar

The Nvidia Reflex analyzer is a big inclusion, and it’s one of the main features that has separated G-Sync displays from open standards like FreeSync and VESA’s Adaptive-Sync. It shows up in monitors like Alienware’s 500Hz gaming display, and it lets you see a slew of metrics concerning input lag.

Pulsar is also a big inclusion. It targets motion clarity: it doesn’t boost the monitor’s refresh rate, but Nvidia says it delivers effective motion clarity of over 1,000Hz by synchronizing the monitor’s backlight strobing with the variable refresh rate. At Gamescom, Nvidia revealed the first three monitors launching with Pulsar, which you can see below.

The first monitors with Nvidia Pulsar.
Nvidia

The death of the G-Sync module is a huge boost for gaming monitors as a whole. We’ll no longer need to wait for separate G-Sync and FreeSync versions of displays, as we did with the Alienware 34 QD-OLED, and you won’t need to dig through the spec sheet of each display you’re interested in to see if it supports the G-Sync features you want. You’ll still need an Nvidia GPU to use G-Sync features, but this change should make the display side of things much easier to digest.

Jacob Roach
Lead Reporter, PC Hardware