
What is G-Sync?

When shopping for a gaming monitor, you’ll undoubtedly come across a few displays advertising Nvidia’s G-Sync technology. In addition to a hefty price premium, these monitors usually come with gaming-focused features like fast response times and high refresh rates. To help you know where your money is going, we put together a guide to answer the question: What is G-Sync?

In short, G-Sync is a hardware-based adaptive refresh technology that helps prevent screen tearing and stuttering. With a G-Sync monitor, you’ll notice smoother motion while gaming, even at high refresh rates.


What is G-Sync?


G-Sync is Nvidia’s hardware-based monitor syncing technology. It mainly solves screen tearing by synchronizing your monitor’s refresh rate with the number of frames your GPU pushes out each second.


Your GPU renders a number of frames each second, and put together, those frames give the impression of smooth motion. Similarly, your monitor refreshes a certain number of times each second, clearing the previous image for the new frames your GPU is rendering. To keep things moving smoothly, your GPU stores upcoming frames in a buffer. The problem is that the buffer and your monitor’s refresh rate can fall out of sync, leaving the monitor to draw parts of two different frames at once, with a nasty tear line where they meet.
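To make that concrete, here’s a minimal sketch in Python using made-up refresh and frame timings (not anything from a real driver): it simply checks whether a new frame lands in the buffer while the monitor is partway through drawing, which is exactly when a tear becomes visible.

```python
# Rough illustration of how tearing happens (made-up numbers, not real driver code).
REFRESH_HZ = 60   # the monitor redraws 60 times per second
FPS = 75          # the GPU finishes 75 frames per second, out of step with the monitor

refresh_interval = 1000 / REFRESH_HZ   # ~16.7 ms per refresh
frame_interval = 1000 / FPS            # ~13.3 ms per rendered frame

# Times (in ms) at which the GPU drops a finished frame into the buffer.
frame_times = [i * frame_interval for i in range(1, 16)]

for r in range(8):
    start = r * refresh_interval        # monitor starts scanning out the image here
    end = start + refresh_interval      # and finishes here
    # If a new frame lands in the buffer mid-scanout, the top of the screen shows
    # the old frame and the bottom shows the new one -- a visible tear line.
    mid_scanout = [t for t in frame_times if start < t < end]
    if mid_scanout:
        print(f"refresh {r}: frame arrived {mid_scanout[0] - start:4.1f} ms into scanout -> tear")
    else:
        print(f"refresh {r}: no buffer swap mid-scanout -> clean frame")
```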

V-Sync emerged as a solution. This software-based feature essentially forces your GPU to hold frames in its buffer until your monitor is ready to refresh. That solves the screen tearing problem, but it introduces another: input lag. V-Sync forces your GPU to hold frames it has already rendered, which causes a slight delay between what’s happening in the game and what you see on screen.
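As a rough back-of-the-envelope, the extra delay scales with the refresh interval, since a finished frame sits in the buffer until the next refresh begins. The snippet below is a simple illustration under that assumption, not a measurement of any real game.

```python
# Back-of-the-envelope estimate of the latency V-Sync can add (illustrative only).
def vsync_wait_ms(refresh_hz: float) -> tuple[float, float]:
    """Return (average, worst case) time a finished frame waits for the next refresh."""
    refresh_interval = 1000 / refresh_hz
    # A frame can finish at any point in the refresh cycle: on average it sits in the
    # buffer for about half an interval, and in the worst case nearly a full one.
    return refresh_interval / 2, refresh_interval

for hz in (60, 144, 240):
    avg, worst = vsync_wait_ms(hz)
    print(f"{hz} Hz display: ~{avg:.1f} ms average, up to {worst:.1f} ms of extra delay per frame")
```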

Nvidia’s first alternative to V-Sync was Adaptive VSync. Like the older technology, Nvidia’s driver-based solution locked the frame rate to the display’s refresh rate to prevent screen tearing. However, when the GPU struggled, Adaptive VSync unlocked the frame rate until the GPU’s performance improved. Once stable, Adaptive VSync locked the frame rate until the GPU’s performance dropped again.
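The driver-side logic boils down to a simple rule. Here’s a simplified sketch of that behavior (our own illustration of the description above, not Nvidia’s actual implementation):

```python
# Simplified sketch of the Adaptive VSync idea described above (not Nvidia's driver code).
REFRESH_HZ = 60

def vsync_should_be_on(current_fps: float) -> bool:
    # While the GPU keeps up with the display, cap output at the refresh rate to stop tearing.
    # When it falls behind, unlock the frame rate so V-Sync doesn't add stutter on top.
    return current_fps >= REFRESH_HZ

for fps in (90, 64, 59, 45, 61):
    state = "V-Sync on (locked to 60 Hz)" if vsync_should_be_on(fps) else "V-Sync off (unlocked)"
    print(f"GPU output {fps} fps -> {state}")
```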

Nvidia introduced a hardware-based solution in 2013 called G-Sync. Like VESA’s Adaptive-Sync standard, it enables variable refresh rates on the display side: Instead of forcing your GPU to hold frames, G-Sync makes your monitor adapt its refresh rate to the frames your GPU is rendering. That deals with both screen tearing and the input lag V-Sync introduces.

However, Nvidia uses a proprietary board that replaces the typical scaler board, which controls everything within the display like decoding image input, controlling the backlight, and so on. A G-Sync board contains 768MB of DDR3 memory to store the previous frame so that it can be compared to the next incoming frame. It does this to decrease input lag.

On the PC end, Nvidia’s driver can fully control the display’s proprietary board. It manipulates the vertical blanking interval, or VBI, which represents the interval between the time when a monitor finishes drawing the current frame and the beginning of the next frame.

With G-Sync active, the monitor becomes a slave to your PC. As the GPU rotates the rendered frame into the primary buffer, the display clears the old image and gets ready to receive the next frame. As the frame rate speeds up and slows down, the display renders each frame accordingly as instructed by your PC. Since the G-Sync board supports variable refresh rates, images are often redrawn at widely varying intervals.
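The upshot is that the refresh clock follows the render clock rather than the other way around. Here’s a toy model of that relationship, assuming a hypothetical panel with a 30Hz to 144Hz refresh window and invented frame times:

```python
# Toy model of a variable-refresh panel following the GPU (invented numbers, hypothetical
# 30-144 Hz refresh window; real G-Sync hardware handles the low end by redrawing frames).
MIN_HZ, MAX_HZ = 30, 144
min_interval = 1000 / MAX_HZ   # ~6.9 ms: the fastest the panel can redraw
max_interval = 1000 / MIN_HZ   # ~33.3 ms: the longest it can hold a frame

frame_render_times_ms = [7.1, 9.5, 14.2, 22.0, 40.0]   # made-up per-frame render times

for t in frame_render_times_ms:
    # The panel refreshes when the new frame is ready, clamped to its supported range.
    interval = min(max(t, min_interval), max_interval)
    print(f"frame took {t:5.1f} ms -> panel refreshes after {interval:5.1f} ms (~{1000 / interval:.0f} Hz)")
```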

G-Sync system requirements

LG’s 27-inch UltraGear 4K UHD Nano IPS 1ms 144Hz G-Sync Compatible gaming monitor

For years, there was one big caveat with G-Sync monitors: You needed an Nvidia graphics card. You still need an Nvidia GPU, like the recent RTX 3080, to take full advantage of G-Sync, but more recent G-Sync displays also support HDMI variable refresh rate under the “G-Sync Compatible” banner (more on that in the next section). That means you can use variable refresh rate with an AMD card, though not Nvidia’s full G-Sync module. Beyond a display carrying one of the G-Sync banners, here’s what you need (there’s also a quick way to check your GPU and driver version after the lists below):

Desktops

  • GPU – GeForce GTX 650 Ti BOOST or newer
  • Driver – R340.52 or higher

Laptops connected to G-Sync monitors

  • GPU – GeForce GTX 980M, GTX 970M, or GTX 965M GPU or newer
  • Driver – R340.52 or higher

Laptops with built-in G-Sync displays

  • GPU – GeForce GTX 980M, GTX 970M, or GTX 965M or newer
  • Driver – R352.06 or higher
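
If you’re not sure which GPU or driver version you have, one quick way to check on a system with Nvidia’s driver installed is the nvidia-smi tool. The small sketch below simply shells out to it and prints the result (it assumes nvidia-smi is on your PATH):

```python
# Quick way to check your GPU model and driver version with Nvidia's nvidia-smi tool.
# Assumes the Nvidia driver is installed and nvidia-smi is on your PATH.
import subprocess

result = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,driver_version", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
)
for line in result.stdout.strip().splitlines():
    name, driver = (field.strip() for field in line.split(","))
    print(f"GPU: {name}  |  driver: {driver}")
```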

G-Sync vs. G-Sync Compatible vs. G-Sync Ultimate

Because G-Sync is a hardware solution, certified monitors must include Nvidia’s proprietary board. Fortunately, most major monitor manufacturers like Asus, Philips, BenQ, AOC, Samsung, and LG offer G-Sync displays.

Nvidia currently lists three monitor classes: G-Sync Ultimate, G-Sync, and G-Sync Compatible. Here’s a breakdown of each:

G-Sync Compatible

  • 24 to 88 inches
  • Validated for no artifacts

G-Sync

  • 24 to 38 inches
  • Validated for no artifacts
  • Certified with 300+ image quality tests

G-Sync Ultimate

  • 27 to 65 inches
  • Validated for no artifacts
  • Certified with 300+ image quality tests
  • Best-in-class HDR
  • 1,000 nits peak brightness

For G-Sync Ultimate displays, you’ll need a hefty GeForce GPU to handle HDR visuals at 4K. They’re certainly not cheap, but they provide the best experience.

As for G-Sync Compatible, it’s a newer category. These displays do not include Nvidia’s proprietary G-Sync board, but they do support variable refresh rates. These panels typically fall under AMD’s FreeSync umbrella, which is a competing technology for Radeon-branded GPUs that doesn’t rely on a proprietary scaler board. Nvidia tests these displays to guarantee “no artifacts” when connected to GeForce-branded GPUs. Consider these displays as affordable alternatives to G-Sync and G-Sync Ultimate displays.

Overall, resolutions range from Full HD to 4K, while maximum refresh rates range from 60Hz to 240Hz. Nvidia provides a full list of compatible monitors on its website. Prices range from about $100 to well over $1,000; Asus’ ROG Swift PG279Q 27-inch monitor, for example, sells for $698.

G-Sync TVs


Since G-Sync launched in 2013, it has been limited to monitors. However, Nvidia is expanding: It has partnered with LG to certify recent LG OLED TVs as G-Sync Compatible. You’ll need up-to-date drivers and TV firmware to get started, which Nvidia outlines on its site. Here are the currently available TVs that support G-Sync:

  • LG BX 2020 (55- and 65-inch)
  • LG CX 2020 (55-, 65-, and 77-inch)
  • LG GX 2020 (55-, 65-, and 77-inch)
  • LG B9 2019 (55- and 65-inch)
  • LG C9 2019 (55-, 65-, and 77-inch)
  • LG E9 2019 (55- and 65-inch)

FreeSync: the G-Sync alternative


As we pointed out earlier, AMD’s FreeSync is built on VESA’s Adaptive-Sync standard. One of the main differences from G-Sync is that it doesn’t use proprietary hardware. Rather, FreeSync-certified displays use off-the-shelf scaler boards, which lowers the cost. The only AMD hardware you need for FreeSync is a Radeon-branded GPU. AMD introduced FreeSync support in 2015.

FreeSync supports a wider range of monitors, and you don’t need extra hardware inside the display, making it a budget-friendly alternative to G-Sync. Asus’ MG279Q, for example, sells for around $100 less than the aforementioned ROG Swift monitor.

No matter which you choose, both technologies eliminate the graphical glitches caused by monitor and GPU synchronization issues, and there are numerous graphics cards and monitors available to up your gaming experience.

Some downsides

One downside is the price. Whether you’re looking at a laptop or desktop, G-Sync requires both a capable monitor and graphics card. While there are many G-Sync-capable graphics cards, giving you plenty of budgetary options, G-Sync monitors are almost always more expensive than their AMD FreeSync counterparts. Compatible laptops may be even more expensive.

In addition, users point to a lack of compatibility with Nvidia’s Optimus technology. Optimus, implemented in many laptops, switches between integrated and discrete graphics on the fly to balance performance and battery life. Because the integrated graphics drive the display, frames move to the screen at a set interval rather than as they are created, as with G-Sync. You can buy an Optimus-capable laptop or a G-Sync-capable laptop, but no laptop exists that can do both.
