
Is G-Sync Worth It in 2022?

NVIDIA’s G-Sync technology can help improve your in-game experience, but is it worth the extra cost? In this post, we go over what G-Sync is and whether or not it would be worth it for you.

Look at the best gaming monitors today and you’ll notice that many of them offer ‘G-Sync’ capability. In a market in which many of these monitors compete very closely on all performance metrics – response time, refresh rate, contrast ratio, and so on – G-Sync capability can often become a deciding factor.

G-Sync is NVIDIA’s take on variable refresh rate (VRR) technology, designed to eliminate screen tearing without causing a performance hit. It isn’t the only VRR technology, but over the years it’s become recognisable as somewhat of a gold standard: if you own a G-Sync monitor, you can be pretty sure that screen tearing won’t occur.

This doesn’t tell us whether G-Sync is worth it, however. G-Sync monitors are often more expensive than their FreeSync or ‘G-Sync Compatible’ alternatives because they have a proprietary hardware module built into the screen that helps your GPU control the monitor’s refresh rate. Whether G-Sync is worth it depends on how well it performs compared to its often-cheaper alternatives.

What is VRR?

If you’ve ever gamed on a non-G-Sync, non-FreeSync monitor whilst also having VSync disabled in-game, you’ve probably noticed the occasional ‘screen tear’, where the displayed picture appears to be horizontally torn in two, with the top portion being slightly offset from the bottom portion.

This occurs because your graphics card churns out frames at a rate that often doesn’t sync up with when your screen displays them (the monitor’s ‘refresh rate’). When your framerate and refresh rate are out of sync, your monitor might receive a new frame mid-way through drawing the current one to the screen, and switch to drawing the new frame from that point. The result is two frames briefly being displayed next to one another, which makes it look like the screen is torn.

VRR is a technology that attempts to prevent such screen tearing. Since screen tearing is caused by your monitor’s refresh rate and your GPU’s framerate being out of sync, it can be fixed either by synchronising your framerate to your refresh rate, or vice versa.

VRR technology takes the second approach: it constantly adjusts your monitor’s refresh rate to remain synchronised with your framerate, which in practice causes less input delay than doing it the other way around. Because each refresh then displays exactly one complete frame, screen tearing is eliminated.
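To make the mechanism concrete, here is a toy simulation (illustrative numbers only, not from the article): a GPU delivering a steady 75 fps to a fixed 60Hz panel produces a tear in nearly every refresh, because a new frame keeps arriving mid-scanout, whereas a VRR display simply starts each scanout when a frame is ready.

```python
# Toy model of screen tearing: fixed refresh rate vs VRR.
# All numbers are illustrative, not measurements.

REFRESH_HZ = 60
SCANOUT_MS = 1000 / REFRESH_HZ          # time to draw one refresh, top to bottom

def frame_times(fps, n):
    """Completion timestamps (in ms) for n frames rendered at a steady fps."""
    return [i * 1000 / fps for i in range(1, n + 1)]

def tears_fixed_refresh(frames):
    """Count scanouts during which a new frame arrives mid-draw (a visible tear)."""
    tears = 0
    t = 0.0
    while t < frames[-1]:
        start, end = t, t + SCANOUT_MS
        if any(start < f < end for f in frames):
            tears += 1
        t = end
    return tears

def tears_vrr(frames):
    """With VRR, the display waits and starts each scanout only when a frame is
    complete, so no frame ever lands mid-scanout: zero tears by construction."""
    return 0

frames = frame_times(fps=75, n=75)      # one second of 75 fps on a 60Hz panel
print("fixed refresh tears:", tears_fixed_refresh(frames))
print("VRR tears:", tears_vrr(frames))
```

The model is deliberately simplified (it ignores buffering and assumes perfectly even frame times), but it captures the core point: tearing is a timing mismatch, and VRR removes the mismatch rather than hiding it.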

What does G-Sync do?

G-Sync is NVIDIA’s proprietary VRR technology, which uses an NVIDIA hardware scaler module inside the monitor to help control its refresh rate. When paired with NVIDIA G-Sync software, a compatible GPU will use the monitor’s scaler module to adjust its refresh rate on the fly so that it stays synchronised with the framerate being outputted by the GPU. In other words, G-Sync ensures the monitor’s refresh rate is constantly adjusted to remain in sync with a game’s framerate.

While you don’t need one of the very best graphics cards to use G-Sync, you will need an NVIDIA GPU of 600-series or later. If you have an older NVIDIA card, or an AMD one, you won’t be able to use full G-Sync technology. While it’s true that the latest G-Sync monitors are now compatible with AMD cards, this just means that these new monitors now support Adaptive Sync – another kind of VRR tech that uses an open standard – which can be used by AMD cards. If you want to use full G-Sync via the hardware module, you still need an NVIDIA card with G-Sync capability.

When looking for a G-Sync monitor, you might notice that some are listed as ‘G-Sync Compatible’ and others as ‘G-Sync Ultimate’. A G-Sync Compatible monitor is one that doesn’t use a G-Sync hardware scaler, but which has been verified by NVIDIA to work well with G-Sync enabled by using the open Adaptive Sync VRR standard. A G-Sync Ultimate monitor, on the other hand, uses full G-Sync via the NVIDIA hardware scaler, but also has some extra capabilities such as HDR support. NVIDIA provides a handy list of all G-Sync, G-Sync Ultimate, and G-Sync Compatible monitors.

How to enable G-Sync

Enabling G-Sync is simple, provided you have a G-Sync capable graphics card and a G-Sync monitor. NVIDIA even provides its own guide on how to do so. In your NVIDIA control panel, click ‘Set up G-Sync’, then tick the ‘Enable G-Sync, G-Sync Compatible’ box, and select the monitor(s) on which you want to enable G-Sync. Hit ‘apply’, restart your PC, and G-Sync should now be enabled. To test whether it’s working, you can use NVIDIA’s G-Sync Pendulum Demo with the ‘test pattern’ enabled – if there are no screen tears or artifacts, you’re good to go.

Is G-Sync worth it?

When deciding whether G-Sync is worth it, there are two things to consider. First, is VRR technology worth it? Second, is G-Sync itself worth it when compared to other VRR technologies such as FreeSync?

Is VRR worth it?

The answer to the first question is a resounding ‘yes’. VRR technology makes for a better gaming experience, full-stop. Some people don’t mind screen tearing, but even if it doesn’t bother you much, a tear-free experience is still better. Back when VSync was the only option this was more debatable, because enabling VSync often introduced input delay. Modern VRR technology, however, gives no noticeable performance hit and causes no noticeable input delay. And now that VRR tech is standard, even cheap gaming monitors support it.

G-Sync vs alternatives

The answer to the second question is a more qualified ‘yes’. G-Sync is often worth it, but this varies on a case-by-case basis. Because G-Sync monitors use NVIDIA’s proprietary hardware scaler, they often cost more than their non-G-Sync competition. The price differential isn’t as great as it used to be, but it still exists.

The main alternative is a FreeSync or G-Sync Compatible monitor. Both use the open (VESA standard) Adaptive Sync technology instead of NVIDIA’s scaler module, and FreeSync monitors are designed for AMD cards while G-Sync Compatible monitors are designed for NVIDIA ones. Because these technologies utilise the free and open Adaptive Sync standard, they cost less to implement and do the same thing as G-Sync, albeit in a slightly different way.

A FreeSync monitor is one that’s been certified by AMD to have good Adaptive Sync performance, and a G-Sync Compatible monitor is one that’s been certified by NVIDIA to have good Adaptive Sync performance. Broadly speaking, both technologies offer comparable performance to full NVIDIA G-Sync technology, so opting for one of these over a G-Sync monitor might therefore be preferable on the price front.

However, G-Sync monitors have a long and venerable history of offering a consistently good experience. For instance, all G-Sync monitors support VRR even at very low refresh rates, whereas some FreeSync or G-Sync Compatible monitors don’t. This is why deciding between a G-Sync monitor and a FreeSync or G-Sync Compatible one should be evaluated on a case-by-case basis. A FreeSync monitor that offers ultra-low and ultra-high VRR framerate sync capability might be just as expensive as a G-Sync monitor. While this is becoming less and less the case as Adaptive Sync monitors keep improving, it’s still something to bear in mind when comparing individual monitors.

For most use cases, you can’t go wrong with a G-Sync, FreeSync, or G-Sync Compatible monitor, so you should opt for whichever one offers the best bang for your buck considering other things like maximum refresh rate, response time, and contrast ratio. But if you want the best possible VRR experience, you should either opt for a G-Sync monitor or otherwise make sure that the specific G-Sync Compatible or FreeSync monitor that you’re considering can match G-Sync’s quality standards.

Jacob Fox

Jacob's been tinkering with computer hardware for over a decade, and he's written hardware articles for various PC gaming websites. Outside of the wonderful world of PC hardware, he's currently undertaking a PhD in philosophy, with a focus on topics surrounding the meaning of life.

4 thoughts on “Is G-Sync Worth It in 2022?”

  1. G-Sync is still crucial for emulators running at weird (for today’s standards) refresh rates, e.g. for MAME, SNES emulators, WinUAE etc. G-Sync grants perfect smoothness even for these emulated machines running at 55.7Hz, 49.97Hz and so on, which cause visible jittering if using simple V-Sync or a fixed framerate cap.

  2. @Temptor
    With Gsync, the input lag is 1 ms more, which can be considered an infelicity (1000 ms in a second).

    For everyone who plays competitive shooters and has a computer capable of delivering stable 120 fps and above, I recommend enabling this technology.
    The Gsync technology itself is amazing.

  3. You left out a huge point. Gsync causes input lag. If you’re sensitive to small changes on your monitor you will definitely notice it. If you’re playing competitive FPS shooters you may just want to deal with the screen tear, because your aim will suffer with gsync. Also, gsync only works within a certain frame rate, just like Freesync. If you go above your refresh rate, gsync turns off. You will need to cap your frames 3 below your refresh rate to stop this from happening. I just bought a gsync monitor and I don’t like it – what a waste of money. It’s not as good as you think if you play fast competitive shooters.

  4. Thank you for this article. It provided me the information that I was searching for when wondering what Gsync/FreeSync was, in addition it’s been updated to reflect Nvidia’s new position and support. Thank you!

