
G-Sync vs FreeSync: Which Adaptive Sync Tech is Better?

If you’re in the market for a new high-end gaming monitor but you’re unsure whether you should get a G-Sync or FreeSync display, this article highlights the differences between the two adaptive sync technologies to help you make the right decision.

Back when VSync was the only solution to screen tearing, PC gamers often had to decide between atrocious input lag and obnoxious screen tearing. But, for the past few years, many of the best gaming monitors have offered VSync alternatives: NVIDIA G-Sync and AMD FreeSync.

There’s a lot of jargon involved in the G-Sync vs FreeSync debate: Do you want a G-Sync ‘ultimate’ monitor? A G-Sync ‘compatible’ one? Should you go for FreeSync 1 or FreeSync 2? The list goes on, but don’t worry, the differences between many of these marketing terms are pretty simple.

What’s more, nowadays you don’t have to pay quite so much attention to whether you have an NVIDIA or AMD graphics card, because many of the best graphics cards – whether from NVIDIA or AMD – should be compatible with the adaptive sync tech of many of the very latest G-Sync or FreeSync monitors. You should always check compatibility in advance, though.

What is Screen Tearing?

When you game, your graphics card renders a certain number of frames each second (your ‘framerate’) which it sends to your screen to display. Your monitor displays the frames it receives at a set number per second, and the rate at which it does this is called the monitor’s ‘refresh rate’ and is measured in Hz. A 60Hz monitor, for example, displays 60 of the frames that it’s received from the GPU per second.

Screen Tearing Example

Problems can occur, however, when the monitor’s refresh rate and your game’s rendered framerate don’t align, because without synchronised framerate and display output a monitor might sometimes display multiple frames on the screen at once. When a monitor displays two or more frames at once, this makes it look like the screen is torn – with the bottom half of the screen displaying one frame and the top half displaying another, for example. This effect is what is meant by ‘screen tearing’.
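To make the mismatch concrete, here is a small, purely illustrative Python sketch (the 60Hz and 100fps figures are assumptions for the example, not measurements). It counts how often a single scanout window overlaps more than one rendered frame – the condition that produces a visible tear:

```python
# Illustrative sketch: why mismatched rates cause tearing.
# A 60Hz monitor refreshes every ~16.7ms; a GPU rendering at
# 100fps swaps frames every 10ms. When a frame swap lands in the
# middle of a scanout, parts of two frames appear at once.

REFRESH_HZ = 60
FPS = 100

refresh_interval = 1000 / REFRESH_HZ   # ms per monitor refresh
frame_interval = 1000 / FPS            # ms per rendered frame

def frames_visible_during_refresh(start_ms):
    """Which rendered frames fall inside one scanout window?"""
    end_ms = start_ms + refresh_interval
    first = int(start_ms // frame_interval)
    last = int(end_ms // frame_interval)
    return list(range(first, last + 1))

tears = 0
for i in range(60):  # simulate one second of refreshes
    if len(frames_visible_during_refresh(i * refresh_interval)) > 1:
        tears += 1  # this refresh showed parts of multiple frames

print(f"{tears} of 60 refreshes overlapped more than one frame")
```

Because each 16.7ms scanout window is longer than the 10ms frame interval in this example, every single refresh straddles a frame swap – which is why unsynchronised high framerates tear so persistently.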

What is G-Sync?

NVIDIA G-Sync is an adaptive sync technology that prevents screen tearing while gaming. The traditional way to stop screen tearing used to be to enable ‘VSync’ (vertical sync) in games. VSync works by limiting the number of frames sent by the GPU to the monitor each second so that this matches the screen’s refresh rate. For example, if your GPU is outputting 100 fps (frames per second) but your monitor’s refresh rate is 60Hz, VSync would limit your framerate to 60fps so that it’s synchronised with the monitor’s refresh rate. This synchronisation prevents screen tearing.

However, while VSync prevents screen tearing, it also introduces input lag, because your inputs only appear on the screen after the GPU has waited for the monitor to refresh. Furthermore, having VSync enabled locks your framerate to a single refresh rate, which can cause visual stutters if your framerate drops below this point.
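The stutter problem can be sketched in a few lines of Python. This is a deliberate simplification of classic double-buffered VSync (the 60Hz figure and the halving behaviour are illustrative assumptions, and real implementations vary):

```python
# Illustrative sketch of VSync's trade-off: output is capped at the
# refresh rate, and with classic double buffering, missing a refresh
# means waiting a whole extra cycle -- halving the displayed rate.

REFRESH_HZ = 60

def vsync_displayed_fps(render_fps):
    """Approximate frames actually shown per second with VSync on."""
    if render_fps >= REFRESH_HZ:
        return REFRESH_HZ      # surplus frames wait for the next refresh
    # The GPU misses refreshes, so each frame is held for two cycles.
    return REFRESH_HZ // 2

print(vsync_displayed_fps(100))  # capped at 60
print(vsync_displayed_fps(55))   # just under 60 collapses to 30
```

The second case is the painful one: rendering at 55fps doesn’t get you 55fps on screen, it gets you 30, which is exactly the stutter the paragraph above describes.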

G-Sync, as an adaptive sync technology, resolves these problems. It essentially works in the opposite way: rather than limiting the GPU’s outputted frames so that it matches the monitor’s refresh rate, G-Sync changes the monitor’s refresh rate on the fly to match the GPU’s outputted framerate. All the changes happen on the monitor frame display side, not the GPU render side.
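A hypothetical sketch of that inversion, with an assumed 30–144Hz panel range for illustration: the monitor simply adopts whatever refresh rate matches the GPU’s output, clamped to what the panel supports.

```python
# Illustrative sketch of adaptive sync: instead of forcing the GPU
# to wait, the monitor times each refresh to the arrival of a new
# frame, so refresh rate tracks framerate within the panel's range.

PANEL_MIN_HZ = 30   # assumed panel limits for this example
PANEL_MAX_HZ = 144

def adaptive_refresh_hz(render_fps):
    """Refresh rate the panel adopts for a given framerate."""
    return max(PANEL_MIN_HZ, min(PANEL_MAX_HZ, render_fps))

for fps in (45, 90, 160):
    print(f"{fps} fps -> panel refreshes at {adaptive_refresh_hz(fps)} Hz")
```

Within the panel’s range, every rendered frame gets its own refresh – no tearing, no waiting, and no halved framerates.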

A proprietary NVIDIA scaler module sits inside all G-Sync monitors. This module helps match the monitor’s refresh rate to the GPU’s outputted frame rate to prevent screen tearing. When paired with an NVIDIA graphics card of 600-series or later, NVIDIA software can enable G-Sync to work alongside this module to eliminate screen tearing.

G-Sync Compatible vs G-Sync

G-Sync monitors have an NVIDIA scaler module; monitors that are only ‘G-Sync compatible’ do not. ‘G-Sync compatible’ is essentially a marketing term for monitors that lack the proprietary scaler module but have nevertheless been validated by NVIDIA to run G-Sync smoothly when it’s enabled in NVIDIA software.

When G-Sync is enabled in the software without a corresponding scaler module inside the monitor – i.e., when using a G-Sync ‘compatible’ monitor – the software is instead using the monitor’s built-in ‘Adaptive Sync’. Adaptive Sync is an open standard that forms part of the DisplayPort spec. If a monitor is designated ‘G-Sync compatible’ by NVIDIA, this means that NVIDIA has confirmed that gamers should have a good experience enabling G-Sync on the monitor, which will use the open ‘Adaptive Sync’ rather than proprietary G-Sync hardware.

What is FreeSync?

AMD has developed its own adaptive sync technology to compete with G-Sync, called FreeSync. Unlike G-Sync, FreeSync relies on the open ‘Adaptive Sync’ standard built into the DisplayPort specification. FreeSync monitors don’t have a FreeSync scaler module inside for AMD software to communicate with; instead, AMD software communicates with the monitor’s own firmware and uses Adaptive Sync’s variable refresh rate technology to synchronise the monitor’s refresh rate with the game’s framerate.

In other words, FreeSync monitors do the same as G-Sync compatible monitors, except they do so by interfacing with ‘FreeSync’ software rather than ‘G-Sync’ software.

G-Sync vs FreeSync: Which one should you use?

Deciding whether to use a G-Sync or FreeSync monitor used to be pretty simple: if you have an NVIDIA card go with G-Sync, and if you have an AMD card go with FreeSync. But NVIDIA cards can now use Adaptive Sync on monitors listed as ‘G-Sync compatible’, and AMD cards can use Adaptive Sync on the latest G-Sync monitors. This means it’s not so cut and dried anymore.

If you’re considering buying a G-Sync or FreeSync gaming monitor, you should of course check whether it will be compatible with your graphics card. Once this is confirmed, however, there’s little to sway the decision one way or the other.

Both G-Sync and FreeSync monitors’ adaptive sync technologies are near-flawless. Both use variable refresh rates to eliminate screen tearing, and both do this well. Both also offer some form of low framerate compensation (LFC) on monitors with a wide enough gap between their minimum and maximum refresh rates, where frames are doubled to maintain synchronisation when framerates dip below a certain minimum threshold. Both even support adaptive sync with HDR enabled, via G-Sync Ultimate or FreeSync Premium Pro.
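The LFC idea can be sketched quickly in Python. The 48–144Hz range here is an assumed example panel, not a spec for any particular monitor:

```python
# Illustrative sketch of low framerate compensation (LFC): when the
# framerate falls below the panel's minimum refresh rate, each frame
# is displayed a whole number of extra times so that the effective
# refresh rate stays inside the supported range.

PANEL_MIN_HZ = 48   # assumed panel limits for this example
PANEL_MAX_HZ = 144

def lfc_refresh_hz(render_fps):
    """Multiply a low framerate back into the panel's refresh range."""
    hz = render_fps
    while hz < PANEL_MIN_HZ:
        hz *= 2            # show each frame one extra time
    return min(hz, PANEL_MAX_HZ)

print(lfc_refresh_hz(30))  # frames doubled: panel runs at 60 Hz
print(lfc_refresh_hz(20))  # frames quadrupled: panel runs at 80 Hz
```

So a 30fps dip on a 48–144Hz panel is handled by showing each frame twice at 60Hz, keeping the variable refresh mechanism engaged even below the panel’s floor.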

How to decide, then? The honest answer is: You can’t go wrong with either a G-Sync, G-Sync compatible, or FreeSync monitor, providing your graphics card supports the respective technology. Over the years both standards have become near-perfect, which is probably why NVIDIA has decided to open G-Sync tech to AMD and vice versa. When deciding on a gaming monitor, focus on its resolution, refresh rate, response time, and all other such factors. Whether it’s FreeSync or G-Sync, if your graphics card is compatible, it should work a treat and eliminate screen tears without any performance hit.

Jacob Fox

Jacob's been tinkering with computer hardware for over a decade, and he's written hardware articles for various PC gaming websites. Outside of the wonderful world of PC hardware, he's currently undertaking a PhD in philosophy, with a focus on topics surrounding the meaning of life.
