G-Sync vs FreeSync: Which Adaptive Sync Tech is Better?

If you’re in the market for a new high-end gaming monitor but you’re unsure whether to get a G-Sync or FreeSync display, this article highlights the differences between the two adaptive sync technologies to help you make the right decision.

Back when VSync was the only solution to screen tearing, PC gamers often had to decide between atrocious input lag and obnoxious screen tearing. But, for the past few years, many of the best gaming monitors have offered VSync alternatives: NVIDIA G-Sync and AMD FreeSync.

There’s a lot of jargon involved in the G-Sync vs FreeSync debate: Do you want a G-Sync ‘Ultimate’ monitor? A G-Sync ‘Compatible’ one? Should you go for FreeSync 1 or FreeSync 2? The list goes on, but don’t worry, the differences between many of these marketing terms are pretty simple.

What’s more, nowadays you don’t have to pay quite so much attention to whether you have an NVIDIA or AMD graphics card, because many of the best graphics cards – whether from NVIDIA or AMD – should be compatible with the adaptive sync tech of many of the very latest G-Sync or FreeSync monitors. You should always check compatibility in advance, though.

What is Screen Tearing?

When you game, your graphics card renders a certain number of frames each second (your ‘framerate’), which it sends to your screen to display. Your monitor displays the frames it receives at a set number per second, and the rate at which it does this is called the monitor’s ‘refresh rate’, measured in Hz. A 60Hz monitor, for example, displays 60 of the frames it’s received from the GPU per second.

Screen Tearing Example

Problems can occur, however, when the monitor’s refresh rate and your game’s rendered framerate don’t align, because without synchronised framerate and display output a monitor might sometimes display multiple frames on the screen at once. When a monitor displays two or more frames at once, this makes it look like the screen is torn – with the bottom half of the screen displaying one frame and the top half displaying another, for example. This effect is what is meant by ‘screen tearing’.
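To make that concrete, here’s a tiny Python sketch – not how any real display works internally, just an illustration with assumed numbers (a 100fps game on a 60Hz monitor) – showing how often an unsynchronised frame finishes while the screen is part-way through drawing, which is exactly when you’d see a tear:

```python
# A minimal sketch (illustrative only): with an unsynchronised 100 fps source
# and a 60 Hz display, new frames regularly arrive while the panel is
# mid-scanout, so one refresh shows parts of two frames (a visible "tear").

REFRESH_HZ = 60       # assumed monitor refresh rate
FPS = 100             # assumed GPU output

refresh_interval = 1 / REFRESH_HZ
frame_interval = 1 / FPS

# Times at which the GPU finishes rendering each frame (two seconds' worth).
frame_times = [i * frame_interval for i in range(200)]

tears = 0
for r in range(REFRESH_HZ):                   # simulate one second of refreshes
    start = r * refresh_interval
    end = start + refresh_interval
    # Frames that become available while this refresh is still scanning out.
    arrivals = [t for t in frame_times if start < t < end]
    if arrivals:
        tears += 1                            # buffer switched mid-scan: tear line

print(f"{tears} of {REFRESH_HZ} refreshes contained a tear in this simple model")
```

Run it and you’ll see that, in this simplified model, almost every single refresh ends up torn when the framerate and refresh rate aren’t synchronised.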

What is G-Sync?

NVIDIA G-Sync is an adaptive sync technology that prevents screen tearing while gaming. The traditional way to stop screen tearing was to enable ‘VSync’ (vertical sync) in games. VSync works by limiting the number of frames sent by the GPU to the monitor each second so that this matches the screen’s refresh rate. For example, if your GPU is outputting 100fps (frames per second) but your monitor’s refresh rate is 60Hz, VSync would limit your framerate to 60fps so that it’s synchronised with the monitor’s refresh rate. This synchronisation prevents screen tearing.
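If you want a feel for what that capping looks like in practice, here’s a rough Python sketch of the idea – it’s not a real graphics API or driver, just a loop that renders a pretend frame and then waits for the next simulated 60Hz refresh boundary before presenting it:

```python
# A minimal illustrative sketch of the idea behind VSync (not a real driver or
# graphics API): after rendering, wait for the next refresh boundary before
# presenting, so no more than one new frame is shown per refresh.

import time

REFRESH_HZ = 60                     # assumed monitor refresh rate
REFRESH_INTERVAL = 1 / REFRESH_HZ

def render_frame():
    """Stand-in for the game's rendering work (hypothetical)."""
    time.sleep(0.004)               # pretend rendering takes 4 ms (~250 fps uncapped)

start = time.perf_counter()
for frame in range(10):
    render_frame()
    # Wait until the next simulated refresh boundary before "presenting".
    elapsed = time.perf_counter() - start
    next_refresh = (int(elapsed / REFRESH_INTERVAL) + 1) * REFRESH_INTERVAL
    time.sleep(next_refresh - elapsed)

elapsed = time.perf_counter() - start
print(f"10 frames presented in {elapsed:.3f}s -> ~{10 / elapsed:.0f} fps (capped near 60)")
```

Even though the pretend renderer could manage around 250fps on its own, the wait-for-refresh step holds the output to roughly 60fps.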

However, while VSync prevents screen tearing, it also generates input delay, because inputted movements only appear on the screen after the GPU has waited for the monitor to refresh. Furthermore, having VSync enabled locks your framerate to a single refresh rate, which can cause visual stutters if your framerate drops below this point.
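Some quick back-of-the-envelope numbers – assuming a simple double-buffered 60Hz setup, purely for illustration – show why both of these downsides matter:

```python
# Rough numbers for VSync's downsides (illustrative, simplified double-buffered model).
import math

REFRESH_HZ = 60
refresh_ms = 1000 / REFRESH_HZ      # ~16.7 ms between refreshes

# 1) Input delay: a frame finished just after a refresh waits almost a full interval.
print(f"Worst-case extra wait before a frame is shown: ~{refresh_ms:.1f} ms")

# 2) Stutter: a frame that takes slightly longer than one refresh (say 17 ms)
#    misses its slot and is held until the refresh after that.
render_ms = 17
refreshes_waited = math.ceil(render_ms / refresh_ms)
print(f"A {render_ms} ms frame waits {refreshes_waited} refreshes "
      f"-> effective {REFRESH_HZ / refreshes_waited:.0f} fps")
```

In other words, miss the 60Hz window by even a millisecond and your effective framerate halves to 30fps – which is exactly the stutter described above.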

G-Sync, as an adaptive sync technology, resolves these problems. It essentially works in the opposite way: rather than limiting the GPU’s outputted frames so that they match the monitor’s refresh rate, G-Sync changes the monitor’s refresh rate on the fly to match the GPU’s outputted framerate. All the changes happen on the monitor’s display side, not the GPU render side.
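Here’s a small conceptual sketch of that idea – not NVIDIA’s actual module logic, just an illustration assuming a panel with a 48–144Hz variable refresh range: the display simply refreshes in step with whatever frame time the GPU delivers, clamped to that range.

```python
# A minimal conceptual sketch of variable refresh rate (not NVIDIA's module
# logic): instead of refreshing on a fixed clock, the display refreshes when a
# new frame arrives, clamped to an assumed supported range.

VRR_MIN_HZ, VRR_MAX_HZ = 48, 144          # assumed panel range
MIN_INTERVAL = 1 / VRR_MAX_HZ             # can't refresh faster than the max Hz
MAX_INTERVAL = 1 / VRR_MIN_HZ             # can't hold a frame longer than this

def refresh_interval_for(frame_time_s):
    """The display refreshes in step with the GPU, within the panel's limits."""
    return min(max(frame_time_s, MIN_INTERVAL), MAX_INTERVAL)

for fps in (60, 100, 160):
    interval = refresh_interval_for(1 / fps)
    print(f"GPU at {fps} fps -> panel refreshes every {interval * 1000:.1f} ms "
          f"(~{1 / interval:.0f} Hz)")
```

Notice that at 160fps the panel in this example simply tops out at its 144Hz maximum – the refresh rate follows the framerate only within the range the hardware supports.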

A proprietary NVIDIA scaler module sits inside all G-Sync monitors. This module helps match the monitor’s refresh rate to the GPU’s outputted frame rate to prevent screen tearing. When paired with an NVIDIA graphics card of the 600 series or later, NVIDIA software can enable G-Sync to work alongside this module to eliminate screen tearing.

G-Sync Compatible vs G-Sync

G-Sync monitors have an NVIDIA scaler module; monitors that are only ‘G-Sync compatible’ do not. ‘G-Sync compatible’ is essentially a marketing term for monitors that lack the proprietary scaler module but have nevertheless been validated by NVIDIA to run G-Sync smoothly when it’s enabled in NVIDIA software.

When G-Sync is enabled in the software without a corresponding scaler module inside the monitor – i.e., when using a G-Sync ‘compatible’ monitor – the software is instead using the monitor’s built-in ‘Adaptive Sync’. Adaptive Sync is an open standard that forms part of the DisplayPort spec. If a monitor is designated ‘G-Sync compatible’ by NVIDIA, this means that NVIDIA has confirmed that gamers should have a good experience enabling G-Sync on the monitor, which will use the open ‘Adaptive Sync’ rather than proprietary G-Sync hardware.

What is FreeSync?

AMD has developed its own adaptive sync technology to compete with G-Sync, called FreeSync. Unlike G-Sync, FreeSync relies on the open ‘Adaptive Sync’ standard built into the DisplayPort specification. FreeSync monitors don’t have a FreeSync scaler module inside for AMD software to communicate with; instead, AMD software communicates with the monitor’s own firmware and uses Adaptive Sync’s variable refresh rate technology to let the GPU adaptively synchronise the monitor’s refresh rate with a game’s framerate.

In other words, FreeSync monitors do the same as G-Sync compatible monitors, except they do so by interfacing with ‘FreeSync’ software rather than ‘G-Sync’ software.

G-Sync vs FreeSync: Which one should you use?

Deciding whether to use a G-Sync or FreeSync monitor used to be pretty simple: if you have an NVIDIA card, go with G-Sync; if you have an AMD card, go with FreeSync. But NVIDIA cards can now use Adaptive Sync on monitors listed as ‘G-Sync compatible’, and AMD cards can use Adaptive Sync on the latest G-Sync monitors. This means it’s not so cut and dried anymore.

If you’re considering buying a G-Sync or FreeSync gaming monitor, you should of course check whether it will be compatible with your graphics card. Once that’s confirmed, though, there’s little to sway the decision one way or the other.

Both G-Sync and FreeSync monitors’ adaptive sync technologies are near-flawless. Both use variable refresh rates to eliminate screen tearing, and both do this well. Both also offer some form of low framerate compensation (LFC) on monitors whose maximum refresh rate is sufficiently higher than their minimum (typically at least double), where frames are doubled to maintain synchronisation when framerates dip below the panel’s minimum refresh rate. Both even support adaptive sync with HDR enabled, via G-Sync Ultimate or FreeSync Premium Pro.
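As a rough illustration of how LFC works – the real driver heuristics are more involved, and the 48–144Hz panel below is just an assumed example – here’s a sketch showing how each frame gets repeated once the framerate drops below the panel’s minimum refresh rate:

```python
# A rough sketch of the idea behind low framerate compensation (LFC); real
# driver heuristics are more involved. When the framerate falls below the
# panel's minimum refresh rate, each frame is repeated so the panel can stay
# inside its supported variable refresh range.

VRR_MIN_HZ, VRR_MAX_HZ = 48, 144          # assumed panel range (max >= 2x min)

def lfc_refresh_rate(fps):
    """Return (times each frame is shown, resulting panel refresh rate)."""
    repeats = 1
    while fps * repeats < VRR_MIN_HZ and fps * (repeats + 1) <= VRR_MAX_HZ:
        repeats += 1                       # show the same frame again
    return repeats, fps * repeats

for fps in (100, 45, 30, 20):
    repeats, hz = lfc_refresh_rate(fps)
    print(f"{fps} fps -> each frame shown {repeats}x, panel runs at ~{hz:.0f} Hz")
```

So a game chugging along at 30fps on this example panel would have each frame shown twice, keeping the panel at 60Hz – comfortably inside its supported range – without reintroducing tearing or stutter.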

How do you decide, then? The honest answer is: you can’t go wrong with a G-Sync, G-Sync compatible, or FreeSync monitor, provided your graphics card supports the respective technology. Over the years both standards have become near-perfect, which is probably why NVIDIA has opened its G-Sync tech up to AMD and vice versa. When deciding on a gaming monitor, focus on its resolution, refresh rate, response time, and other such factors. Whether it’s FreeSync or G-Sync, if your graphics card is compatible, it should work a treat and eliminate screen tearing without any performance hit.

Jacob Fox

Jacob's been tinkering with computer hardware for over a decade, and he's written hardware articles for various PC gaming websites. Outside of the wonderful world of PC hardware, he's currently undertaking a PhD in philosophy, with a focus on topics surrounding the meaning of life.
