If you’re a gamer looking to upgrade your monitor, one feature you’ve undoubtedly heard about is NVIDIA’s G-Sync. And it’s likely that you’ve heard a mixed range of views on whether or not G-Sync is worth it.
In this guide, we’re going to quickly break down what G-Sync is, how it helps gamers, some of its downsides, what the alternative options are, and, ultimately, whether or not it’s worth using.
Update on G-Sync for 2019
NVIDIA recently released new drivers that allow NVIDIA GPUs to work with certain adaptive sync and FreeSync monitors. This changes the G-Sync picture considerably, as it gives gamers with NVIDIA GPUs a much more affordable path to a G-Sync-capable monitor.
As of right now, only a handful of FreeSync monitors are getting support for NVIDIA’s G-Sync technology. But this move by NVIDIA is a shift in philosophy and will help them level the playing field with AMD in the mid-range/budget GPU market.
What is G-Sync and How Can it Help You?
Standard monitors operate at a fixed refresh rate, meaning they redraw the screen at the same speed (frequency) all of the time. GPUs, on the other hand, do not operate at a fixed rate. So, a GPU can (and often does) produce images (frames) faster or slower than a monitor is able to display them.
This mismatch between the frames your GPU renders and the frames your monitor displays creates screen tearing. Because the monitor redraws on its own fixed schedule, it can start displaying a new frame before the GPU has finished delivering it, and this can happen whether your GPU’s framerate is above or below your monitor’s refresh rate.
In either case, you, the user, see what is known as screen tearing: parts of two different frames appear on screen at once, making the image look split.
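To make the timing mismatch concrete, here’s a toy simulation (all timings are made up for illustration, not real hardware numbers): a fixed 60Hz monitor scans out on a rigid schedule, and any scan-out that lands while the GPU is still midway through writing a frame shows a torn image.

```python
REFRESH_HZ = 60
REFRESH_INTERVAL = 1000 / REFRESH_HZ   # ~16.7 ms between scan-outs
FRAME_WRITE_MS = 4                     # assumed time to write one frame

def count_tears(frame_starts, total_ms=100):
    """Count fixed-schedule scan-outs that land mid-frame-write."""
    tears = 0
    t = 0.0
    while t < total_ms:
        # A tear: this scan-out begins while some frame is half-written,
        # so the screen shows part of the old frame and part of the new one
        if any(s < t < s + FRAME_WRITE_MS for s in frame_starts):
            tears += 1
        t += REFRESH_INTERVAL
    return tears

# Hypothetical, unsynchronized GPU frame start times (ms)
print(count_tears([0, 15, 31, 48, 65, 81]))  # → 5
```

The point of the sketch is just that the monitor’s schedule and the GPU’s output are independent clocks; nothing keeps them from colliding.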
VSync is one feature that fixes screen tearing. Essentially, it forces the graphics card to produce frames at the same rate as the monitor’s refresh rate by capping the graphics card’s framerate. This effectively eliminates screen tearing. However, it creates another common issue called stuttering.
Even though VSync caps the graphics card so it can never exceed the monitor’s refresh rate, graphics cards don’t produce frames at a fixed rate, so they can still fall below the monitor’s refresh rate. Stuttering happens when your graphics card can’t keep up and the monitor has to re-show the previous frame while it waits for the next one to be rendered. On the user’s end, it looks like a “stutter” in movement.
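The same kind of toy model can illustrate stuttering under VSync (again, the render times below are hypothetical): with VSync on, a finished frame has to wait for the next refresh boundary, and any frame that takes longer than one refresh interval forces the monitor to re-show the previous frame.

```python
import math

REFRESH_MS = 1000 / 60   # 60Hz monitor: ~16.7 ms per refresh

def count_stutters(render_times_ms):
    """Count frames that missed a refresh, forcing a repeated frame."""
    stutters = 0
    shown_at = 0.0                        # previous frame shown at t=0
    for rt in render_times_ms:
        ready = shown_at + rt
        # With VSync, the buffer flip waits for the next refresh boundary
        next_refresh = math.ceil(ready / REFRESH_MS) * REFRESH_MS
        # More than one interval since the last flip means the old frame
        # was displayed again at least once -- a stutter
        if next_refresh - shown_at > REFRESH_MS + 1e-9:
            stutters += 1
        shown_at = next_refresh
    return stutters

# Hypothetical render times: the two heavy frames exceed one interval
print(count_stutters([12, 14, 25, 13, 30, 15]))  # → 2
```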
G-Sync to the Rescue
G-Sync solves both the problem of screen tearing and stuttering at the same time. As you might have already guessed, rather than capping the graphics card’s frame rate so that it cannot exceed the monitor’s refresh rate, NVIDIA’s G-Sync technology allows the monitor to operate at a variable frequency to match the output of the graphics cards.
So, with a G-Sync monitor, if your graphics card is producing 75 frames per second (FPS), your monitor will operate at a 75Hz refresh rate. If you run into a more demanding situation in a game that requires your graphics card’s framerate to drop, then G-Sync will drop its refresh rate frequency to match the new framerate that the GPU is working at.
This eliminates screen tearing and stuttering and helps games feel much smoother to play.
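Sticking with the same toy model (hypothetical numbers, and a deliberately simplified picture of what the G-Sync module actually does), a variable-refresh panel simply refreshes whenever a frame is ready, within its supported range, so the display interval tracks the GPU exactly:

```python
def display_intervals(render_times_ms, min_ms=1000/144, max_ms=1000/30):
    """Refresh-to-refresh intervals when the panel follows the GPU,
    clamped to the panel's supported variable-refresh window
    (assumed here to be 30-144Hz)."""
    return [min(max(rt, min_ms), max_ms) for rt in render_times_ms]

# Every frame lands inside the window, so the display simply follows
# the GPU: no mid-write scan-outs (tearing), no repeated frames (stutter)
print(display_intervals([12, 14, 25, 13, 30, 15]))  # → [12, 14, 25, 13, 30, 15]
```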
The Downsides of G-Sync
While NVIDIA’s G-Sync technology works well and, for the most part, does what it sets out to do, it isn’t without its downsides.
Currently, there are two main issues that users have with G-Sync: cost and compatibility.
G-Sync is Expensive (But, Cheaper G-Sync Monitors Are Starting to Arrive!)
Unlike other adaptive sync technologies out there (like AMD’s FreeSync), NVIDIA’s G-Sync is a hardware-based solution. Rather than relying on software to drive the monitor at a variable refresh rate, G-Sync monitors have a dedicated module installed in them that allows the panel to operate at a variable refresh rate.
Building that G-Sync module into a monitor to make it G-Sync ready drives up its cost. When you compare two monitors side-by-side with all other features equal, the one with G-Sync is going to cost significantly more than the one without it.
UPDATE 2019: The good news, though, is that NVIDIA has finally caved to pressure and started allowing their GPUs to work with FreeSync monitors. As of January 15th, NVIDIA has released GeForce drivers that make NVIDIA GPUs compatible with a select number of FreeSync monitors.
This means that you can now get the benefits that NVIDIA’s G-Sync offers through a more affordable FreeSync monitor that meets NVIDIA’s G-Sync requirements.
G-Sync Won’t Work With Your AMD GPU
If you already have an AMD graphics card, or you were planning on getting one, you won’t be able to utilize NVIDIA’s G-Sync technology. G-Sync can only be utilized by NVIDIA graphics cards. You could still buy a G-Sync monitor, but you would pay a premium to do so, and, again, you wouldn’t be able to use G-Sync.
If you do have a newer AMD graphics card or are planning on buying one, AMD’s FreeSync technology is their answer to G-Sync. Unlike G-Sync, FreeSync is a royalty-free, software-based solution built on VESA’s Adaptive-Sync standard, so FreeSync-compatible monitors are much less expensive than G-Sync monitors.
However, while G-Sync is more expensive, it also offers a more consistent experience. NVIDIA works directly with display manufacturers to ensure that G-Sync is implemented correctly, which means every G-Sync monitor out there has first had to meet NVIDIA’s specific set of requirements for G-Sync certification.
FreeSync, on the other hand, is much more open to hardware manufacturers, and, as a result, how well FreeSync works can vary from monitor to monitor. On cheaper FreeSync monitors, there may only be a narrow range in which FreeSync will use a variable refresh rate. For instance, on the ~$130 AOC G2460VQ6 24” monitor, FreeSync only operates between framerates of 48-75 FPS. If your framerate goes higher or lower than that on that monitor, FreeSync stops working and you have the same issues as with a fixed-refresh-rate monitor.
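That range limitation is simple to express as a check (the 48-75 FPS window here is taken from the example monitor above; other monitors have different windows):

```python
def adaptive_sync_active(fps, low=48, high=75):
    """True while the framerate sits inside the monitor's FreeSync window."""
    return low <= fps <= high

for fps in (40, 60, 90):
    # Only the 60 FPS case falls inside the 48-75 window; at 40 or 90
    # FPS this example monitor behaves like a fixed-refresh-rate display
    print(fps, adaptive_sync_active(fps))
```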
So, while FreeSync is more affordable and can produce similar results to G-Sync, just because a monitor is FreeSync-enabled doesn’t mean it will work in every instance.
So… Are G-Sync Monitors Worth It?
Whether or not G-Sync is worth it depends on the user. If you have a larger budget, you’ll likely want to check out one of the higher-end displays that still have the G-Sync module built directly into the monitor.
If you are working with a tighter budget and you want to get a budget-friendly NVIDIA GPU, you’re not completely out of luck. As of January 15th, there are a handful of FreeSync monitors that now support NVIDIA’s G-Sync technology. So, that is a route you might want to look into.
If you already have a computer, though, and it has an AMD graphics card in it, you’re either going to have to switch to an NVIDIA GPU (which is going to up your costs even more) or—probably the better option—choose a Freesync monitor instead.
Ultimately, where in the past G-Sync monitors mostly suited NVIDIA users with large budgets and high-end gaming computers, the move to support G-Sync on FreeSync monitors has opened up the option of a G-Sync monitor to anyone with an NVIDIA GPU.