When looking for a new monitor, one of the toughest decisions an enthusiast faces is choosing between Nvidia G-Sync and AMD FreeSync. For the most demanding users, picking one of these dynamic refresh rate technologies is practically mandatory to avoid problems such as stuttering, input lag, and screen tearing. In this article, we look at the differences between Nvidia G-Sync and AMD FreeSync.
Although G-Sync monitors are made for PCs with Nvidia GPUs and FreeSync monitors for users with AMD GPUs, both promise the same thing: a fluid image. But with comparable configurations, is one technology better than the other?
In this article, we compare the two technologies to see if there is a real winner. To compare them, we equipped ourselves with two almost identical AOC monitors: an Agon AG241QG with G-Sync and an Agon AG241QX with FreeSync (which, for completeness, is also G-Sync Compatible). We will compare the gaming experience on both monitors, along with speed and accuracy both before and after calibration, to see if there is a real difference between the two.
A short history
For years, games with real-time rendered graphics have suffered from an artifact called frame tearing. Put simply, it occurs when the video card sends frames that are out of sync with the monitor’s refresh rate. For most PC monitors, we’re talking about 60 Hz, but panels that operate at 144 Hz and beyond also show this problem.
What was needed to eliminate this annoying artifact was a way to synchronize the monitor and the video card so that frames were drawn only at the beginning of a refresh cycle. The simplest solution was to build monitors with a variable refresh rate. And that is what Nvidia did in 2014.
In its first version, G-Sync was a $200 kit that users had to buy and install in their monitors themselves. Obviously this was not sustainable in the long term, which is why manufacturers began to include the module in purpose-built displays. This is the form in which G-Sync is sold today.
AMD, however, did not stand by and watch, and created something even more disruptive. In 2015 it presented FreeSync, a technology with the same basic concept but included in the DisplayPort specification. It was, and still is, completely free.
All a manufacturer needs to do is support DisplayPort 1.2a and implement the function via firmware. A remarkable and decidedly cheaper result for AMD users, who were thus able to enjoy Adaptive Sync without paying more for a “special” monitor.
Difference Between Nvidia GSync vs AMD FreeSync
G-Sync vs. FreeSync: comparison of features
G-Sync and FreeSync do the same thing, in the sense that they couple the monitor and the graphics card so that frames are drawn only at the beginning of a panel refresh cycle. The refresh rate varies in real time, so there is never an incomplete frame on the screen. In a nutshell: there is no difference between what a G-Sync user and a FreeSync user see. Either you have tearing or you don’t, and if adaptive sync is working, you won’t be able to tell G-Sync and FreeSync apart.
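The timing idea is easy to see in a toy simulation (a simplified sketch for illustration only; neither vendor’s driver works at this level, and the function name and frame times below are invented):

```python
def count_torn_frames(frame_times_ms, refresh_interval_ms=None):
    """Count frames that finish mid-refresh on a fixed-rate display.

    With a fixed refresh interval and vsync off, any frame that completes
    partway through a scan-out produces a tear line. With adaptive sync
    (refresh_interval_ms=None), the panel waits for each finished frame,
    so a refresh never starts with a partial frame.
    """
    if refresh_interval_ms is None:
        return 0  # adaptive sync: every refresh begins on a frame boundary
    torn = 0
    elapsed = 0.0
    for ft in frame_times_ms:
        elapsed += ft
        # frame completion lands inside a refresh cycle -> visible tear
        if elapsed % refresh_interval_ms != 0:
            torn += 1
    return torn

# Irregular GPU frame times against a fixed ~60 Hz panel tear often;
# the same frames on an adaptive-sync panel never do.
frames = [14, 18, 16, 22, 15]           # hypothetical render times in ms
print(count_torn_frames(frames, 16.0))  # fixed 16 ms refresh interval
print(count_torn_frames(frames))        # adaptive sync
```

The point of the sketch is only that tearing is a scheduling mismatch: move the refresh to the frame instead of the frame to the refresh, and the mismatch disappears.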
In 2015 we set up a blind test to determine whether users could notice any difference. Although the methodology had some basic problems, the participants chose G-Sync quite clearly, a choice attributable to the relative novelty of FreeSync and the wider operating range of G-Sync. Is the situation the same in 2019?
Today we have “G-Sync Ultimate” and “FreeSync 2 HDR” to support HDR content and its wider bandwidth. Both are updates of the original solutions and offer greater functionality and better performance in a greater variety of situations than in the past.
FreeSync 2 HDR requires DisplayPort 1.4 for a complete implementation. In addition to FreeSync itself, it supports 144 Hz operation, HDR10, 4K gaming, and the extended DCI-P3 color gamut. To get the same features from Nvidia, you simply have to spend a little more and buy a G-Sync monitor with the same specifications.
There is one more thing to keep in mind: some FreeSync monitors, labeled “G-Sync Compatible,” can actually activate G-Sync when connected to an Nvidia GTX 10-series or newer graphics card, though without HDR.
Most manufacturers today build and sell versions of the same monitor in both G-Sync and FreeSync variants. Obviously, if a user already has an AMD or an Nvidia GPU in their PC, the choice between FreeSync and G-Sync is effectively made for them. But if you are building a system from scratch and have to choose, which one should you opt for?
G-Sync vs. FreeSync: test and comparison
As mentioned above, we equipped ourselves with two monitors: the AOC Agon AG241QG (G-Sync) and the AOC Agon AG241QX (FreeSync). Both have a TN panel with QHD resolution (2560×1440). The FreeSync monitor reaches 144 Hz, while the G-Sync model overclocks up to 165 Hz. Both handle SDR signals only; neither supports HDR or an extended color gamut.
Speaking of available ports: the QX has one DisplayPort and two HDMI 2.0 inputs, while the QG comes with one DisplayPort and one HDMI 1.4. The QX also offers VGA and DVI. Both screens have four downstream USB 3.0 ports plus one upstream, a built-in headphone jack, and speakers. Aside from a one-watt difference in speaker power, they are almost identical on paper.
Other than that, there isn’t much that differentiates the two monitors. Physically they are identical in size, shape, and style. Neither has a blur-reduction function, and while the QX has a joystick controller, the QG uses buttons on the bezel to navigate the OSD.
To understand whether G-Sync or FreeSync offers players something different, we put our two monitors through a synthetic color and luminance test suite to check if one is more accurate or has better contrast than the other. In theory, the test results should be similar for both, since these monitors use the same AU Optronics panel.
Luminance, color and response values
The final results surprised us: in particular, the QX stood out in the brightness (peak white) and contrast tests. Evidently different teams engineered the electronics around the same panel, because the black levels and dynamic range (maximum contrast and calibrated contrast) are markedly different.
The AG241QX with FreeSync delivered typical performance for a current TN panel, with a contrast ratio above 918:1. The calibrated results were almost identical. The conclusion is clear: viewed side by side, the QX has the better image. The QG, in comparison, looks washed out.
The results of the gamma and color tests are much closer, even if the G-Sync screen required calibration while the FreeSync one did not. With factory scores of 2.17 dE and 2.79 dE for grayscale and color, respectively, the QX did not require major adjustments. The same cannot be said of the QG, which definitely needs optimization; further details are in the CalMAN charts below.
In the speed tests, the difference comes down to the 144 Hz of the QX against the 165 Hz of the QG. Raw input lag is the same on both displays, at 25 ms, while panel response differed by 2 ms. Nobody can perceive a 2 ms difference, so when it comes to response, the two panels are effectively identical.
Default color and grayscale values
The CalMAN charts help illustrate the values we recorded. Out of the box, the AG241QG shows warm tones lacking in blue, and gamma is also quite far from target. This compounds the flat image of a monitor with below-average contrast: there is plenty of brightness, but detail does not emerge as it should, and blacks look more like a medium shade of gray.
The AG241QX, on the other hand, arrives with a grayscale fairly close to D65, which matches content mastered in the Adobe RGB or sRGB color space. Only at 100 percent brightness does it go beyond 3 dE, and only barely. Gamma is also acceptable.
Grayscale and gamma precision greatly affect color. Unfortunately, the AG241QG shows visible hue errors in red, green, and cyan, thanks to a white point that leans toward yellow. Red is also slightly undersaturated, but this seems to be a panel trait, because the AG241QX exhibits the same behavior.
Calibrated color and grayscale measurements
Apart from that undersaturated red, the QX is solid in the accuracy of the color gamut it can offer. Its hue errors are smaller than the QG’s, and all errors are below the visible threshold, thanks to its correctly adjusted grayscale and gamma.
Although both monitors benefit from some adjustment, the AG241QG is the one that gains the most. After calibration, the QG wins the grayscale match with a score of 0.76 dE; however, the QX is still well within our standard of accuracy.
Color gamut errors are now at a minimum between the two, with a difference of just 0.01dE. If you have one of these monitors, feel free to try our calibration settings.
What do FreeSync and G-Sync offer?
Both technologies eliminate the problem described above by forcibly synchronizing the screen and the graphics accelerator: a new frame is presented exactly when a refresh begins. This approach saves energy, spares monitor resources, and makes gameplay and video watching visually more enjoyable.
There is only one potential drawback to this interaction: in games where the computer cannot deliver sufficient performance (and in films, which typically use an anachronistically low frame rate of 23.976 fps), the user may notice a characteristic flicker, a dark bar running from one side of the screen to the other, because the image is updated line by line rather than all at once. Solutions to this annoyance also exist. The simplest that comes to mind is plain frame doubling (or any other multiple increase, inserting intermediate frames between the existing ones, with or without motion compensation; this is what many TVs with very high refresh rates do, and it makes the image look livelier).
What are the differences?
The fundamental difference between AMD’s and Nvidia’s approaches is that the former implements control of the screen’s refresh rate through the DisplayPort video interface, while the latter uses its own chip installed inside G-Sync-compatible monitors.
Several consequences follow from this.
- G-Sync costs more than FreeSync. The 27-inch Acer XG270HU monitor with G-Sync support costs about $100 more than the same monitor without it.
- G-Sync works with all reasonably recent Nvidia graphics cards, not just the latest ones; all that is needed is a driver update and a compatible monitor. On the AMD side, FreeSync support is currently implemented in only six graphics adapters (although it will be supported in all future GPUs as well).
- The technologies support different refresh ranges. AMD allows the wider one: 9–240 Hz (in practice, the figure is limited only by the monitor’s hardware). Nvidia’s range is noticeably narrower (30–144 Hz).
Suppose the frame rate in a game falls below the monitor’s minimum refresh rate. AMD’s answer is to disable synchronization completely and pin the refresh rate at its minimum. Nvidia, for its part, currently offers no way to turn synchronization off, but it can automatically duplicate frames to bring the effective rate back into the monitor’s operating range.
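The frame-duplication trick can be sketched in a few lines (a simplified illustration of the low-framerate compensation idea, not actual driver code; the function name is invented, and the 30–144 Hz defaults are assumptions taken from the Nvidia range quoted above):

```python
def lfc_refresh(fps, vrr_min=30.0, vrr_max=144.0):
    """Pick the smallest integer frame multiplier that lifts a low
    frame rate into the monitor's variable refresh range, in the
    spirit of low-framerate compensation (LFC)."""
    if fps >= vrr_min:
        return 1, fps  # already in range: drive the panel at the frame rate
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    if fps * multiplier > vrr_max:
        raise ValueError("frame rate too low to multiply into range")
    return multiplier, fps * multiplier

# A 23.976 fps film can be shown by presenting each frame twice
# (~47.95 Hz); a 10 fps scene needs each frame shown three times
# to reach the 30 Hz floor.
print(lfc_refresh(23.976))
print(lfc_refresh(10.0))
```

The design point is that the monitor never sees a refresh below its floor: the driver repeats whole frames, which is invisible to the viewer, instead of letting the panel fall out of its variable range.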
And what is better?
Times change, but the difficult choice between two near-equal options remains.
Neither Nvidia nor AMD has a decisive advantage in its implementation of adaptive refresh technology; each company’s solution has its own strengths. Owners of Nvidia video cards do, however, come away with a small moral victory: they do not need to buy a new video card to try G-Sync, only to connect a monitor compatible with the technology.
At the moment, Intel is the only player sitting this round out. Whether it will present its own variation on FreeSync and G-Sync, the near future will tell.