
G-Sync vs. FreeSync: Adaptive sync gaming monitors explained

Two rival technologies promise to rid games of stuttering and tearing.


Once you’ve used a variable refresh rate monitor, you can’t go back. AMD’s FreeSync and Nvidia’s G-Sync technology promise PC gamers a buttery-smooth experience free of stuttering, screen tearing, or V-Sync-like input lag. But both ecosystems have their perks—and their downsides.

Now that adaptive sync monitors have several years of maturation under their belts, it’s time to take stock of FreeSync vs. G-Sync yet again. Here’s everything you need to know about AMD and Nvidia’s gaming displays.

Editor’s note: This article was last updated to include availability information for FreeSync 2 and G-Sync HDR displays, and link to our review of the Acer Predator X27 G-Sync HDR monitor.

What is adaptive sync, or variable refresh rate?

Before we dive into the differences between FreeSync and G-Sync, let’s take a quick look at the adaptive sync, or variable refresh rate, technology underneath both.

Your graphics card pushes images to your monitor as fast as it can, but traditional monitors refresh their image at a set rate: a 60Hz monitor, say, refreshes every 1/60th of a second. When your graphics card delivers frames outside of that schedule, your monitor shows portions of two different frames onscreen simultaneously, resulting in the dreaded screen tearing. It looks as if the picture is trying to split itself in two and take off in different directions, and it only worsens the more your game's frame rate fluctuates. It's ugly. Damned ugly.
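To make that timing mismatch concrete, here's a minimal sketch in Python (purely illustrative, with made-up frame times; this is no vendor's actual logic) of how frames that arrive while a fixed 60Hz display is mid-scanout end up split across the screen:

```python
# Illustrative only: a 60Hz monitor starts a new scanout every ~16.7ms,
# whether or not the GPU's frames line up with that schedule.
REFRESH_INTERVAL = 1 / 60

def report_tears(frame_times, duration=0.1):
    """Flag scanouts that a freshly rendered frame interrupts mid-scan."""
    t = 0.0
    while t < duration:
        start, end = t, t + REFRESH_INTERVAL
        # A frame delivered while this scanout is in flight gets spliced
        # in partway down the screen: part old frame, part new. A tear.
        arrivals = [ft for ft in frame_times if start < ft < end]
        if arrivals:
            print(f"scanout at {start * 1000:5.1f}ms: frame arrived "
                  f"mid-scan at {arrivals[0] * 1000:.1f}ms -> visible tear")
        t = end

# A GPU rendering at an uneven ~55fps almost never lands on the 60Hz grid.
report_tears([0.004, 0.025, 0.043, 0.066, 0.081])
```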

AMD

Screen tearing is gross. FreeSync and G-Sync eliminate it.

Enabling V-Sync on your graphics card helps, but it introduces negatives of its own: stuttering and sluggish input lag, because the technology forces your graphics card to hold each new frame until the monitor is ready for it.

FreeSync and G-Sync eradicate all those problems by synchronizing your monitor's refresh rate with the frame rate of your graphics card (up to the monitor's maximum refresh rate). When your video card pushes out a new frame, the adaptive sync monitor displays it. Simple as that. If your graphics card is generating 52 frames a second, your monitor refreshes at 52Hz. The end result? Wonderfully smooth gameplay.
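Here's a minimal sketch of the adaptive sync idea (again illustrative, assuming a hypothetical 144Hz panel): the refresh interval simply follows the GPU's frame time, clamped at the panel's maximum.

```python
# Illustrative only: with adaptive sync, the monitor refreshes when a
# frame is ready, rather than on a fixed schedule.
MAX_REFRESH_HZ = 144  # hypothetical panel maximum

def adaptive_refresh_hz(frame_time_s):
    """Refresh at the GPU's pace, but never above the panel's maximum."""
    return min(1 / frame_time_s, MAX_REFRESH_HZ)

print(adaptive_refresh_hz(1 / 52))   # 52fps -> the panel runs at 52Hz
print(adaptive_refresh_hz(1 / 200))  # 200fps -> clamped to 144Hz
```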

FreeSync vs. G-Sync: Implementation

AMD and Nvidia take two very different approaches to adaptive sync technology.

FreeSync piggybacks atop the VESA Adaptive-Sync standard, an optional part of DisplayPort 1.2a. It works with the off-the-shelf display scalers that monitor makers already use, and AMD charges no royalties or licensing fees, so adding FreeSync to a monitor costs very little. Because of that openness, you'll find FreeSync available in a wide range of monitors, from affordable entry-level displays all the way up to premium gaming hardware.

Nvidia

G-Sync monitors rely on a proprietary hardware module from Nvidia.

G-Sync's requirements are much stricter. The technology requires display makers to use a proprietary hardware module, and Nvidia keeps a firm grip on quality control, working with manufacturers on everything from initial panel selection to display development to final certification.

That hardware module adds a decent amount of cost, and G-Sync monitors tend to start at higher prices because Nvidia positions the technology as a premium add-on for premium gaming displays. You won't often find G-Sync monitors paired with budget or mainstream gaming PCs as a result, though you'll always know what you're getting with G-Sync.

AMD FreeSync advantages and disadvantages

FreeSync’s main advantage is its openness and low cost of implementation.

The modest barrier to entry means you'll find AMD's adaptive sync tech in monitors as affordable as $130, so even gamers on a strict budget can enjoy FreeSync's perks. G-Sync can't compare. The cheapest G-Sync monitor currently on Newegg is the 27-inch Lenovo 65BEGCC1US, on sale for $330, and the vast majority of G-Sync displays cost well north of $500. Newegg's FreeSync listings include 154 different monitors under $500.

Affordability is one hell of an advantage for AMD. Normal people can actually buy these things.

Brad Chacos/IDG

Screenshot of Newegg’s AMD FreeSync monitor listings.

FreeSync’s openness has helped it spread far and wide. There are more than three times as many FreeSync monitors on Newegg as G-Sync panels. (Nvidia tells me there are more than 120 G-Sync displays and laptops available, though.)

FreeSync's openness does have some drawbacks. Shopping for a FreeSync monitor is a pain in the ass compared to buying a G-Sync display. FreeSync monitors support adaptive sync only within a specified refresh rate range: 48Hz to 75Hz on many low-cost models, for example. Every monitor supports a different range, and some are pretty restrictive. Fortunately you can peruse them all on AMD's website, in the "monitors" section of the table at the bottom.
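In code terms, the rule is as simple as it sounds. This tiny sketch (illustrative only, using the 48Hz-to-75Hz example range above) shows why frame rates outside the window get no benefit:

```python
# Illustrative only: adaptive sync engages solely while the frame rate
# sits inside the monitor's advertised FreeSync range.
FREESYNC_RANGE = (48, 75)  # Hz; a common range on low-cost panels

def adaptive_sync_active(fps):
    low, high = FREESYNC_RANGE
    return low <= fps <= high

print(adaptive_sync_active(60))  # True: buttery-smooth adaptive sync
print(adaptive_sync_active(35))  # False: back to tearing or V-Sync lag
```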

Brad Chacos/IDG

AMD’s FreeSync page lets you check the supported adaptive sync refresh-rate ranges and whether FreeSync works over DisplayPort or HDMI.

The loose standards mean you'll also need to keep a close eye on the monitor's features. For example, AMD introduced a feature called Low Framerate Compensation (LFC) that improves how FreeSync monitors behave below their minimum supported refresh rate (48Hz, in the prior example). When the frame rate drops below the FreeSync minimum, monitors with LFC display each frame multiple times, multiplying the effective refresh rate back into the FreeSync range. If your graphics card is pumping out 30 frames per second, LFC shows each frame twice and runs the display at 60Hz, keeping things smooth. It's great!
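Here's a minimal sketch of the idea behind LFC (illustrative only, assuming a hypothetical wide-range 48-144Hz panel, since the trick needs a maximum refresh rate well above double the minimum so that a valid multiple always exists):

```python
# Illustrative only: LFC repeats each frame enough times that the
# effective refresh rate lands back inside the FreeSync range.
FREESYNC_MIN, FREESYNC_MAX = 48, 144  # Hz; hypothetical wide-range panel

def lfc_refresh_hz(fps):
    """Return (repeats per frame, effective refresh rate) for a frame rate."""
    if fps >= FREESYNC_MIN:
        return 1, fps  # already in range; no compensation needed
    repeats = 2
    while fps * repeats < FREESYNC_MIN:
        repeats += 1
    assert fps * repeats <= FREESYNC_MAX  # why LFC needs a wide range
    return repeats, fps * repeats

print(lfc_refresh_hz(30))  # (2, 60): each frame shown twice -> 60Hz
print(lfc_refresh_hz(20))  # (3, 60): each frame shown three times
```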

It’s also not mandatory, and largely found in pricier panels. Without LFC, moving into and out of FreeSync range is jarring, as you’ll go from silky-smooth gameplay one second to stuttering or screen-tearing the next. Again: You need to do some research to get the best possible FreeSync experience.

Gordon Mah Ung/IDG

Some monitors let you use HDMI connections for FreeSync, rather than the usual DisplayPort requirement.

Another FreeSync advantage is connectivity. Because FreeSync monitors use standard display scalers, they tend to offer a full selection of ports, while G-Sync displays are largely limited to DisplayPort and HDMI. And while both FreeSync and G-Sync originally worked over DisplayPort alone, AMD has since introduced FreeSync over HDMI, which brings the technology to even more monitors. It adds to FreeSync's versatility, but it's another variable to consider before you buy. (AMD's website lists connection compatibility on the same chart as the aforementioned FreeSync ranges.)


Nvidia G-Sync advantages and disadvantages

G-Sync’s biggest advantage: A consistent, high-quality gaming experience.


Every G-Sync display needs to pass Nvidia’s strict certification process, which has rejected monitors in the past. Nvidia hasn’t publicly detailed the requirements, but representatives tell me that the company works directly with panel makers like AU Optronics to optimize refresh rates, flicker properties, response times, and visual quality; then works with the display makers (like Asus and Acer) to fine-tune the on-screen display and more. Every monitor is calibrated to the sRGB color gamut. 

Every G-Sync monitor supports the equivalent of AMD's Low Framerate Compensation, guaranteeing a smooth gaming experience. All G-Sync monitors also support "frequency dependent variable overdrive." Without diving too deep into the details, the technology adjusts the panel's pixel overdrive to match the current refresh rate, preventing ghosting on G-Sync displays. Ghosting severely affected early FreeSync panels, though the issue is less prevalent now.
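Nvidia hasn't published how its version works, but the general technique is easy to sketch: keep a per-panel tuning table and pick an overdrive strength matched to whatever refresh rate the variable refresh happens to hit at that instant. Everything below is hypothetical tuning data, purely for illustration:

```python
# Conceptual sketch, not Nvidia's implementation: overdrive pushes pixel
# transitions harder so they settle within one refresh. A strength tuned
# for 144Hz overshoots at 60Hz (inverse ghosting), so variable overdrive
# interpolates a strength for the current, constantly changing rate.
OVERDRIVE_TABLE = [(60, 1.0), (100, 1.4), (144, 1.8)]  # hypothetical tuning

def overdrive_gain(refresh_hz):
    """Linearly interpolate the tuned gain for the current refresh rate."""
    if refresh_hz <= OVERDRIVE_TABLE[0][0]:
        return OVERDRIVE_TABLE[0][1]
    for (hz0, g0), (hz1, g1) in zip(OVERDRIVE_TABLE, OVERDRIVE_TABLE[1:]):
        if refresh_hz <= hz1:
            return g0 + (refresh_hz - hz0) / (hz1 - hz0) * (g1 - g0)
    return OVERDRIVE_TABLE[-1][1]

print(overdrive_gain(52))   # low refresh -> gentler overdrive, less ghosting
print(overdrive_gain(120))  # high refresh -> stronger overdrive
```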

Nvidia

How G-Sync ULMB works.

While G-Sync started with a focus on adaptive sync, Nvidia says it has expanded beyond that to offer “the ultimate gaming monitors.”

Some G-Sync panels include added perks like refresh rate overclocking and Ultra Low Motion Blur (ULMB), which strobes the backlight to combat the notorious blurring of fast-moving text and images. It's a killer feature for e-sports games. Early ULMB monitors (rightfully) earned a reputation for being somewhat dim, but new 240Hz G-Sync monitors from Asus, Acer, and Alienware pulse their backlight at a searing 400 nits of brightness, eliminating the concern. You can't use G-Sync adaptive sync and ULMB simultaneously, however.

As far as negatives go, the limited port selection could be a bummer in some scenarios, but G-Sync’s biggest drawback is sheer price. G-Sync offers a premium gaming experience limited to premium-priced monitors. You definitely get what you pay for, but adaptive sync also pairs very well with budget graphics cards that struggle to push 60fps. Most GeForce GTX 1050 owners will never experience it. The high price of G-Sync monitors limits Nvidia’s tech to high-end users alone.

FreeSync vs. G-Sync: Graphics cards


The biggest bummer about AMD and Nvidia’s adaptive sync tech is that neither works with its rival’s graphics cards. GeForce graphics card owners need a G-Sync monitor, and Radeon graphics card owners can only use FreeSync displays.

To use G-Sync, you’ll need a GeForce GTX 600-series (or newer) graphics card. FreeSync requires a Radeon Rx 200-series (or newer) graphics card, though some individual models aren’t supported. You’ll need to do some research to make sure your Radeon is compatible if you’re using anything that precedes the Radeon RX 400-series era. Hit AMD’s FreeSync page yet again and click on the “Products” tab on the chart at the bottom to see a full list of supported graphics cards.


FreeSync vs. G-Sync: Laptops


Numerous G-Sync-equipped laptops are available, from virtually every gaming laptop manufacturer. Earlier models were restricted to 75Hz displays, but newer laptops can push that up to 120Hz.

FreeSync has mostly been neglected in the mobile space, where Nvidia graphics are much more prevalent, but the $1,500 Asus ROG Strix GL702ZC—the first-ever AMD Ryzen laptop—includes Radeon graphics and a FreeSync display. HP also rolled out a version of its Omen 17 gaming laptop with a Radeon RX 580 and FreeSync ($1,200 on Best Buy).

FreeSync 2 vs. G-Sync HDR: The future

Now that ultra-high-resolution displays are trickling into the mainstream and high dynamic range (HDR) PC monitors have finally arrived, AMD and Nvidia are rolling out a new era of gaming monitors with FreeSync 2 and G-Sync HDR. Both are optimized for HDR and will likely be limited to very pricey monitors in the near term. Hit the links below for a more extensive overview of each, but here are the key points.


FreeSync 2 bucks FreeSync's openness in favor of a more controlled ecosystem. To earn the FreeSync 2 badge, monitors need to include Low Framerate Compensation (LFC), and AMD certifies displays for low latency and for a dynamic range of color and brightness at least twice that of standard sRGB displays.

The FreeSync 2 API lets a game know your HDR monitor's native characteristics, so the software can tone map its output directly to your screen's capabilities, providing the best possible image quality while skipping the monitor's own lag-inducing tone mapping pass. Huzzah! FreeSync 2 also automatically switches to a "FreeSync mode" when you boot into a game that supports it, cranking the brightness levels and color gamut, so you don't need to leave visual settings at eye-searing levels while you're cruising around the Windows desktop.
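The actual FreeSync 2 plumbing lives in AMD's driver stack, but a generic tone mapping example shows why knowing the panel's real peak brightness matters. In this sketch (illustrative only; the 600-nit figure is made up), the game maps scene luminance straight to the display's native range in one pass, leaving the monitor nothing to re-map:

```python
# Generic illustration, not the FreeSync 2 API: a game that knows the
# panel's true peak brightness can tone map directly to it, instead of
# targeting a generic HDR range and letting the monitor re-map it
# (which adds processing lag).
DISPLAY_PEAK_NITS = 600.0  # hypothetical value reported by the display

def tone_map(scene_nits, peak=DISPLAY_PEAK_NITS):
    """Simple Reinhard-style curve scaled to the panel's peak brightness."""
    x = scene_nits / peak
    return peak * x / (1.0 + x)

for nits in (100, 600, 4000):
    print(f"{nits:5d} nits in scene -> {tone_map(nits):6.1f} nits on panel")
```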

Samsung

One of the first FreeSync 2 displays was this wild 49-inch ultrawide behemoth from Samsung.

The first FreeSync 2 monitors hit the streets in 2017: Samsung’s curved, 32-inch CHG70 ($700 on Amazon) and the massive 49-inch, curved CHG90 ($1,416 on Amazon). We’ve tested the CHG70 and it’s a great monitor, even though the (improving) state of HDR on Windows still needs some work. A handful of other displays have been released, and Microsoft’s Xbox One X console supports FreeSync 2 as well.

G-Sync HDR monitors have been announced in both 4K 144Hz and ultrawide 200Hz varieties, and, well, they basically look like the holy grail of PC displays. Every one shines at a whopping 1,000 nits of brightness with hundreds of backlight zones, augmented by sky-high refresh rates, coverage of 99 percent of the AdobeRGB color gamut, and quantum dot technology.

Nvidia

The Asus ROG Swift PG27UQ G-Sync monitor.

As with standard G-Sync displays, Nvidia hasn’t revealed the formal standards for G-Sync HDR displays—which require a different, next-gen version of Nvidia’s proprietary hardware module—but representatives say that every G-Sync HDR display will look at least as good as these initial models.

The first 4K G-Sync HDR displays, the Acer Predator X27 and Asus ROG Swift PG27UQ, are both available now after a lengthy delay. Each costs $2,000. The cutting edge doesn't come cheap, but as you can see in our Acer Predator X27 review, the speed and spectacular visual quality of these panels are second to none. G-Sync HDR truly feels like it's advancing the state of the art, and by a substantial amount.