
G-Sync vs. FreeSync: Adaptive sync gaming monitors explained

Two rival technologies promise to rid games of stuttering and tearing.


Nvidia G-Sync advantages and disadvantages

G-Sync’s biggest advantage: A consistent, high-quality gaming experience.


Every G-Sync display needs to pass Nvidia’s strict certification process, which has rejected monitors in the past. Nvidia hasn’t publicly detailed the requirements, but representatives tell me that the company works directly with panel makers like AU Optronics to optimize refresh rates, flicker properties, response times, and visual quality, then works with display makers like Asus and Acer to fine-tune the on-screen display and more. Every monitor is calibrated to the sRGB color gamut.

Every G-Sync monitor supports the equivalent of AMD’s Low Framerate Compensation, keeping games smooth even when frame rates fall below the display’s minimum refresh rate. All G-Sync monitors also support “frequency dependent variable overdrive.” Without diving into too much detail, the technology prevents ghosting on G-Sync displays, an issue that severely affected early FreeSync panels (though it’s less prevalent now).
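To make the concept concrete, here’s a minimal sketch of the frame-multiplication idea behind Low Framerate Compensation, assuming a hypothetical 48Hz–144Hz panel. The numbers and code are illustrative only, not Nvidia’s or AMD’s actual implementation:

```python
# Illustrative sketch of Low Framerate Compensation (LFC): when the
# game's frame rate falls below the panel's minimum refresh rate, the
# driver shows each frame multiple times so the scan-out rate stays
# inside the panel's supported variable refresh range.

PANEL_MIN_HZ = 48   # assumed variable refresh floor
PANEL_MAX_HZ = 144  # assumed variable refresh ceiling

def effective_refresh(fps: float) -> tuple[int, float]:
    """Return (frame multiplier, refresh rate the panel is driven at)."""
    multiplier = 1
    # Repeat each frame until the scan-out rate re-enters the panel's range.
    while fps * multiplier < PANEL_MIN_HZ:
        multiplier += 1
    return multiplier, min(fps * multiplier, PANEL_MAX_HZ)

for fps in (120, 60, 40, 25):
    mult, hz = effective_refresh(fps)
    print(f"{fps} fps -> each frame shown {mult}x, panel runs at {hz:.0f}Hz")
```

Without that trick, a 25fps slideshow would fall out of the panel’s variable refresh window and reintroduce stutter or tearing; with it, the panel simply runs at 50Hz and shows each frame twice.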

How G-Sync ULMB works. (Image: Nvidia)

While G-Sync started with a focus on adaptive sync, Nvidia says it has expanded beyond that to offer “the ultimate gaming monitors.”

Some G-Sync panels include added perks like refresh rate overclocking and Ultra Low Motion Blur (ULMB), which strobes the backlight to combat the motion blur that smears fast-moving text and other elements on conventional displays. It’s a killer feature for e-sports games. Early ULMB monitors (rightfully) earned a reputation for being somewhat dim, but new 240Hz G-Sync monitors from Asus, Acer, and Alienware pulse their backlight at a searing 400 nits of brightness, eliminating the concern. You can’t use G-Sync adaptive sync and ULMB simultaneously, however.
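The dimness tradeoff is simple arithmetic: the backlight is lit for only a slice of each refresh, so perceived brightness is roughly the pulse brightness multiplied by the duty cycle. A back-of-the-envelope sketch follows; only the 400-nit figure comes from the article, while the strobe length is an illustrative guess:

```python
# Rough math for strobed backlights like ULMB (assumed numbers).
# Perceived brightness ~= pulse brightness * fraction of time lit.

peak_nits = 400                   # pulse brightness of the newer 240Hz panels
refresh_hz = 240
frame_ms = 1000 / refresh_hz      # ~4.17 ms per refresh
pulse_ms = 1.0                    # hypothetical strobe duration
duty_cycle = pulse_ms / frame_ms  # fraction of each frame the backlight is on

print(f"Perceived brightness: ~{peak_nits * duty_cycle:.0f} nits")
# ~96 nits here, which is why older, dimmer strobing panels felt dark.
```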

As far as negatives go, the limited port selection could be a bummer in some scenarios, but G-Sync’s biggest drawback is sheer price. G-Sync offers a premium gaming experience limited to premium-priced monitors. You definitely get what you pay for, but adaptive sync also pairs very well with budget graphics cards that struggle to push 60fps, and the high price of G-Sync monitors means most GeForce GTX 1050 owners will never experience it. Nvidia’s tech remains limited to high-end users.

FreeSync vs. G-Sync: Graphics cards


The biggest bummer about AMD and Nvidia’s adaptive sync tech is that neither works with its rival’s graphics cards. GeForce graphics card owners need a G-Sync monitor, and Radeon graphics card owners can only use FreeSync displays.

To use G-Sync, you’ll need a GeForce GTX 600-series (or newer) graphics card. FreeSync requires a Radeon R7/R9 200-series (or newer) graphics card, though some individual models aren’t supported. You’ll need to do some research to make sure your Radeon is compatible if you’re using anything that predates the Radeon RX 400-series era. Hit AMD’s FreeSync page yet again and click the “Products” tab on the chart at the bottom to see a full list of supported graphics cards.


FreeSync vs. G-Sync: Laptops


Numerous G-Sync-equipped laptops are available from virtually every gaming laptop manufacturer. Earlier models were restricted to 75Hz displays, but newer laptops can push that up to 120Hz.

FreeSync has mostly been neglected in the mobile space, where Nvidia graphics are much more prevalent, but the $1,500 Asus ROG Strix GL702ZC, the first-ever AMD Ryzen laptop, includes Radeon graphics and a FreeSync display. HP also rolled out a version of its Omen 17 gaming laptop with a Radeon RX 580 and FreeSync ($1,200 at Best Buy).

FreeSync 2 vs. G-Sync HDR: The future

Now that ultra-high-resolution displays are trickling into the mainstream and high dynamic-range PC monitors have finally arrived, AMD and Nvidia are rolling out a new era of gaming monitors with FreeSync 2 and G-Sync HDR. Both are optimized for HDR and will likely be limited to very pricey monitors in the near term. Hit the links below for a more extensive overview of each, but here are the key points.


FreeSync 2 bucks FreeSync’s openness in favor of a more controlled ecosystem. To earn the FreeSync 2 badge, monitors need to include Low Framerate Compensation (LFC), and AMD certifies displays for low latency and a dynamic color and brightness range at least twice as vibrant as standard sRGB displays.

The FreeSync 2 API tells a game your HDR monitor’s native characteristics, letting the software tone map directly to your screen’s capabilities. That delivers the best possible image quality while eliminating the input lag added when the monitor performs its own tone-mapping pass. Huzzah! FreeSync 2 also automatically switches to a “FreeSync mode” when you boot into a game that supports it, cranking up the brightness levels and color gamut so you don’t need to leave visual settings at eye-searing levels while you’re cruising around the Windows desktop.
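Here’s a conceptual sketch of that flow. To be clear, the real FreeSync 2 API lives in AMD’s native driver libraries and none of the names below are AMD’s; this only illustrates the “query the panel once, tone map straight to its native range” idea:

```python
# Conceptual sketch only -- every name here is hypothetical, not AMD's
# actual FreeSync 2 API. The idea: the game queries the display's native
# HDR characteristics once, then tone maps its output directly to that
# range, skipping the monitor's own (slower) internal tone-mapping pass.

from dataclasses import dataclass

@dataclass
class DisplayCaps:
    """Values a FreeSync 2 display might report to the game."""
    min_nits: float = 0.05   # assumed black level
    max_nits: float = 600.0  # assumed peak brightness

def tone_map(scene_nits: float, caps: DisplayCaps) -> float:
    """Compress scene luminance into the panel's native range."""
    # Simple Reinhard-style rolloff toward the panel's peak brightness.
    compressed = scene_nits / (1.0 + scene_nits / caps.max_nits)
    return max(caps.min_nits, min(compressed, caps.max_nits))

caps = DisplayCaps()  # queried once at startup in a real implementation
for nits in (100, 1_000, 10_000):  # midtone, bright highlight, sun glint
    print(f"{nits} nit scene value -> {tone_map(nits, caps):.1f} nits on panel")
```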

One of the first FreeSync 2 displays was this wild 49-inch ultrawide behemoth from Samsung. (Image: Samsung)

The first FreeSync 2 monitors hit the streets in 2017: Samsung’s curved, 32-inch CHG70 ($700 on Amazon) and the massive 49-inch, curved CHG90 ($1,416 on Amazon). We’ve tested the CHG70 and it’s absolutely glorious, even though the (improving) state of HDR on Windows still needs some work. A handful of other displays have been released, and Microsoft’s Xbox One X console supports FreeSync 2 as well.

G-Sync HDR monitors have been announced in both 4K 144Hz and ultrawide 200Hz varieties, and, well, they basically look like the holy grail of PC displays. Every one shines at a whopping 1,000 nits of brightness, with hundreds of backlight zones, sky-high refresh rates, 99 percent coverage of the AdobeRGB color gamut, and quantum dot technology.

The Asus ROG Swift PG27UQ G-Sync monitor. (Image: Nvidia)

As with standard G-Sync displays, Nvidia hasn’t revealed the formal standards for G-Sync HDR displays—which require a different, next-gen version of Nvidia’s proprietary hardware module—but representatives say that every G-Sync HDR display will look at least as good as these initial models.

The first 4K G-Sync HDR displays—the Acer Predator X27 and Asus ROG Swift PG27UQ—are both available now after a lengthy delay. Each costs $2,000. The cutting edge doesn’t come cheap.
