It’s happening. After years of tedious technical groundwork, the gorgeous future of PC displays finally—finally—materialized at CES 2017. At this year’s gadget show, a wave of fresh standards emerged to bring luscious high dynamic range image support to computers.
So what’s the big deal? A quick glance at any current HDR TV—like the Samsung 9800—should make the technology’s benefits instantly apparent. High dynamic range greatly expands a display’s contrast and color range, resulting in vibrant, more accurate colors that “pop” against HDR’s deeper black levels. Indeed, to most people’s eyes, the visual impact of HDR is far more impressive than the sheer pixel mass of a 4K resolution—not that you’ll necessarily have to choose, as industry sources hint that many (but not all) PC-bound HDR monitors will sport 4K resolutions as well.
Before we dive into the hardware, let’s dig into the software and other technical groundwork that’s finally making HDR on PCs possible.
Laying the pipe
You could see this coming if you were paying attention.

Before HDR monitors could happen, graphics cards needed to be able to actually display HDR images.
HDR first appeared in the tightly integrated world of TVs long ago, but the technology’s arrival in the messier, wide-open world of PCs is a more recent development. Both Nvidia’s GeForce GTX 10-series and AMD’s Radeon RX 400-series graphics cards baked in HDR rendering capabilities when they launched last summer. Shadow Warrior 2 quickly followed in October as the first-ever PC game with HDR support (it also included Nvidia’s wonderful multi-res shading technology). Then, in early December, AMD’s Crimson ReLive software unlocked the rival Dolby Vision and HDR-10 standards in Radeon hardware.
Thus the stage was set for HDR’s arrival on PCs at CES 2017. But an important part of the debut is, well, yet more groundwork.

An HDMI Forum illustration of the differences between HDR types.
Crucially, the HDMI Forum revealed the HDMI 2.1 specification, which includes support for dynamic metadata. (The HDMI Forum calls it “dynamic HDR,” which means “dynamic high dynamic range,” which makes my brain hurt, so the more-accurate “dynamic metadata” it is.)
Whereas HDMI 2.0a sticks to a single HDR grade for an entire video, dynamic metadata allows displays to optimize individual scenes and even frames to the capabilities of your specific hardware—meaning you’ll always see the best brightness, contrast, color gamut, et cetera rather than a one-size-fits-all HDR implementation. It makes gorgeous displays even more beautiful, in other words.
But there’s a catch: Right now, only the proprietary Dolby Vision supports dynamic metadata, though the industry is working toward integrating it into the open HDR-10 standard as well. The HDMI Forum plans to release the final HDMI 2.1 specification in the second quarter. (DisplayPort 1.4 already supports dynamic metadata.)
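If you want a concrete picture of the difference, here’s a rough sketch in C. The struct fields are simplified illustrations loosely inspired by how HDR-10 describes content (peak and frame-average light levels); they are not the actual SMPTE or HDMI data structures, which are considerably more involved.

```c
#include <stdint.h>

/* Static metadata (HDMI 2.0a / HDR-10): sent once for the entire video.
 * The display picks one tone-mapping strategy for everything, based on
 * worst-case values covering the whole stream. */
typedef struct {
    uint16_t mastering_peak_nits;  /* peak luminance of the mastering display */
    uint16_t max_content_nits;     /* brightest pixel anywhere in the video   */
    uint16_t max_frame_avg_nits;   /* brightest frame-average in the video    */
} static_hdr_metadata;

/* Dynamic metadata (HDMI 2.1 / Dolby Vision): refreshed per scene or even
 * per frame, so a dim, moody scene isn't tone-mapped with the same curve
 * as the explosion two minutes later. */
typedef struct {
    uint32_t frame_index;          /* which frame this record applies to    */
    uint16_t scene_peak_nits;      /* peak luminance of this scene or frame */
    uint16_t scene_avg_nits;       /* average luminance of this scene/frame */
} dynamic_hdr_metadata;
```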

Traditional HDR tone mapping versus how AMD’s FreeSync 2 handles tone mapping.
AMD’s FreeSync 2 likewise aims to make vivid HDR displays even more luscious (among many other nifty tricks). A cousin to AMD’s successful FreeSync stutter-killing technology for monitors, FreeSync 2 informs your graphics card about your display’s capabilities, letting your PC pump out a single round of HDR tone mapping, rather than separate passes for the game and display. That optimizes the image and reduces lag.
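Here’s a minimal C sketch of that idea. Everything below is hypothetical (AMD hasn’t published low-level FreeSync 2 API details), but it shows why telling the game the panel’s real range up front lets it collapse two tone-mapping passes into one.

```c
#include <stdio.h>

/* Hypothetical record of a panel's real range, as FreeSync 2 would
 * report it to the graphics card. */
typedef struct {
    float min_nits;   /* deepest black the panel can produce */
    float max_nits;   /* peak brightness the panel can reach */
} display_caps;

/* Traditional path: the game tone-maps to a generic 10,000-nit HDR
 * target, then the monitor re-maps that to its actual panel. Two
 * lossy passes, and the second one adds display lag. */
static float tonemap_generic(float scene_nits) {
    float x = scene_nits / 10000.0f;
    return 10000.0f * (x / (1.0f + x));      /* simple Reinhard-style curve */
}
static float monitor_remap(float encoded_nits, display_caps caps) {
    return caps.max_nits * (encoded_nits / 10000.0f);  /* naive rescale */
}

/* FreeSync 2 idea: the game knows the panel's range and tone-maps
 * straight to it in a single pass. */
static float tonemap_direct(float scene_nits, display_caps caps) {
    float x = scene_nits / caps.max_nits;
    return caps.max_nits * (x / (1.0f + x));
}

int main(void) {
    display_caps panel = { 0.05f, 1000.0f };  /* a 1,000-nit HDR monitor */
    float scene = 4000.0f;                    /* a very bright highlight */

    printf("two-pass: %.1f nits\n", monitor_remap(tonemap_generic(scene), panel));
    printf("one-pass: %.1f nits\n", tonemap_direct(scene, panel));
    return 0;
}
```

In this toy example the single-pass path preserves far more of the highlight (800 nits versus roughly 286), which is the gist of AMD’s pitch, along with the latency saved by skipping the monitor’s own processing pass.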

FreeSync 2 also imposes mandatory dynamic color and brightness ranges to ensure you’re getting appropriate bang for your buck (though, as with any display technology, how good images look onscreen depends in part on how well the content is tuned for HDR).
AMD’s working with multiple display vendors on multiple projects, and hopes to launch FreeSync 2 in the first half of the year.
HDR PC monitors
With all the groundwork set, the launch of actual HDR-compatible PC monitors feels almost anticlimactic—though no less welcome.

LG’s 32UD99 HDR monitor.
LG struck first, teasing the 32-inch, 4K-resolution LG 32UD99 even before the holidays rolled around. Beyond all those pixels and the HDR support, the monitor also packs a USB-C connection capable of simultaneously delivering 4K images, charging a connected laptop, and transferring data over a single cable.
Nvidia didn’t leave HDR to AMD’s FreeSync 2 alone. The company announced two debut G-Sync HDR displays at CES 2017, and they’re basically the holy grail of PC monitors, with 4K resolutions, 144Hz refresh rates, Quantum Dot technology, 1,000 nits of brightness, and (of course) Nvidia’s glorious G-Sync game-smoothing technology.

This Asus model is one of the first gaming-focused G-Sync HDR monitors.
The Asus ROG Swift PG27UQ and Acer Predator XB272-HDR are G-Sync HDR’s standard-bearers, due to arrive sometime in the second quarter for an undisclosed (but no doubt lofty) price. I saw the Asus model at Nvidia’s suite and it was absolutely stunning; one image of an explosion made me instinctively flinch and cover my face to protect against heat that I could feel but wasn’t actually there. This screen is vivid.

Dell’s S2718D HDR monitor.
Dell’s 27-inch S2718D Ultrathin Monitor, meanwhile, takes its inspiration from the Dell XPS 13 laptop’s superb display: It’s got a barely-there InfinityEdge bezel and a profile so thin it borderline boggles the mind. The 2560×1440-resolution display checks pretty much every box an image geek could ask for, with 178-degree viewing angles, 400 nits of brightness, 99-percent sRGB color gamut coverage, a 1000:1 contrast ratio, and a USB-C connection of its own.
While Dell’s touting the S2718D as an HDR monitor, Tom’s Hardware reports that “the standard [Dell] uses is different from what TV makers are using,” a detail corroborated by Engadget. Whatever that means in practice, the specs point to a vibrant, highly accurate display. Dell says the S2718D will go on sale on Dell.com on March 23 for a cool $700.
That’s not cheap, and don’t expect any of the 4K HDR displays to cost any less. Sure, HDR has a better chance of being the future of displays than the 3DTV fad could ever hope for, but it’s as bleeding-edge as bleeding-edge gets on PCs. Being an early adopter is never for the faint of heart—or faint of wallet.
Now we need HDR games
Here’s hoping that PC games and videos jump on the HDR bandwagon to make the investment worthwhile sooner rather than later. With the PlayStation 4 Pro and Xbox One S recently embracing HDR as well, there’s a good chance we won’t have long to wait.

The Acer G-Sync HDR monitor with Mass Effect: Andromeda.
That was reinforced by a chat I had with Mass Effect: Andromeda producer Fabrice Condominas in Nvidia’s CES suite.
Andromeda will offer day-one support for HDR (and Nvidia’s amazing Ansel screenshot tool) when it launches on March 21. Condominas said integrating HDR into BioWare’s game was a fast process because EA’s Frostbite engine already supported the technology. Major engines like Unreal Engine 4 and Unity also support HDR, Condominas said, and console-centric developers are all over it thanks to those new consoles. He expects PC games with high dynamic range to start appearing quickly now that we finally—finally—have HDR PC monitors on the way.
Editor’s note: This article was updated to include G-Sync HDR monitors and Condominas’s comments.