We break down what you need for deliciously eye-searing visuals.
By Matt Smith
High Dynamic Range, or HDR, is often casually brought up by game studios touting the cutting-edge visuals of a new title, as if it’s a common feature PC gamers can easily enjoy. That’s not the reality. Awesome HDR gaming is still difficult to achieve on a Windows PC.
Yet it’s a goal worth pursuing. At its best, HDR is a rare example of a truly game-changing technology. HDR can smack you straight across the face with the single most noticeable gain in gaming visuals. This article will explain what you need to know about HDR gaming on PCs, from technology to cable and GPU concerns to settings tweaks.
HDR quality is reliant on a display’s peak brightness, especially when gaming. This is the single most important concept to understand. You should look for a monitor or television with a maximum peak brightness of 1,000 nits or better.
Think back to your favorite spot to watch a sunset in Skyrim or Assassin’s Creed. It no doubt looked gorgeous. But, without HDR, it’s limited. It will never be so bright you feel a need to shield your eyes, and you’ll lose out on details near the horizon that are overwhelmed by glare.
HDR can fix that, but only if you have a display bright enough to make a difference.
Brightness isn’t the only benefit of HDR, of course, but games benefit from it more than other content, like movies. There’s a good reason for this: games must be playable.
Aside from a few genres, like horror or simulation, games don’t spend much time in the dark. No one likes to be killed by an enemy they had no hope of seeing. Competitive gamers go so far as to artificially increase the brightness of games to better see opponents. Flashy, vibrant games are the trend, and this leans into the strengths of super-bright HDR displays.
How do you judge a monitor’s brightness? Look for its VESA DisplayHDR certification. VESA’s certification program isn’t perfect but provides a solid baseline for visual quality. The VESA DisplayHDR 1000 standard requires a peak brightness of 1,000 nits or better, so that’s the minimum you should look for. Most HDR monitors you’ll find—especially more affordable ones—are DisplayHDR 400 or 600 certified. DisplayHDR 1000 monitors start around $800.
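The certification tiers map directly to minimum peak brightness, so checking a monitor against the article’s 1,000-nit recommendation is a simple lookup. Here’s a minimal sketch of that logic in Python; the tier values reflect VESA’s published DisplayHDR requirements, while the function name and threshold default are just illustrative choices.

```python
# Minimum peak brightness (in nits) each VESA DisplayHDR tier requires.
DISPLAYHDR_PEAK_NITS = {
    "DisplayHDR 400": 400,
    "DisplayHDR 500": 500,
    "DisplayHDR 600": 600,
    "DisplayHDR 1000": 1000,
    "DisplayHDR 1400": 1400,
}

def meets_recommendation(tier: str, minimum_nits: int = 1000) -> bool:
    """Return True if a certification tier meets the 1,000-nit
    recommendation for great HDR gaming; unknown tiers fail."""
    return DISPLAYHDR_PEAK_NITS.get(tier, 0) >= minimum_nits
```

By this yardstick, only DisplayHDR 1000 and 1400 monitors pass, which matches how few qualifying displays are on the market.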
Televisions don’t usually participate in VESA DisplayHDR certification, however. You’ll need to read reviews of a television you’re considering to know if it can hit its advertised brightness.
Mini-LED is a big deal
Brightness is key, but it has limitations unless it’s supported by a display that can dim some areas of the screen while keeping others bright. This is only possible with a type of display most commonly called Mini-LED (also called full array local dimming). Mini-LED is ideal for HDR.
Most modern monitors use an edge-lit LED backlight, which means the LEDs are along the edges of the screen. That’s why it’s common to see hazy, distracting bright spots along a monitor’s edge. It’s difficult to engineer a backlight that can light the entire display without increasing brightness at the edges.
You can find extremely bright HDR monitors that use edge-lit LED technology. Samsung’s radical 49-inch Odyssey G9 is one example. The Odyssey G9 has local dimming that will turn off strips of edge-lit LEDs in an attempt to enhance contrast, but you’ll inevitably see hazy halos around smaller objects. This problem is known as blooming.
Mini-LED mostly solves this by placing LEDs directly behind the LCD panel. Blooming can still occur but is less common and harder to notice.
OLED never has a problem with blooming because each pixel creates its own light. This is academic unless you plan to buy a television, though, because OLED PC gaming monitors don’t exist (yet).
You don’t need a TV for great HDR, but TVs are more affordable
The lack of OLED PC gaming monitors has driven gamers to consider televisions as an alternative. HDR is also a great reason to consider a television. Televisions deliver better HDR performance than a computer monitor at any given price point.
Televisions provide more HDR bang for your buck. They also tend to have a better contrast ratio and superior color. On the other hand, they’re too large to use without seriously adjusting your gaming setup. A television’s pixel density is also rather low compared to a monitor, which can be a surprising disappointment if you sit too close.
Mini-LED vs. OLED
OLED PC gaming monitors don’t exist, but there are plenty of OLED televisions. Gamers looking for great HDR on the PC are likely to pit OLED and Mini-LED televisions against each other. Which is better? It’s a complicated question, but the simple answer is Mini-LED.
That’s probably not the answer you expected. OLED is beloved for its ability to achieve an effectively perfect black level. In other words, it can display a luminance level of zero. This does wonders for contrast, shadow detail, and depth.
But OLED has a problem. The best OLED panels can just barely hit a peak brightness of 1,000 nits for a few seconds, and most can achieve 600 to 800 nits. Mini-LED HDR televisions can sometimes exceed 2,000 nits. The Asus ROG Swift PG32UQX Mini-LED monitor can hit 1,400 nits.
This is important to HDR performance in games because, as I touched on earlier, most games are bright and flashy. They tend to avoid extremely dark scenes because darkness makes a game hard to play. That starts to rub away OLED’s best perk and leans into Mini-LED’s best trait: an extremely high peak brightness.
Don’t mistake this as a hard-and-fast rule. OLED looks fantastic in many situations. It’s especially well suited for movies and television, where dark scenes are more common. You might also like OLED better if you enjoy horror or simulation games, as these genres are more likely to rely on shadow detail and deep, inky black levels. In general, though, Mini-LED is more suited to the way modern PC games are presented. It’s brighter and more vibrant. This will lead to more impressive visuals in HDR games.
HDR10 is the only standard that matters
HDR10 was introduced in 2015 by the Consumer Technology Association. It’s an open standard, so it’s unsurprisingly the most widely supported. It’s so common in the world of PC hardware that game studios and monitor manufacturers rarely bother to market HDR10 support. It’s implied.
Other HDR formats are rarely relevant to PC gaming. Dolby Vision is the exception that proves the rule. Games that release on both console and PC will sometimes carry over Dolby Vision support, and a few laptops are sold with displays that are Dolby Vision compatible. These are small in number, though, and many laptops that support the standard aren’t the best choice for PC gaming.
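Under the hood, HDR10 encodes brightness with the SMPTE ST 2084 “perceptual quantizer” (PQ) transfer function, which maps signal values to absolute luminance in nits. A short sketch of that EOTF, using the constants from the ST 2084 specification, shows why peak brightness matters so much to HDR10:

```python
def pq_to_nits(n: float) -> float:
    """Convert a normalized PQ signal value (0.0 to 1.0) to absolute
    luminance in nits, per the SMPTE ST 2084 EOTF used by HDR10."""
    m1 = 2610 / 16384        # ~0.1593
    m2 = 2523 / 4096 * 128   # ~78.84
    c1 = 3424 / 4096         # ~0.8359
    c2 = 2413 / 4096 * 32    # ~18.85
    c3 = 2392 / 4096 * 32    # ~18.69
    p = n ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)
```

The top of the signal range maps to 10,000 nits, far beyond what any current display can reproduce. That headroom is why the same HDR10 content can look dramatically different on a 600-nit monitor versus a 1,000-nit one: the display has to tone-map everything above its own peak.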
Your graphics card is probably up to the task
AMD, Nvidia, and Intel have embraced HDR in their graphics card architectures for years now. We list the minimum video card requirements in our full PC HDR guide, but odds are your rig is HDR ready. Any graphics hardware that can push even 30 frames per second to a 4K HDR display will have HDR support.
There’s no significant difference in HDR quality between AMD, Nvidia, and Intel. They all support the HDR10 standard and will deliver comparable visuals. Check out our guide to the best graphics cards if you’re looking to level up your GPU firepower.
And so is your HDMI or DisplayPort cable
HDMI and DisplayPort support HDR, and have for years. You might not be shocked to hear that graphics solutions embraced HDR just as HDR-capable versions of HDMI and DisplayPort became standard. This happened through 2015 and 2016.
Any PC with an HDR-capable graphics solution will have an HDR-capable video output, and any HDR-capable display will have an appropriate video input. The type of HDMI or DisplayPort cable you use isn’t important, either.
Yes, it’s possible to find a cable that doesn’t work if you end up with a cheap cable that’s out of spec or use a very, very old cable. However, the vast majority of cables work without issue. Our guide to the best HDMI cables can help you out if you need a new one.
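The reason an ancient or out-of-spec cable can struggle is raw bandwidth: HDR10 bumps color from 8 to 10 bits per channel, raising the data rate by about 25 percent. Here’s a hypothetical back-of-envelope sketch; it ignores blanking intervals and link-encoding overhead, so treat the results as a rough lower bound rather than a full HDMI/DisplayPort timing calculation.

```python
def required_gbps(width: int, height: int, refresh_hz: int,
                  bits_per_pixel: int) -> float:
    """Rough uncompressed video data rate in Gbit/s, ignoring
    blanking intervals and link-encoding overhead."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

# 4K at 60 Hz: standard 8-bit color uses 24 bits per pixel,
# while HDR10's 10-bit color uses 30 bits per pixel.
sdr = required_gbps(3840, 2160, 60, 24)  # ~11.9 Gbit/s
hdr = required_gbps(3840, 2160, 60, 30)  # ~14.9 Gbit/s
```

Modern certified cables carry this comfortably, which is why the spec of the cable matters far more than the brand.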
HDR support can be inconsistent
One major problem that can trip up PC gamers is inconsistent implementation between games. You’ll find that some games let you turn HDR on in-game without first turning HDR on in Windows. However, other games may need you to turn on HDR in Windows’ display settings.
Many HDR monitors and displays will automatically detect an HDR signal and switch to an HDR mode, but some don’t, or don’t do it reliably. When in doubt, follow this order. First turn on HDR support in Windows. Then turn it on in the game you’d like to play. Finally, use your monitor’s on-screen menu to manually select an HDR mode.
HDR will disable some monitor settings
Activating a monitor’s HDR mode will do something you may find strange: disable the monitor’s brightness customization. HDR is meant to provide a wider range of potential luminance. That range would be reduced if you cranked the monitor up to its maximum brightness and left it there, so brightness customization is usually disabled.
Instead of controlling brightness through monitor settings, it’s handled through Windows or the game you’re playing. Windows has a brightness adjustment setting available in Advanced HDR settings, which you’ll find in the Display section of the Settings menu. Most HDR games also have brightness calibration available.
Brightness isn’t the only setting that you may find disabled. Color customization, gamma, and other settings are often disabled as well.
[Image: LG CX-series 4K UHD OLED TV (48-inch class, model OLED48CXPUB), an HDR TV that supports Nvidia’s G-Sync]
This is another area televisions have an edge over monitors. HDR monitors tend to offer a small handful of poorly explained HDR options. Televisions typically allow a broad range of user customization. TV enthusiasts would revolt if they didn’t!
The exact details of what is and is not available changes from display to display, often even among the same product line. You’ll have to dive into a display’s user manual if you want to know the exact settings that can be used with HDR enabled (and, frankly, even the manual may not provide full details).
PC gaming on HDR is awesome, but full of tough choices
Great HDR is simple in theory. All you need is a capable, color-accurate Mini-LED (or OLED) display that can reach a peak brightness of 1,000 nits or better. Details like your video card or your display connection aren’t likely to cause problems.
HDR support in Windows and games can be annoyingly inconsistent, but it works well enough to make a difference. There’s no shortage of HDR games, and dozens more are released every month.
Here’s the real problem: few PC gaming monitors can deliver the performance needed to enjoy impressive HDR. There are fewer than three dozen displays certified for DisplayHDR 1000 or better, and not all of those have a good local dimming implementation. Others are discontinued or designed for professionals, not PC gamers.
The best displays, like the Asus ROG Swift PG32UQX and Acer Predator X35, are expensive and difficult to find. HDR televisions are more affordable and easy to find in stock but won’t work for everyone.
It’s all a bit grim, then, but there’s light at the end of the tunnel. HDR makes a difference, a big difference, and it’s simple to start using once you have the right display. So, start saving: HDR is an upgrade worth the expense.
Note: When you purchase something after clicking links in our articles, we may earn a small commission. Read our affiliate link policy for more details.