Televisions provide more HDR bang for your buck. They also tend to have a better contrast ratio and superior color. On the other hand, they’re too large to use without seriously adjusting your gaming setup. A television’s pixel density is also rather low compared to a monitor, which can be a surprising disappointment if you sit too close.
Mini-LED vs. OLED
OLED PC gaming monitors don’t exist, but there are plenty of OLED televisions. Gamers looking for great HDR on the PC are likely to pit OLED televisions against Mini-LED displays. Which is better? It’s a complicated question, but the short answer is Mini-LED.
That’s probably not the answer you expected. OLED is beloved for its ability to achieve an effectively perfect black level. In other words, it can display a luminance level of zero. This does wonders for contrast, shadow detail, and depth.
But OLED has a problem. The best OLED panels can just barely hit a peak brightness of 1,000 nits for a few seconds, and most can achieve 600 to 800 nits. Mini-LED HDR televisions can sometimes exceed 2,000 nits. The Asus ROG Swift PG32UQX Mini-LED monitor can hit 1,400 nits.
This is important to HDR performance in games because, as I touched on earlier, most games are bright and flashy. They tend to avoid extremely dark scenes because darkness makes a game hard to play. That erodes OLED’s greatest strength and plays into Mini-LED’s best trait: extremely high peak brightness.
Don’t mistake this as a hard-and-fast rule. OLED looks fantastic in many situations. It’s especially well suited for movies and television, where dark scenes are more common. You might also like OLED better if you enjoy horror or simulation games, as these genres are more likely to rely on shadow detail and deep, inky black levels. In general, though, Mini-LED is more suited to the way modern PC games are presented. It’s brighter and more vibrant. This will lead to more impressive visuals in HDR games.
HDR10 is the only standard that matters
HDR10 was introduced in 2015 by the Consumer Technology Association. It’s an open standard, so it’s unsurprisingly the most widely supported. It’s so common in the world of PC hardware that game studios and monitor manufacturers rarely bother to market HDR10 support. It’s implied.
Other HDR formats are rarely relevant to PC gaming. Dolby Vision is the exception that proves the rule. Games that release on both console and PC will sometimes carry over Dolby Vision support, and a few laptops are sold with displays that are Dolby Vision compatible. These are small in number, though, and many laptops that support the standard aren’t the best choice for PC gaming.
Your graphics card is probably up to the task
AMD, Nvidia, and Intel have embraced HDR in their graphics card architectures for years now. We list the minimum video card requirements in our full PC HDR guide, but odds are your rig is HDR ready. Any graphics card powerful enough to push even 30 frames per second to a 4K display will almost certainly support HDR output.
There’s no significant difference in HDR quality between AMD, Nvidia, and Intel. They all support the HDR10 standard and will deliver comparable visuals. Check out our guide to the best graphics cards if you’re looking to level up your GPU firepower.
And so is your DisplayPort cable
HDMI and DisplayPort have both supported HDR for years. You might not be shocked to hear that graphics hardware embraced HDR just as HDR-capable versions of these connection standards arrived: HDMI 2.0a added HDR support in 2015, and DisplayPort 1.4 followed in 2016.