If you’re in the market for a new gaming monitor, there’s a good chance most of the listings you look at will feature three recognizable letters: HDR. High dynamic range is a relatively new display technology that has gone through numerous iterations on TVs. Gaming monitors are far behind what living room displays are capable of, but that doesn’t mean you shouldn’t consider HDR when shopping for a monitor.
We’ve put together a guide on everything you need to know about HDR on PC monitors, including exactly what HDR is, all the different specifications, and what to look out for when buying a display, especially a gaming monitor. While we will touch on HDR in TVs, this won’t be the primary focus of this guide.
What is HDR?
High dynamic range (HDR) refers to a display’s ability to render an image with deeper contrast, a wider color gamut, and a better representation of brightness than a standard dynamic range (SDR) display. This means that with content that supports HDR and a correctly calibrated display, you will often be able to spot more details in both brighter and darker scenes while also enjoying an image that can look more saturated or color-accurate, depending on the implementation.
“Often” is the key word here, because a good HDR picture relies on two factors: the display and the content. A monitor can carry an HDR specification yet display HDR content poorly, while some HDR content is implemented badly enough to look worse than SDR (Red Dead Redemption 2 at launch on consoles was a notorious example).
Since you can’t control the HDR implementation in a game or film, the best you can do is pick a display that is equipped to show HDR content in its best light. For that, look out for three important features: overall peak brightness, the quality of local dimming, and support for a wide color gamut. Peak brightness largely determines the contrast ratio your display can produce, highlighting bright areas of an image with an intensity SDR isn’t capable of. Similarly, local dimming, and how it’s implemented, greatly affects your display’s ability to keep dark areas suitably dark while a bright source is also on screen, ensuring a high contrast ratio and an image that isn’t washed out. A wide color gamut means your display can reproduce more colors than an SDR one, which matters for color accuracy in both gaming and productivity.
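To put rough numbers on that, contrast ratio is simply peak luminance divided by black level, which is why local dimming matters so much. Here’s a quick Python sketch using made-up but representative panel figures (not measurements from any specific monitor):

```python
# Rough illustration with representative (not measured) panel numbers:
# contrast ratio is peak luminance divided by black level, so a panel
# that can dim its backlight locally keeps blacks low even while
# highlights hit full brightness.

def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

# A typical edge-lit LCD raises its whole backlight for one highlight,
# lifting the black level; a FALD panel only raises the zone that needs it.
edge_lit = contrast_ratio(peak_nits=600, black_nits=0.30)
fald = contrast_ratio(peak_nits=1000, black_nits=0.05)

print(f"Edge-lit: {edge_lit:,.0f}:1")  # ~2,000:1
print(f"FALD:     {fald:,.0f}:1")      # ~20,000:1
```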
[Embedded video: Bloom, HDR, And HDR Displays – PC Graphics Settings Explained]
Types of HDR
When it comes to the different degrees of HDR, things differ drastically between TVs and monitors. HDR started on TVs with standard HDR (or HDR10), which was meant to specify support for a wide color gamut, some form of local dimming, and brightness levels up to 1,000 nits. Since then, the specification has been stretched and redefined by TV manufacturers, spawning terms such as Dolby Vision, HDR10+, HLG, and more. None of these matter much for monitors right now, so we won’t go into depth on them here.
Monitors started out on the same path, with many supporting HDR10. Unlike TVs, however, there was no regulation on what monitors had to support in order to slap this label on the box, which led to many products using the specification as a marketing term rather than an indication of properly implemented support. Today, any monitor with some semblance of good HDR support is instead certified under VESA’s DisplayHDR standards, which start at VESA DisplayHDR 400 and go all the way up to VESA DisplayHDR 1400.
The easiest way to tell the different specifications apart is the number, which corresponds to a display’s maximum advertised brightness. DisplayHDR 400, for example, has a maximum brightness of 400 nits, while DisplayHDR 1000 peaks at 1,000 nits. Recall that many introductory HDR TVs peaked at 1,000 nits, which is widely regarded as the sweet spot for HDR content (this differs between LCD and OLED panels, but that’s another discussion altogether). So while a monitor might carry an official DisplayHDR 400 certification, that doesn’t mean you’re getting the best HDR experience on the market.
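If you’re curious how those nit figures relate to the actual HDR signal, HDR10 encodes absolute luminance with the SMPTE ST 2084 “PQ” curve, which tops out at 10,000 nits. Below is a minimal Python sketch of that encoding, purely as an illustration (you don’t need to run anything like this to use HDR):

```python
# Illustration: the SMPTE ST 2084 "PQ" curve used by HDR10 encodes
# absolute luminance (in nits, up to 10,000) into a 10-bit code value.
# Constants come straight from the ST 2084 specification.

M1 = 2610 / 16384        # ~0.1593
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_encode(nits: float) -> int:
    """Map absolute luminance (0-10,000 nits) to a 10-bit PQ code value."""
    y = min(max(nits, 0.0), 10_000.0) / 10_000.0
    v = ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2
    return round(v * 1023)

# SDR-ish white, a DisplayHDR 400 peak, and the 1,000-nit sweet spot
for nits in (100, 400, 1000, 10_000):
    print(f"{nits:>6} nits -> PQ code {pq_encode(nits)}")
```

Notice how nonlinear the curve is: 100 nits already lands around code 520 of 1,023, leaving roughly the upper half of the signal range for highlights that SDR simply can’t describe.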
In addition to maximum brightness, each DisplayHDR tier has auxiliary requirements that must be met for certification. DisplayHDR 400, for example, can still be awarded to monitors with no local dimming and only an emulated wide color gamut, neither of which is a true representation of HDR. DisplayHDR 600 tightens things up, requiring some form of local dimming and a wide, 10-bit color gamut. Each step up ensures a better HDR experience, which is why you can find a large range of DisplayHDR 400 displays at varying prices but far fewer, more expensive monitors with higher certifications.
Perhaps the most important feature of the better HDR tiers is local dimming, which ensures that parts of the screen displaying bright objects don’t inadvertently wash out darker areas. At DisplayHDR 600, simple edge-lit local dimming is required, meaning the display is divided into strips whose brightness can be adjusted individually. This works in some cases, but it can cause a large portion of the image to light up even if just one pixel is meant to hit the display’s peak brightness, which can be distracting in HDR content.
From DisplayHDR 1000 and up, FALD (full-array local dimming) is required. This splits the display into hundreds, if not thousands, of zones whose brightness can be adjusted individually, producing a more contrast-accurate image. On LCD TVs, this is as close as you can get to an OLED display (where individual pixels can turn on and off, making it the most desirable panel type for HDR), but monitors are still far behind in this regard. Although DisplayHDR 1000 requires FALD, the certification doesn’t ensure there are enough zones to make it as effective as implementations on TVs. It’s still a cut above edge-lit local dimming, so aim for it if you can.
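To illustrate why zone count matters, here’s a toy Python simulation (a simplification, not how any real monitor’s firmware behaves, and the strip and zone counts are assumptions for the example): a single bright pixel on an otherwise black frame forces an edge-lit panel to raise a whole strip of backlight, while a FALD panel only raises one small zone.

```python
import numpy as np

# Toy model of local dimming: the backlight behind each zone is driven
# to the brightest pixel that zone has to show. One bright pixel on a
# dark frame forces the whole zone up, lifting the blacks around it.

HEIGHT, WIDTH = 2160, 3840
frame = np.zeros((HEIGHT, WIDTH))  # all-black 4K frame...
frame[1080, 1920] = 1.0            # ...except one peak-white pixel

def lit_fraction(zone_rows: int, zone_cols: int) -> float:
    """Fraction of the screen whose backlight gets raised."""
    zone_h, zone_w = HEIGHT // zone_rows, WIDTH // zone_cols
    lit = 0
    for r in range(zone_rows):
        for c in range(zone_cols):
            zone = frame[r*zone_h:(r+1)*zone_h, c*zone_w:(c+1)*zone_w]
            if zone.max() > 0:
                lit += 1
    return lit / (zone_rows * zone_cols)

# Assumed layouts: ~16 edge-lit strips vs. a 24x44 FALD grid (~1,000 zones).
print(f"Edge-lit (16 strips): {lit_fraction(1, 16):.1%} of screen lit")
print(f"FALD (~1,000 zones):  {lit_fraction(24, 44):.1%} of screen lit")
```

With these toy numbers, the edge-lit panel raises the backlight across over 6% of the screen for a single pixel, while the FALD grid raises less than 0.1%.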
You can check out the full breakdown of all DisplayHDR specifications below. Note that DisplayHDR True Black is restricted to OLED panels, which generally have lower overall brightness but make up for this with perfect contrast levels. There are also almost no gaming monitors using OLED currently, so it might be a while before this specification factors into your purchasing decision.
| Specification | Brightness (nits) | Color gamut | Local dimming |
| --- | --- | --- | --- |
| DisplayHDR 400 | 400 | 8-bit, sRGB | None/display-level |
| DisplayHDR 500 | 500 | 10-bit, 90% DCI-P3 | Edge-lit |
| DisplayHDR 600 | 600 | 10-bit, 90% DCI-P3 | Edge-lit |
| DisplayHDR 1000 | 1,000 | 10-bit, 90% DCI-P3 | Full-array |
| DisplayHDR 1400 | 1,400 | 10-bit, 95% DCI-P3 | Full-array |
| DisplayHDR 400 True Black | 400 | 10-bit, 95% DCI-P3 | Full-array/OLED |
| DisplayHDR 500 True Black | 500 | 10-bit, 95% DCI-P3 | Full-array/OLED |
What you need for HDR
So now that you understand what HDR is in its many different forms, you might be wondering what you need to get everything up and running. HDR needs to be supported across your whole hardware chain: your monitor, your graphics card, your display cable, and, lastly, the game you’re playing.
HDR-compatible displays
There are literally hundreds of displays that support HDR, but for the most part, you’ll want to stick to ones that carry a VESA DisplayHDR certification. If a monitor just says “HDR” or “HDR10,” that typically means it can accept an HDR signal but can’t do much with it. In those cases, just stick to SDR.
Below are some suggestions for monitors that fall under the DisplayHDR 400, 600, and 1000 ranges. As you might expect, the better the HDR implementation, the more expensive the monitor. You can find a comprehensive list of all HDR monitors and laptops that are VESA certified here.
Display cables
HDR is supported over most display cables you probably already have, but if you’re using a high refresh rate monitor, make sure you’re using the best output on your GPU. For 4K at 60Hz, you can get away with a traditional high-speed HDMI cable, while you’ll need an ultra high-speed cable for any refresh rates higher than that (HDMI 2.1, which is new in the monitor space). Plain HDMI 1.4 will also likely require you to switch from full RGB output to YCbCr 4:2:2 to display HDR at 4K without issues, which you can change in the Nvidia and AMD control panels (both the PS4 and Xbox One do this when displaying HDR content). If you’re using a DisplayPort cable, you’re good to go for high refresh rates and HDR.
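If you want to sanity-check a cable yourself, the math is straightforward: uncompressed video bandwidth is roughly pixels × refresh rate × bits per pixel, plus blanking overhead. The Python sketch below uses approximate effective link rates and a flat 20% blanking allowance, so treat its verdicts as ballpark rather than gospel:

```python
# Ballpark check of whether a link can carry a given mode uncompressed.
# Effective data rates (after encoding overhead) are approximate, and the
# flat 20% blanking allowance is a simplification of real video timings.

LINKS_GBPS = {
    "HDMI 1.4 (high speed)":   8.16,   # 10.2 Gbps raw, 8b/10b encoding
    "HDMI 2.0 (premium)":     14.40,   # 18.0 Gbps raw, 8b/10b encoding
    "HDMI 2.1 (ultra high)":  42.67,   # 48.0 Gbps raw, 16b/18b encoding
    "DisplayPort 1.4":        25.92,   # 32.4 Gbps raw, 8b/10b encoding
}

def mode_gbps(width, height, hz, bits_per_channel=10, chroma="4:4:4"):
    # 4:2:2 halves the two chroma channels; 4:4:4 sends all three in full.
    channels = 3.0 if chroma == "4:4:4" else 2.0
    bpp = bits_per_channel * channels
    return width * height * hz * bpp * 1.20 / 1e9  # +20% blanking

mode = mode_gbps(3840, 2160, 60, bits_per_channel=10, chroma="4:4:4")
print(f"4K60, 10-bit RGB/4:4:4 needs ~{mode:.1f} Gbps")
for link, capacity in LINKS_GBPS.items():
    verdict = "OK" if capacity >= mode else "needs chroma subsampling"
    print(f"  {link}: {verdict}")
```

Run the same mode with chroma="4:2:2" and the requirement drops to roughly 12 Gbps, which is exactly why consoles and older HDMI connections fall back to chroma subsampling for 4K HDR.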
HDR-compatible graphics cards
Thankfully, you don’t need the most up-to-date GPU to enjoy HDR (although more horsepower certainly helps). On Nvidia’s side, any GPU from the GTX 950 onward supports HDR, while all AMD cards newer than the R9 380 work too. Intel integrated GPUs from 7th-generation CPUs onward also support HDR, which is why many modern laptops support the feature as well. The latest cards from Nvidia and AMD also support HDMI 2.1, which can be useful if you’re forking out for the most premium monitors on the market today.
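If you’d rather not memorize that, the cutoffs boil down to a simple lookup. This sketch just restates the thresholds from this section; always double-check your exact model against the manufacturer’s spec sheet:

```python
# Minimal lookup restating this section's HDR-support cutoffs
# (GTX 950+, newer than R9 380, Intel 7th-gen integrated and later).
# Verify any specific model against the manufacturer's spec sheet.

HDR_CUTOFFS = {
    "nvidia": "GTX 950 or newer",
    "amd": "newer than the R9 380",
    "intel": "7th-generation integrated graphics or newer",
}

def hdr_requirement(vendor: str) -> str:
    return HDR_CUTOFFS.get(vendor.lower(), "unknown vendor")

print(hdr_requirement("Nvidia"))  # GTX 950 or newer
```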
Turning on HDR in Windows 10
The final step is to activate HDR in Windows 10, which is surprisingly well-hidden if you haven’t used it before. Head to Settings > System > Display, open the Windows HD Color settings, and toggle on “Play HDR games and apps.” Your screen will quickly flicker on and off as it switches from SDR to HDR. If your desktop then looks a bit weird (potentially washed out), don’t worry: this is a common complaint with HDR on Windows 10, and games that use HDR will still display correctly. Sadly, not all games can turn HDR on by themselves, so get in the habit of checking your settings before playing.
Speaking of games, how software handles HDR differs across the board. Some games need Windows 10’s HDR toggle to be on, while others may break if it is. Some turn HDR on automatically when they detect it, while others require you to enable it manually in the settings. There’s no easy way to tell these permutations apart, so be prepared to experiment a bit when trying a new game.
If you’re looking for some of the best examples of HDR you can try right now, the list below is filled with titles that look great with it enabled. A full list of PC games that support HDR can be found here, too.
Battlefield V
Call of Duty: Modern Warfare
Destiny 2
Devil May Cry V
Doom Eternal
Gears 5
Hitman 3
Horizon Zero Dawn
Mass Effect: Andromeda
Metro Exodus
Ori and the Will of the Wisps
Resident Evil 2
Star Wars Jedi: Fallen Order
Tetris Effect
Is HDR worth it?
There’s no question: setting up HDR is a big investment. You need a great monitor, a capable GPU, and games that support it to really see the benefits. But once you do, it’s difficult to go back. While there’s absolutely nothing wrong with SDR content, the added fidelity of a good HDR implementation, backed by hardware that’s up to the task of displaying it, is a drastic improvement. There’s still a long way to go before a base standard can make all HDR content sing, but there’s certainly a lot you can achieve with it today.
If you’re looking to buy a new gaming monitor in 2021, there’s a lot to keep in mind beyond just HDR when it comes to monitor technologies. Be sure to check out our guides to display panel technology and G-Sync vs. FreeSync; plus, if you’re split between getting a new monitor or TV, we’ve broken down the differences between a gaming monitor vs. a gaming TV and how they compare in 2021. And for specific recommendations, we have picks for the best cheap gaming monitors, the best monitors for PS5 and Xbox Series X, and the best 4K TVs. Plus, check out the best PC games to play right now.