HDR is a technology that greatly enhances image quality by expanding the dynamic range of a display – this means brighter highlights, darker shadows, and more detail in both. It also uses a wider color gamut, which makes colors pop.
Many gamers still haven’t had a chance to experience what 4K is all about, but the tech world moves fast and by its standards, 4K is already old news.
HDR is all the rage now!
But while 4K is pretty straightforward (all 4K displays feature the same number of pixels), HDR is a lot more vague and, dare we say, misleading in how it delivers on its promises. It certainly doesn’t help matters that there are several standards for HDR – HDR10, HDR10+, Dolby Vision, and more – that only add to the confusion.
So we’re here to clear up all the confusion by first explaining what HDR is and how it’s supposed to work, and then running down the encoding standards one by one and explaining how they actually work.
What Is HDR?
The premise behind HDR is simple – to enhance the viewing experience by expanding the dynamic range. HDR does, after all, stand for High Dynamic Range, as opposed to SDR, or Standard Dynamic Range.
Dynamic range is the difference between the brightest and darkest parts of an image. It isn’t quite the same thing as contrast, but comparing it to contrast will give you the gist of it.
Increasing the dynamic range should result in an image with darker shadows and brighter lights while still retaining the detail within those shadows and lights.
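To put a rough number on the idea: dynamic range is often measured in stops, where each stop is a doubling of luminance. Here’s a minimal sketch – the nit figures below are illustrative ballpark values we’ve chosen for the example, not specs from any particular display:

```python
import math

def dynamic_range_stops(brightest_nits: float, darkest_nits: float) -> float:
    """Dynamic range in stops: each stop is a doubling of luminance."""
    return math.log2(brightest_nits / darkest_nits)

# A typical SDR display: ~300-nit peak, ~0.3-nit black level.
print(round(dynamic_range_stops(300, 0.3), 1))    # 10.0 stops

# A capable HDR display: ~1000-nit peak, ~0.05-nit black level.
print(round(dynamic_range_stops(1000, 0.05), 1))  # 14.3 stops
```

A few extra stops may not sound like much, but because each stop doubles the luminance range, the HDR display above covers roughly twenty times the range of the SDR one.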
We should also keep in mind that, in order to use HDR, you would first need content that supports it. This mostly applies to movies, although certain video games support it too, and there’s even an attempt at broadcasting HDR, as you’ll soon see.
The marketing pitch here is simply: If you feel that movies appear washed out on your SDR TV, get an HDR TV and you’ll get a true cinematic experience in the comfort of your living room.
HDR10

Translating the premise into practice goes about as well as a cynic would expect, especially with HDR10.
This is the most widespread encoding standard for HDR, partly because it’s open source so manufacturers don’t have to pay any royalties to feature it in their displays. It also has the benefit of supporting the most content. But it has one major flaw.
HDR10 relies on static metadata. What this means is that your TV is fed a single set of values describing the brightest and darkest colors in a movie. Your TV then adjusts its brightness once to best fit that movie as a whole.
There are two major problems with this:
- Your TV may not be able to handle HDR at all. Most HDR content is mastered with the idea that the brightest colors will be displayed at a brightness of 1000 nits, but most budget HDR TVs only support a maximum brightness of around 300 nits. This means your TV will assign its own maximum brightness to the brightest color in the movie and try to scale everything else down proportionally. Consequently, most movies will look better in SDR than in HDR on such TVs.
- Even if your TV has good HDR, the fact that static metadata only uses two points of reference for an entire film can be a big problem. Let’s say you’re watching a horror movie that’s dimly lit overall but has one super-bright explosion scene. Your TV may display the whole movie brighter than it’s supposed to be because of how bright that one explosion is.
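The first problem above can be sketched in a few lines. `tone_map_static` is a hypothetical helper of our own, and real TVs use far smarter roll-off curves than a straight linear scale – the point is only to show why a single content-wide ratio drags the entire image down, not just the highlights:

```python
def tone_map_static(pixel_nits: float, content_peak: float,
                    display_peak: float) -> float:
    """Naive static tone mapping: scale every pixel by one fixed ratio
    derived from the metadata's single content-wide peak value.
    Real TVs use smarter roll-off curves, but the core limitation is
    the same: one ratio for the entire film."""
    return pixel_nits * (display_peak / content_peak)

# Content mastered for a 1000-nit peak, shown on a 300-nit budget TV:
# the 1000-nit highlight is clamped to 300 nits, as expected...
print(round(tone_map_static(1000, 1000, 300), 2))  # 300.0
# ...but a 100-nit midtone drops to 30 nits, dimming the whole image.
print(round(tone_map_static(100, 1000, 300), 2))   # 30.0
```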
Dolby Vision

Unlike HDR10, which is an open encoding standard, Dolby Vision is proprietary technology owned by Dolby, and it addresses both of the issues we’ve just pointed out.
Firstly, HDR10 doesn’t enforce any strict hardware standards – only recommended requirements that TVs should meet to properly display HDR content. This is why you can buy HDR10 TVs that make HDR content look worse than SDR. A TV only needs to be able to receive HDR signals to be marketed as an HDR TV – that says nothing about its ability to actually display images in proper HDR.
Dolby, on the other hand, is stricter about the specs TVs need to have to support it. And because manufacturers need to pay royalties to implement this proprietary technology, they’re less likely to put it in a TV that isn’t fit for HDR. All of this is to say, you aren’t likely to find a Dolby Vision TV that sucks when it comes to displaying HDR content.
What’s also great about it is that it relies on dynamic metadata. This means the TV knows how bright the brightest and darkest colors are supposed to be on a frame-by-frame or scene-by-scene basis, rather than receiving just two values for an entire movie.
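Extending our earlier hypothetical sketch shows why per-scene metadata matters. Again, this is a deliberately simplified illustration, not how any real Dolby Vision processor works:

```python
def tone_map_dynamic(pixel_nits: float, scene_peak: float,
                     display_peak: float) -> float:
    """With per-scene metadata, each scene is scaled against its own
    peak, so a dim scene isn't dragged down (or pushed up) by one
    bright explosion somewhere else in the film."""
    # If the scene already fits the display's range, show it as mastered.
    if scene_peak <= display_peak:
        return pixel_nits
    return pixel_nits * (display_peak / scene_peak)

# Dim horror scene (120-nit peak) on a 300-nit TV: untouched.
print(round(tone_map_dynamic(100, 120, 300), 2))    # 100
# Explosion scene (1000-nit peak): only this scene is compressed.
print(round(tone_map_dynamic(1000, 1000, 300), 2))  # 300.0
```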
With all this in mind, it’s safe to say that Dolby Vision is superior to HDR10, both on paper and in practice. The only downside (aside from price) is that less content is made for Dolby Vision than for HDR10, but thankfully Dolby Vision TVs can still display HDR10 content in HDR.
HDR10+

It was apparent from the start that the static metadata used by HDR10 would pose a problem for its longevity. So when you take the open-source nature of HDR10 and add in dynamic metadata capabilities, what you get is HDR10+.
The pros here are obvious – more supported content and a cheaper price than Dolby Vision.
But this only solves the second problem described in the segment on HDR10. The first one is still present – you can easily find TVs that support HDR10+ but feature garbage-tier HDR capabilities that ruin the viewing experience rather than enhance it.
HLG

Lastly, we want to mention HLG, or Hybrid Log-Gamma.
The problem with the other HDR encodings is that they aren’t broadcast-friendly. To view something in HDR, the content has to be specifically encoded for HDR, and here cable content simply couldn’t keep up with DVDs, Blu-rays, and streaming services.
So the BBC and NHK teamed up to create HLG – a form of HDR that can be broadcast in such a way that folks with HDR TVs see the content in HDR while folks with SDR TVs can watch that same broadcast in SDR.
HLG may not look as good as all other types of HDR, but it’s an ingenious way to please both the SDR and the HDR crowd.
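The trick behind that backwards compatibility is HLG’s transfer curve, specified in ITU-R BT.2100: the lower part of the curve behaves like a conventional gamma curve that SDR TVs already understand, while the logarithmic upper part carries the extra highlight range for HDR TVs. A sketch of the HLG OETF (the function that maps scene light to the broadcast signal):

```python
import math

# HLG OETF constants from ITU-R BT.2100.
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e: float) -> float:
    """Map normalized scene light (0..1) to an HLG signal value (0..1).
    The square-root segment below 1/12 is close to a conventional SDR
    gamma curve, which is what keeps HLG watchable on SDR TVs; the log
    segment above it packs the HDR highlights into the remaining range."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)
    return A * math.log(12 * e - B) + C

print(round(hlg_oetf(0), 3))       # 0.0
print(round(hlg_oetf(1 / 12), 3))  # 0.5  (where the two segments meet)
print(round(hlg_oetf(1), 3))       # 1.0
```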
HDR Monitors

Now, before we close things off, we want to dedicate a segment of this article to HDR monitors, as they follow completely different standards from HDR TVs.
That’s right! If you thought things were confusing so far, buckle up!
Gaming monitors aren’t advertised as HDR10, HDR10+, or Dolby Vision. Instead, what you’ll find are designations like DisplayHDR 400, DisplayHDR 600, DisplayHDR 1000, and so on.
Such monitors are VESA-certified. And for the first time, the number that comes after “HDR” isn’t nonsense – it actually tells you the peak brightness, in nits, that the monitor can reach. You can find the full summary of DisplayHDR specs here.
Now, not all HDR gaming monitors are VESA-certified. You can still find some that claim to feature HDR and expect you to take their word for it. Don’t – you will regret it if you do. The feature is technically there, insofar as it can be turned on and off, but it will make games look worse.
In fact, even DisplayHDR 400 can be insufficient when it comes to offering a good HDR experience. We’d argue that DisplayHDR 600 is the lowest you should go if you’re serious about gaming in HDR.
Another thing to keep in mind is that not all games support HDR. No game released before 2017 was made with this feature in mind, although some have received retroactive HDR support, either through official patches, like The Witcher 3, or fan-made mods, like the original Doom (1993).
You can find a list of games that support this feature here.
Conclusion

The switch from SDR to HDR arguably enhances the viewing experience more than the jump from 1080p to 4K does.
But while you can be sure that every 4K monitor shows the advertised number of pixels, HDR monitors won’t always feature good HDR. This is mainly due to there being so many encoding standards for it.
So let’s run through them quickly:
- HDR10 is showing its age – its static metadata is a real limitation, and the label alone tells you little about a TV’s actual HDR capabilities.
- HDR10+ is capable of great things, but not all TVs that support it are properly specced out.
- Dolby Vision is arguably the best standard, but it’s also the most expensive.
- HLG is an attempt at bringing HDR to broadcasting.
And finally, if you’re looking for an HDR monitor, make sure it’s VESA-certified – and if you want genuinely good HDR, shoot for at least DisplayHDR 600.
Or, if all of this seems confusing, you can always check out this guide where we’ve compiled the best HDR monitors on the market.