High Dynamic Range (HDR) is changing the way we experience our favorite movies, shows, and video games. Similar to surround sound codecs from companies like Dolby and DTS, there are several HDR formats supported by everything from TVs and soundbars to AV receivers and game consoles. One of these formats happens to be called HDR10+, and it has quite the interesting history.
Maybe you’ve heard of the other HDR formats? As of this writing, there’s HDR10, Dolby Vision, HLG, and a couple of less common offerings, including Advanced HDR by Technicolor. So how exactly does HDR10+ fit into the grand scheme of things, and does your TV support it? We’re about to find out!
What is HDR?
Before we can dive into HDR10+, we need to make sure we understand HDR. We’ve got a few fantastic deep dives on this technology that you can peruse at your leisure, but for the sake of a quick introduction: High dynamic range, as it pertains to TVs, allows for video and still images with much greater brightness and contrast, and better color accuracy, than what was possible in the past. HDR works for movies, TV shows, and video games. Unlike increases in resolution (like 720p to 1080p), which aren’t always immediately noticeable, especially when viewed from a distance, great HDR material is eye-catching from the moment you see it.
HDR requires two things at a minimum: a TV that is HDR-capable and a source of HDR video, such as a 4K HDR Blu-ray disc and a compatible Blu-ray player, or an HDR movie on Netflix or another streaming service that supports it. Confused consumers often conflate 4K and HDR, but they are very different technologies; not all 4K TVs can handle HDR, and some do it much better than others. That said, most new TVs support both 4K UHD and HDR.
But saying “HDR” is like saying “digital music”: There are several different types of HDR, and each has its own strengths and weaknesses.
What is HDR10?
Every HDR-capable TV is compatible with HDR10; it’s the minimum specification. The HDR10 format allows for a maximum brightness of 1,000 nits (the nit is a unit of brightness) and a color depth of 10 bits. On their own, those numbers don’t mean much, but in context they do: Compared to regular SDR (standard dynamic range), which typically tops out at a few hundred nits, HDR10 allows for a far brighter image, with a corresponding increase in contrast (the difference between the blackest blacks and the whitest whites), and a color palette of over a billion shades, as opposed to the measly 16.7 million of SDR.
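If you’re curious where those color counts come from, the arithmetic is simple enough to check yourself. Here’s a quick back-of-the-envelope sketch in Python (the function name is ours, purely for illustration):

```python
def total_colors(bits_per_channel: int) -> int:
    """Distinct colors available at a given per-channel bit depth."""
    levels = 2 ** bits_per_channel  # shades per channel (red, green, blue)
    return levels ** 3              # three channels combined

print(f"{total_colors(8):,}")   # 16,777,216 -- SDR's ~16.7 million
print(f"{total_colors(10):,}")  # 1,073,741,824 -- HDR10's ~1.07 billion
```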
As with all HDR formats, how well HDR10 performs depends on the quality of the TV you view it on. When implemented properly, HDR10 makes video content look noticeably richer and more lifelike than SDR, but it is no longer the top of the HDR food chain.
What is HDR10+?
As the name suggests, HDR10+ takes all of the good parts of HDR10 and improves upon them. It quadruples the maximum brightness to 4,000 nits, which in turn increases contrast. But the biggest difference is in how HDR10+ handles information.
With HDR10, the metadata fed by the content source is static, which means there’s one set of values established for a whole piece of content, like an entire movie. HDR10+ makes this metadata dynamic, allowing it to change for each frame of video. This means every frame is treated to its own set of color, brightness, and contrast parameters, making for a much more realistic-looking image. Bright areas of the screen that might have lost detail under HDR10’s one-size-fits-all settings retain it with HDR10+.

But wait, there’s more: Samsung, long a proponent of HDR10+, has kicked things up yet another notch. The company’s HDR10+ Adaptive technology allows your TV to detect the brightness of your viewing space and make micro adjustments to brightness, contrast, and other settings in response to changes in the room.
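To make the static-versus-dynamic distinction concrete, here’s a deliberately simplified Python sketch. These dictionaries are hypothetical stand-ins, not the real metadata structures (those are defined in standards like SMPTE ST 2086 for HDR10 and ST 2094-40 for HDR10+), but the shape of the difference is the same:

```python
# Hypothetical, simplified illustration -- not the actual HDR10/HDR10+
# metadata formats, just the static-vs-dynamic idea.

# HDR10: static metadata, one set of values for the entire movie.
static_metadata = {"max_luminance_nits": 1000, "avg_luminance_nits": 400}

# HDR10+: dynamic metadata, fresh values per frame (or scene), so a dim
# cave interior and a sunlit desert each get their own treatment.
dynamic_metadata = [
    {"frame": 1, "max_luminance_nits": 120, "avg_luminance_nits": 30},
    {"frame": 2, "max_luminance_nits": 3900, "avg_luminance_nits": 850},
]

def values_for_frame(frame_number: int) -> dict:
    """Return the per-frame values a TV would tone-map against."""
    return next(m for m in dynamic_metadata if m["frame"] == frame_number)

print(values_for_frame(2))  # the bright scene gets its own targets
```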
When the HDR10+ picture standard first rolled out, it was difficult to find the format supported by TV brands other than Samsung and Panasonic. One big reason is that HDR10+ was developed by a consortium made up of 20th Century Fox, Samsung, and Panasonic. Today, though, HDR10+ is showing up on other brands’ TVs, including TCL, Hisense, and Toshiba.
As for the streaming landscape, you can currently find HDR10+ media on Amazon Prime Video, Apple TV+, Hulu, Paramount+, YouTube, and the Google Play Movies & TV apps. A number of streaming devices also support the picture standard, including Samsung’s web-connected lineup of Blu-ray players, the Apple TV 4K (2022), and various Roku devices, including the Roku Express 4K, Roku Express 4K+, and Roku Ultra (2022).
So … what about Dolby Vision?
HDR10+ isn’t the only HDR format with ambitions of becoming the next king of the HDR castle. Dolby Vision is an advanced HDR format created by Dolby Labs, the same company behind famous audio technologies like Dolby Digital and Dolby Atmos. Dolby Vision is very similar to HDR10+ in that it uses dynamic, not static, metadata, giving each frame its own unique HDR treatment. But Dolby Vision provides for even greater brightness (up to 10,000 nits) and more colors, too (12-bit depth, for a staggering 68 billion colors).
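Running the same bit-depth arithmetic as before, but at Dolby Vision’s 12 bits per channel, shows where that 68 billion figure comes from:

```python
levels_per_channel = 2 ** 12           # 4,096 shades per color channel
total_colors = levels_per_channel ** 3 # three channels combined
print(f"{total_colors:,}")             # 68,719,476,736 -- roughly 68 billion
```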
Thanks to continued improvements in HDMI technology, the latest HDMI 2.1 protocol allows for up to 16-bit color depth within the Rec.2020 color space. While it’s going to be a while before consumer displays can make use of 16-bit signals, HDMI 2.1 does support the 12-bit data you’ll get from Dolby Vision signals. Of course, this means you’ll need a TV that’s capable of decoding those 12-bit Dolby Vision signals, along with a few other AV essentials (more on that below).
Unlike HDR10+, though, which only had its official launch in 2018, Dolby Vision has been around for several years and enjoys wide industry support, which could help make it the dominant HDR standard going forward.
Oh no, not another format war!
Does the presence of competing HDR formats like HDR10+ and Dolby Vision mean we’re in for another format war? Not exactly. Unlike previous tech tiffs like Blu-ray versus HD DVD, HDR formats are not mutually exclusive. This means there’s nothing stopping a movie studio from releasing a Blu-ray that contains HDR10, HDR10+, and Dolby Vision metadata on a single disc.
A TV that supports HDR can support multiple HDR formats, and many of today’s TVs do just that. The most common combo is HDR10 and Dolby Vision support on a single TV; however, we’re also just beginning to see the arrival of TVs that add HDR10+ and even HLG (the version of HDR favored by digital TV broadcasters) to that mix. It’s also possible that some TVs that shipped from the factory with support for just two formats — say HDR10 and Dolby Vision — could be updated via a firmware upgrade to handle HDR10+.
Blu-ray players and media streamers can also support multiple HDR formats. The challenge is that, while supporting multiple formats is possible, relatively few TVs, playback devices, streaming services, or Blu-ray discs actually do. This means that, as consumers, we need to pay close attention to the labels to understand the capabilities of the devices and content we own, and the ones we plan on buying.
Many Blu-ray players, for instance, only offer support for HDR10, while some models, like Sony’s UBP-X700, add Dolby Vision support. The same considerations apply to set-top streaming boxes. That said, there are now many devices that support all three major HDR formats (HDR10, HDR10+, and Dolby Vision), including the Apple TV 4K (2022), the Amazon Fire TV Stick 4K and 4K Max, the Roku Streaming Stick 4K and Roku Ultra (2022), and the Chromecast with Google TV (4K).
What equipment do I need to get HDR10+?
To summarize, HDR10+ is a newer HDR format that offers higher brightness and contrast, plus more true-to-life colors and detail. To get it, you’ll need:
- A source of HDR10+ video, such as a compatible Blu-ray disc, or a streaming service like Hulu or Amazon Prime Video
- A device that is capable of reading HDR10+ encoded material, like a compatible Blu-ray player or media streamer
- A TV that is HDR10+ compatible (these may also have built-in apps that let you sidestep the need for a playback device)
One more thing: If you’re using a media streamer or a Blu-ray player for your HDR10+ content and it does not plug directly into your TV, the HDMI cable you’re using should ideally be rated for HDMI 2.1 speeds. Advanced HDR signals, and especially Dolby Vision’s 12-bit video, can demand more bandwidth than older HDMI 2.0 cables were built to handle, as the rough math below shows.
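For a sense of scale, here’s a rough, back-of-the-envelope bandwidth estimate in Python. It deliberately ignores blanking intervals, chroma subsampling, and link-encoding overhead, all of which push real-world requirements even higher:

```python
def raw_video_gbps(width: int, height: int, fps: int, bits_per_channel: int) -> float:
    """Uncompressed video bandwidth in gigabits per second (no overhead)."""
    return width * height * fps * bits_per_channel * 3 / 1e9

print(raw_video_gbps(3840, 2160, 60, 10))  # ~14.9 Gbps: 10-bit 4K at 60Hz
print(raw_video_gbps(3840, 2160, 60, 12))  # ~17.9 Gbps: 12-bit 4K at 60Hz
# HDMI 2.0 tops out at 18 Gbps, so a 12-bit 4K60 signal sits at its ceiling
# even before overhead; HDMI 2.1's 48 Gbps leaves plenty of headroom.
```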
So that’s that! Whether you’re looking to upgrade your home theater system or you just want to understand this cool tech, that’s really all you need to know. Stay tuned for updates!