If you followed the news out of CES closely you probably heard the word HDR tossed around a lot. This coming year we’ll see TVs for under $500 with the feature, and fancy monitors for nearly $1000. But what does HDR even mean?
HDR stands for high dynamic range. Originally the term applied exclusively to a style of still photography that tamed extreme shadows and highlights in photos. That made it useful for architects and real estate agents, people who want to show off the insides of their buildings without the nasty glare of sun or the darkness of shadowy corners. But HDR also found fans amongst really bad photographers with access to Photoshop, and consequently an unattractive HDR aesthetic emerged in still photography.
That aesthetic, thankfully, can’t translate to moving pictures. But HDR in movies and TVs is still about revealing details in areas of extreme brightness and darkness.
A display accomplishes the feat by having a truly exceptional contrast ratio. The UHD Alliance, a consortium of TV makers, content creators and distributors, actually defines the peak brightness and darkness a TV needs to produce to be HDR-compliant.
Specifically, the UHD Alliance says a TV has to be able to put out 1,000 nits of peak brightness (twice as bright as a Samsung Galaxy S7 in sunlight) and get as dark as 0.05 nits, something a lot of LED TVs can do. Or it has to get as bright as 540 nits and as dark as 0.0005 nits, something only an OLED display is really capable of. It also has to display content at a minimum 4K resolution and produce a wider color gamut than Rec. 709, the one used for the last 30 years.
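Those two tiers sound arbitrary, but the arithmetic behind them is just contrast ratio: peak brightness divided by black level. A minimal Python sketch, using only the UHD Alliance figures quoted above, shows why OLED gets away with a dimmer peak:

```python
# Contrast ratio = peak brightness / black level, both in nits.
# The tier figures are the UHD Alliance numbers quoted above.

def contrast_ratio(peak_nits, black_nits):
    """Ratio between the brightest and darkest level a display can show."""
    return peak_nits / black_nits

lcd_tier = contrast_ratio(1000, 0.05)    # LED/LCD tier
oled_tier = contrast_ratio(540, 0.0005)  # OLED tier

print(f"LED/LCD tier: {lcd_tier:,.0f}:1")  # 20,000:1
print(f"OLED tier: {oled_tier:,.0f}:1")    # 1,080,000:1
```

An OLED's near-perfect blacks buy it a contrast ratio roughly 50 times higher than the LCD tier, which is why the spec lets it slide on raw brightness.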
Rec. 709, which shares its color primaries with sRGB, is the gamut nearly every show, movie, and video game was mastered in until only recently. Your TVs, phones, and even your computers were all calibrated to that specific set of colors, but Rec. 709 can only replicate about 34 percent of what the human eye can see. That's why the movie mode on your TV or laptop always makes things look sort of brown and washed out.
Newer TVs and monitors can reproduce a lot more colors. Most high-end sets can cover the gamut used by the digital movie projectors at your local theater, DCI P3, which spans about 46 percent of what the human eye can see. It's an extremely popular gamut: it's the gamut of choice for all 2016 and newer Apple products and, as announced recently, Instagram.
But being able to display less than half of all colors a pristine human eye can see still feels like a waste. That is why, in 2012, a new, even wider color gamut was introduced: Rec. 2020.
The Rec. 2020 gamut covers a whopping 67 percent of what the human eye can see, so everything on your TV would look brighter and more vibrant than anything you'd seen on a TV before. Unfortunately, not a single consumer display can fully reproduce Rec. 2020 yet, so only a handful of HDR formats actually require it over the less impressive DCI P3.
Which brings us to the actual methods for distributing HDR content, because there are many. Currently the most popular format is HDR10, and its popularity stems from the fact that it's cheap to use. There are no nasty licensing fees associated with HDR10, so any TV maker or content producer can adopt it. Right now every TV maker with an HDR set can play HDR10 content, and it's the format used by the Xbox One S and PlayStation 4 Pro as well. Content from Netflix and Amazon can also be played back on HDR10 sets.
But its affordability also creates limitations. Specifically, HDR10 is...sort of dumb. Other formats adjust to the capabilities of the TV in the hopes of providing the best possible picture on every set. HDR10 doesn't do that. If your TV can perfectly match the colors, brightness, and darkness the HDR10 content calls for, then it will look exactly as the director intended. If it can't, then you may find light sources blown out, scenes that appear too dark, or colors that look too vivid.
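Here's a toy illustration of those blown-out light sources (not the real HDR10 pipeline, just the underlying idea): when content is mastered brighter than a panel's peak and nothing adapts it, the extra range is simply clipped away.

```python
# Toy model of why static metadata can blow out highlights: luminance
# mastered above the display's peak is simply clipped, so everything
# brighter than the panel collapses into the same flat white.

def clip_to_display(scene_nits, display_peak):
    """Naive behavior: anything brighter than the panel's peak is lost."""
    return [min(v, display_peak) for v in scene_nits]

# Highlight values mastered on a 4,000-nit reference monitor...
mastered = [100, 800, 1500, 3000, 4000]

# ...shown on a 1,000-nit consumer set: the top three values become
# identical, and the detail that distinguished them disappears.
print(clip_to_display(mastered, 1000))  # [100, 800, 1000, 1000, 1000]
```

A format with dynamic metadata can instead tell the TV to roll those highlights off gradually, preserving some separation between them rather than flattening them to a single white.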
HDR10 also doesn't scale well. In fact, HDR10 content will look worse on future TVs as peak brightness, black levels, and color reproduction improve, sort of like trying to watch standard definition content on a 4K set.
This scaling problem is dealt with in some of the other HDR formats. Dolby Vision is especially popular at the moment. Introduced last year, it has quickly gained a following amongst content creators because of its dynamic metadata, which tells the TV how to adapt the picture as it goes. Theoretically, a filmmaker can master a movie in Dolby Vision once and it will look right on every Dolby Vision set, present and future. It's also mastered to reproduce more colors and better contrast: it can replicate up to Rec. 2020 and a theoretical 10,000 nits of peak brightness, while HDR10 only goes up to DCI P3 and 4,000 nits.
But Dolby Vision requires expensive monitors to master content on, and it carries pricey licensing fees, too. So while it's the most futureproof HDR format and potentially produces the highest quality content, it hasn't seen wide adoption. Still, more and more films are being mastered in Dolby Vision, and nearly every major TV maker announced support for the format in their flagship TVs at CES this year. Crucially, both Netflix and Amazon Video opted to support the format halfway through last year.
The problem keeping Dolby Vision from being the one HDR format to rule them all is that its dynamic metadata takes up a lot of space. So while it's fine for people already streaming gigabytes of data over the internet, it's terrible for broadcasting content over the airwaves, something necessary if broadcast TV ever plans to adopt HDR.
This is where Hybrid Log-Gamma (HLG) comes in. It's the newest HDR format on the block. Developed in tandem by the BBC and NHK, HLG was created specifically with broadcast television in mind and skips metadata completely, instead relying on the TV to know precisely what it's capable of reproducing from the signal it receives. That also makes it completely backwards compatible with standard dynamic range (SDR) television sets, something HDR10 and Dolby Vision content isn't.
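The trick behind that backwards compatibility lives in HLG's transfer curve, standardized in ITU-R BT.2100. Here's a minimal Python sketch of it: the lower half of the curve is a plain square root, close enough to a conventional SDR gamma that an SDR set displays it acceptably, while the logarithmic upper half packs in the extra highlight range an HDR set knows how to unpack.

```python
import math

# HLG opto-electrical transfer function (constants from ITU-R BT.2100).
A = 0.17883277
B = 0.28466892
C = 0.55991073

def hlg_oetf(e):
    """Map normalized scene light e in [0, 1] to an HLG signal value."""
    if e <= 1 / 12:
        # Square-root segment: behaves like a familiar SDR gamma curve.
        return math.sqrt(3 * e)
    # Logarithmic segment: compresses the bright HDR highlights.
    return A * math.log(12 * e - B) + C

# The two halves meet seamlessly at e = 1/12 (signal value 0.5),
# and full scene light maps to full signal.
print(round(hlg_oetf(1 / 12), 3))  # 0.5
print(round(hlg_oetf(1.0), 3))     # 1.0
```

Because the whole curve is baked into the signal itself, there's no metadata for a broadcaster to carry or for an old set to choke on.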
HLG may be the newest HDR format, but the backing of the BBC and NHK, the largest broadcasters in the UK and Japan respectively, its royalty-free nature, and the low cost of implementing it in sets have all spurred adoption. Samsung, LG, and Panasonic all announced support at CES.
And others could follow suit. The beauty of HDR is that implementing a new format can, in some cases, be as simple as a software update. So a future format like the Technicolor-backed SL-HDR1 could be added to your $5,000 set with the press of a button.
But HDR does have one major hardware limitation that will leave you frustrated when setting up your home theater: it requires an HDMI 2.0 port with HDCP 2.2 copy protection. An HDMI 2.0 port looks identical to the more common HDMI 1.4, but it's more expensive to produce (and the cables much pricier), so most TVs ship with a mix of 1.4 and 2.0 ports, and none of them actually label the damn things or the provided HDMI cables.
Which means getting HDR content on your TV can be a nightmare of testing ports, fiddling with the television's settings, and haunting home theater enthusiast forums. So double check: did you get an Xbox One S or PlayStation 4 Pro for Christmas? There's a good chance it's not actually displaying the HDR content promised.
Confusing formats and poorly labeled ports aside, when HDR is working it's a stunning improvement to the films and TV shows you've been watching. Shows steeped in shadows no longer seem too dark to watch, explosions appear more realistic, and cars and the shine of a football player's helmet look almost extraordinarily true to life. As the technical challenges to broadly implementing HDR are overcome, we're quickly finding out that the bright future was worth the wait.