It’s easy to become impatient with the avalanche of new technologies in home TV viewing. Not only have we had the slightly dubious charms of stereoscopic 3D to deal with in the last few years, but it’s also only been a short while since the most sluggish slowpokes transitioned to HD, let alone 4K.
In this context, it’s refreshing to discover that there are very few detractors of HDR (high dynamic range). It wasn’t hard to find people willing to criticize 3D, and even, to an extent, 4K. Shooting stereo 3D is a lot of work for an effect that can end up making people feel unwell, and the benefits of 4K—in distribution, at least—are marginal unless the picture is large and viewed from very close range. HDR, on the other hand, is almost universally liked: the results of good HDR are difficult to downplay from either a technical or commercial standpoint. With its expanded luminance, it looks so good that nearby conventional displays tend to look faulty by comparison. Not only is it a measurable, absolute benefit to film and TV production in a technical sense, it’s also easy to sell.
Sony BVM-X300 4K OLED monitor at Light Iron
Viewers confronted with an unfamiliar HDR picture tend to use a simple term to describe it: “brighter.” While that’s certainly true, it’s crucial to understand that the picture isn’t only brighter—it’s also higher in contrast. The brightest parts of the picture have more light coming out of them, but the darkest parts, ideally, are no brighter at all. This means that on a conventional liquid-crystal monitor of the type referred to as a TFT, HDR isn’t just a case of making the backlight panel brighter. That’s part of the approach, to achieve those brighter highlights, but the display panel itself also needs to achieve darker, denser shadows without allowing that bright backlight to leak through. OLED displays have an easier time, being theoretically capable of a black level of zero, but they can struggle to achieve the high brightness that’s ideally required for HDR display. Hybrid approaches, such as the use of a matrix of individually controlled LEDs to backlight small areas of a conventional liquid crystal display, are also in use.
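The point that HDR is about contrast, not just brightness, can be sketched with simple arithmetic. The figures below are purely illustrative—real panels vary widely—but they show why a brighter backlight alone doesn’t get you there:

```python
def contrast_ratio(peak_nits, black_nits):
    """Sequential contrast ratio: peak luminance divided by black level."""
    return peak_nits / black_nits

# Illustrative figures only -- not measurements of any particular display.
sdr_tft = contrast_ratio(100, 0.1)    # typical SDR LCD: modest peak, leaky blacks
hdr_lcd = contrast_ratio(1000, 0.01)  # matrix-backlit HDR LCD: brighter *and* darker

print(f"SDR TFT: {sdr_tft:,.0f}:1")   # 1,000:1
print(f"HDR LCD: {hdr_lcd:,.0f}:1")   # 100,000:1
```

Note that simply multiplying the backlight by ten with the same leaky black level would raise the blacks in step and leave the ratio unchanged—which is why the shadows matter as much as the highlights.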
Demonstration models by companies such as Sony and Dolby have made the rounds at trade shows for a few years. These professional models tend to noticeably exceed the capability of consumer-targeted displays. Video display brightness is expressed in nits, a term for candela per square meter. Standard dynamic range displays are generally calibrated to quite low levels, in the range of 100 to 200 nits, with cinema projection aiming even lower, around 48 nits.
This simulated image illustrates how Dolby Vision transforms a standard dynamic range picture. The technology delivers striking highlights, brilliant colors and deep blacks.
Some of the most powerful demonstrations of HDR from companies such as Dolby, based presumably on technology developed after their 2007 acquisition of BrightSide Technologies, exceed 4,000 nits using matrix backlighting techniques. Likewise, Sony’s excellent and recently updated BVM-X300 display combines 1,000-nit capability and OLED color precision with true DCI 4K resolution in a display that’s more or less all things to all people—albeit at a price. The 2016 NAB Show saw something of a democratization of HDR with Atomos’ release of its Flame series of recorders with 1,500-nit displays.
In production, HDR may not require too many modifications to shooting technique. Many log-capable cameras have been effectively shooting HDR images for years, with one purpose of grading being to compress that wide range of brightness into something that can be attractively viewed on standard dynamic range displays. The grey, flat appearance of a log image on a conventional display is, to a great extent, the result of viewing high dynamic range content in a standard dynamic range environment. On-set monitoring may need to take HDR into account, and with the 7.1 update, Atomos’ recorders become capable of outputting HDR to the PQ standard (about which more below) for use with compatible displays.
Available on Atomos Shogun Flame, AtomHDR is a proprietary image processing technology that displays log footage with vibrant, true-to-life colors.
In postproduction, grading must naturally use HDR-capable displays, although the tools of color correction systems remain largely as they always have been. The main concern—some have called it a “format war”—is in finalization and distribution, where the need to create both standard and high dynamic range versions of material, and even several different editions of the HDR version, is an imposition that content producers would naturally prefer to avoid. There are essentially two technical approaches to distributing HDR, both of which are predicated on the need for greater precision with greater contrast. More contrast puts the brightest and darkest parts of the picture further apart, so we need more digital steps between them to avoid those steps becoming visible.
This is an issue that’s already been addressed in production, with 10-bit (that is, 1,024-step) pictures common and more or less required for reasonable log pictures, although the idea has been floated that really excellent HDR might require even more. The reluctance of distributors to send extra data to consumers means that consumers might have to put up with HDR that’s merely OK, but significant work has already gone into ways of transporting HDR most efficiently.
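A rough sketch shows why a wider luminance range demands more bits. Real transfer functions space code values perceptually rather than linearly—that’s precisely what PQ and HLG are for—but even this naive linear division makes the problem visible:

```python
def linear_step_nits(peak_nits, bits):
    """Size of one code-value step if luminance were divided linearly."""
    return peak_nits / (2 ** bits)

# 8-bit SDR at 100 nits vs. 10-bit HDR at 1,000 nits
print(linear_step_nits(100, 8))     # ~0.39 nits per step
print(linear_step_nits(1000, 10))   # ~0.98 nits per step
```

Ten times the peak brightness with only four times the code values means each step covers more light, and visible steps—banding—become a risk; a perceptually tuned curve spends its code values where the eye can see them, which is what the two standards discussed below set out to do.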
Netflix is already streaming material in both HDR10 and Dolby Vision, including the Netflix series The Crown (pictured). Photo by Alex Bailey/Netflix.
The two standards that currently seem most likely to proliferate are expressed in the International Telecommunication Union’s Recommendation BT.2100. They can be briefly described as hybrid log-gamma (HLG), championed by the BBC and NHK, and perceptual quantization (PQ), developed by Dolby and upon which its well-known Dolby Vision standard is based.
Of the two, PQ is perhaps the more technically ideal, being derived from some quite careful mathematics that model the human visual system to make the best possible use of every digital code value in the signal. The Dolby Vision system, which is being widely promoted for both home and theatrical exhibition, adds significant cleverness on top of PQ that can help material look more consistent on displays of different capability, but that additional cleverness comes at a cost in both the hardware required to implement it and in licensing fees from Dolby, whose clear intention is to maintain its position as a provider of licensable technology.
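The careful mathematics in question are published as SMPTE ST 2084, which defines the PQ electro-optical transfer function: the curve that turns a code value into an absolute luminance in nits. A minimal sketch of that published curve:

```python
# PQ EOTF constants as published in SMPTE ST 2084.
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 / 4096 * 128   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 / 4096 * 32    # 18.8515625
c3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    """Map a normalized PQ signal (0.0-1.0) to absolute luminance in nits."""
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(round(pq_eotf(1.0)))     # full code value -> 10,000 nits
print(round(pq_eotf(0.5), 1))  # half signal -> roughly 92 nits
```

Note how steep the curve is: half the signal range yields only about 92 nits, because the function deliberately spends most of its code values on the shadows and midtones, where the eye is most sensitive to banding.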
BenQ PV3200PT 32” 4K UHD high dynamic range monitor
HLG, on the other hand, is a simpler approach that defines the relationship between signal level and light output differently. In the lower part of the signal range, where shadows are recorded, that relationship is similar to that implied by the ITU’s Recommendations BT.709 and BT.1886 (and related standards for material at higher resolution and greater color precision). Above the shadow level, a logarithmic relationship between brightness and recorded signal level is used. It’s similar to that used in-camera, allowing much more highlight information to be recorded than was possible with extant techniques. The result is something that can be displayed on both HDR and SDR monitors with reasonable results, affording enormous technical convenience to creators and distributors, although there is some question as to whether it exploits the possibilities of extended dynamic range as completely as PQ.
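That two-part relationship is spelled out in Recommendation BT.2100 as the HLG opto-electrical transfer function, and it is simple enough to sketch directly—a square-root (gamma-like) branch for the shadows, joined to a logarithmic branch for the highlights:

```python
import math

# HLG OETF constants as published in ITU-R BT.2100.
a = 0.17883277
b = 1 - 4 * a                   # 0.28466892
c = 0.5 - a * math.log(4 * a)   # 0.55991073

def hlg_oetf(E):
    """Map normalized scene linear light (0.0-1.0) to an HLG signal (0.0-1.0)."""
    if E <= 1 / 12:
        return math.sqrt(3 * E)          # gamma-like branch for shadows
    return a * math.log(12 * E - b) + c  # log branch for highlights

print(round(hlg_oetf(1 / 12), 3))  # crossover point -> 0.5
print(round(hlg_oetf(1.0), 3))     # peak scene light -> 1.0
```

The constants are chosen so the two branches meet smoothly at the crossover, which is what lets the lower portion of the signal behave plausibly on an ordinary SDR display while the log portion carries the extra highlight information.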
Simulated image from Ateme demonstrating the visual difference between standard dynamic range and high dynamic range
It’ll be no surprise to find that this incipient format war is likely to be decided by the volume production of consumer TVs, and thus by consumer-oriented organizations like LG and Samsung rather than industry insiders such as Dolby. Certainly, the proliferation of TVs advertising HDR10 compatibility is an indication that PQ, on which HDR10 is based, may be gaining more traction. (The HDR10 Media Profile was announced by the Consumer Technology Association in 2015. It uses the BT.2020 color space, the PQ standard and a bit depth of 10 bits.) The more complex and expensive Dolby Vision standard seems likely to become an option at the premium end of the market.
And uptake? Netflix is already streaming material in both HDR10 and Dolby Vision, with HDR10 also adopted by Sony. Several manufacturers showed HDR10- and Vision-capable displays at CES this year. If there’s an upside for those behind the camera, it’s that professional camera equipment, which isn’t as cost-constrained as consumer displays, has been HDR-ready for years, and that good cinematography will sing even more sweetly when displayed with these new techniques.