Understanding Gamma and High Dynamic Range

In order to fully understand high dynamic range (HDR) video, it is necessary to understand gamma. Gamma has been with us since the early days of television, and as the industry moves to HDR content production and delivery it still plays an important role. In this two-part series, I start by taking a look at some misconceptions about gamma. Here are several I hear frequently.

Gamma is nothing more than a cathode ray tube defect – Gamma is caused by the voltage-to-current characteristic of the CRT’s grid drive, not by the phosphor. A current-driven CRT cathode therefore has a substantially linear light response, which means gamma could have been corrected out even in the glory days of the CRT.

Gamma is required to match human visual response – A common belief is that a non-linear Optical to Electrical Transfer Function is needed because of the non-linearity of the human visual system. It is true that the human visual system is very complex and that lightness perception is non-linear, approximating a cube-root power law. But if a television system – camera, transmission system and display – can faithfully reproduce the light from the scene the camera is pointed at, then the manner in which humans see light shouldn’t make any difference.

Gamma can always be accurately adjusted – When set properly, a studio reference monitor should comply with ITU-R BT.1886 with a gamma exponent of 2.4. However, effective gamma still changes with the brightness (black-level) control and with room lighting. Auto-brightness circuits (introduced in the early 1970s) can track room lighting and adjust black level to roughly maintain gamma.

Gamma is fixed or burnt into flat panels – Effective gamma and display black levels change with room lighting. The power-law equation, in which light output is the drive signal raised to a fixed exponent, is sometimes what we mean by gamma (the gamma function); other times we simply mean the exponent itself, such as 2.2 or 2.4. Flat-panel monitors must match the power-law gamma Electrical to Optical Transfer Function (EOTF) but, since it is not native to the display technology, it must be created with signal processing such as a look-up table (LUT). Note that brightness adjustments will change effective gamma.
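
As a rough illustration of that last point, here is a minimal Python sketch – not any particular panel’s firmware – of a power-law EOTF implemented as a pre-computed 1D look-up table; the 2.4 exponent, table size and interpolation method are assumptions chosen for the example.

    import numpy as np

    # Minimal sketch: a power-law EOTF (exponent 2.4, zero black) pre-computed
    # as a 1D LUT, the way a flat panel's signal processing might approximate it.
    GAMMA = 2.4
    LUT_SIZE = 1024  # e.g. one entry per 10-bit code value

    # Normalized drive signal (0..1) -> normalized light output (0..1)
    lut = (np.arange(LUT_SIZE) / (LUT_SIZE - 1)) ** GAMMA

    def eotf_via_lut(signal):
        """Look up (with linear interpolation) the light output for a 0..1 signal."""
        return np.interp(signal, np.linspace(0.0, 1.0, LUT_SIZE), lut)

    print(eotf_via_lut(0.5))  # ~0.19: a mid-level signal drives well under half the light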

With today’s technology, why not get rid of the gamma power law? – It is occasionally suggested that, with programmable LUTs in modern displays, we could easily get rid of gamma. Modern cameras have many gamma, log or custom Optical to Electrical Transfer Function (OETF) settings beyond the standardized BT.709 camera gamma. These vary by manufacturer, and most cameras also offer custom settings or modifications to a fixed log or power-law OETF to allow artistic changes or to compensate for scene lighting at image capture. The imager’s dynamic range on some cameras is now so large that the OETF needs some kind of non-linear log or power-law characteristic to compress the range into a 10- or 12-bit RGB output. Thus, simply to get the imager signal out of the camera without 16 or more bits of processing, it is not possible to get rid of gamma and move to a linear OETF.
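
A back-of-the-envelope Python sketch of why a linear OETF runs out of bits: with an assumed 10-bit signal and an assumed 1000-nit range, the luminance step between adjacent code values near black is almost as large as the luminance itself, whereas a gamma-coded signal keeps that step to a few percent.

    # Relative luminance step between adjacent code values near a dark 1-nit level,
    # comparing a linear encoding with a gamma 2.4 power-law encoding.
    CODES = 1023      # 10-bit full-range code values (assumed)
    L_MAX = 1000.0    # assumed peak luminance, nits
    L_DARK = 1.0      # dark tone of interest, nits

    # Linear coding: every code step is the same size in nits.
    rel_linear = (L_MAX / CODES) / L_DARK        # ~0.98, i.e. a ~98% jump per code

    # Gamma coding: L = L_MAX * V**2.4, so a one-code step dV gives dL/L ~ 2.4*dV/V.
    V = (L_DARK / L_MAX) ** (1 / 2.4)
    rel_gamma = 2.4 * (1 / CODES) / V            # ~0.042, i.e. a ~4% jump per code

    print(f"relative step near 1 nit: linear {rel_linear:.2f}, gamma 2.4 {rel_gamma:.3f}")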

To build on the last point, as you can see in the plot on the left in Figure 1, legacy BT.709 gamma is very similar to how humans perceive light. The blue trace represents roughly the cube root of light perception, although there are many, much more complicated, expressions that better model this complex process. From a big-picture perspective, it is fortunate that the inverse of the CRT gamma curve nearly matches the perceptual lightness response, meaning pre-corrected camera output is close to being perceptually coded. If early displays had had a linear response, early TV designers would have invented a gamma-like function anyway and added it to all display technologies from the beginning. To some degree, the industry lucked out since CRTs forced the camera gamma that became BT.709.
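
For readers who want to reproduce the left-hand comparison numerically, here is a small Python sketch that evaluates the BT.709 OETF alongside the CIE L* cube-root lightness approximation (my stand-in for the perception trace, since the original figure is not reproduced here); the sample points are arbitrary.

    import numpy as np

    def bt709_oetf(L):
        """BT.709 camera OETF: linear segment near black, 0.45 power law above it."""
        L = np.asarray(L, dtype=float)
        return np.where(L < 0.018, 4.5 * L, 1.099 * L ** 0.45 - 0.099)

    def cie_lightness(Y):
        """CIE L* lightness scaled to 0..1: approximately a cube-root response."""
        Y = np.asarray(Y, dtype=float)
        return np.where(Y > 0.008856,
                        (116.0 * Y ** (1.0 / 3.0) - 16.0) / 100.0,
                        903.3 * Y / 100.0)

    scene = np.array([0.01, 0.05, 0.18, 0.5, 1.0])   # normalized scene light
    for y, v, l in zip(scene, bt709_oetf(scene), cie_lightness(scene)):
        print(f"Y={y:4.2f}  BT.709 signal={v:.3f}  cube-root lightness={l:.3f}")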

The graph on the right of Figure 1 shows camera BT.709 gamma vs. BT.1886 (CRT) display gamma. Since they are not quite inverses of each other, system gamma ends up slightly greater than one, at about 1.2, as shown in the black trace. This is acceptable since it compensates for the typically dark viewing surround and makes the image seem to have higher dynamic range. The effect is a stretching of the dark grays near black, so what you see on the display is not exactly what you would see looking at the actual scene.
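
The 1.2 figure is easy to check: cascade the BT.709 OETF with a simplified BT.1886 display modeled as a pure 2.4 power law with zero black (an assumption of this sketch) and look at the end-to-end exponent.

    import numpy as np

    def bt709_oetf(L):
        L = np.asarray(L, dtype=float)
        return np.where(L < 0.018, 4.5 * L, 1.099 * L ** 0.45 - 0.099)

    def bt1886_eotf(V):
        # Simplified display: zero black level, pure 2.4 power law.
        return np.asarray(V, dtype=float) ** 2.4

    scene = np.array([0.1, 0.18, 0.5, 0.9])        # normalized scene light
    displayed = bt1886_eotf(bt709_oetf(scene))     # normalized displayed light

    # Effective exponent at each point: displayed = scene**g  =>  g = log(displayed)/log(scene)
    for s, d in zip(scene, displayed):
        print(f"scene {s:4.2f} -> displayed {d:.3f}, effective gamma {np.log(d) / np.log(s):.2f}")
    # The exponents come out around 1.2, i.e. the dark grays are stretched slightly.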

Sometimes high dynamic range is confused with high brightness. The two are not the same. For example, you can have high dynamic range in a dark movie environment, with a maximum brightness of only 48 nits but a minimum of much less than 0.03 nits in a good theater. That is a good dynamic range. Alternatively, you can see displays with very bright screens of hundreds, or even thousands, of nits with relatively poor displayed dynamic range. Just turn off the TV: if the screen looks light gray because of a bright viewing environment, that gray is the bottom of your dynamic range, since TVs cannot produce negative light. Overall, however, bright screens are beneficial in bright viewing environments.
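
To put numbers on that, the theater example works out to roughly 10.6 stops of displayed dynamic range, while a hypothetical 1000-nit display whose blacks are washed out to about 1 nit by room light manages slightly less despite being more than 20 times brighter at peak. A quick check in Python, with the living-room numbers being illustrative assumptions:

    import math

    # Displayed dynamic range in stops is log2(peak / black).
    theater_stops = math.log2(48 / 0.03)      # ~10.6 stops (figures from the text)
    bright_tv_stops = math.log2(1000 / 1.0)   # ~10.0 stops (assumed bright-room TV)

    print(f"theater: {theater_stops:.1f} stops, bright TV in a lit room: {bright_tv_stops:.1f} stops")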

The idea that making displays brighter will always give higher dynamic range is misleading for another reason. By analogy, you can’t increase the dynamic range of audio by turning up the volume, because turning up the volume increases the noise as well. The same is true for video. With video, the “noise” is quantization noise, where the steps between quantization levels become clearly visible, typically appearing as banding or contouring. To achieve HDR, it is necessary to use the bit depth more efficiently while keeping average brightness about the same as an SDR image. This means more bit levels at low light levels, where the eye is more sensitive, and fewer bit levels in high-brightness areas, where the eye cannot see the contouring. In other words, we need a Perceptual Quantizer (PQ) that does a better job than the effective quantization of the current BT.709 gamma.
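
Here is a short Python sketch of that misallocation, assuming a 10-bit gamma 2.4 encode stretched over a 1000-nit range: the jump between adjacent code values is several percent of the luminance near black, where it shows up as banding, but only a fraction of a percent near peak, where the eye cannot see it.

    PEAK = 1000.0   # assumed peak luminance, nits
    CODES = 1023    # 10-bit full-range code values

    def rel_step(nits):
        """Relative luminance jump to the next 10-bit code value at a given level."""
        v = (nits / PEAK) ** (1 / 2.4)               # gamma 2.4 signal for this level
        next_nits = PEAK * (v + 1 / CODES) ** 2.4    # luminance one code value higher
        return (next_nits - nits) / nits

    print(f"near 1 nit:   {rel_step(1.0) * 100:.1f}% per code step")    # ~4%: visible banding
    print(f"near 900 nit: {rel_step(900.0) * 100:.2f}% per code step")  # ~0.25%: finer than the eye needs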

Evolving HDR Standards

SMPTE has standardized ST 2084, the Perceptual Quantizer (PQ) developed by Dolby and used in Dolby Vision, for mastering reference displays. It uses a quantization curve based on the Barten contrast-sensitivity model, with an EOTF that is the inverse of the OETF. Although the standard allows 0.001 to 10,000 nits with 10 bits, currently the best HDR displays peak at about 4,000 nits.
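
For reference, here is a minimal Python sketch of the ST 2084 EOTF using the published constants; the sample signal values are arbitrary. Full-scale signal reaches 10,000 nits, and roughly half-scale corresponds to the 100-nit SDR reference white.

    import math

    # ST 2084 (PQ) constants.
    M1 = 2610 / 16384
    M2 = 2523 / 4096 * 128
    C1 = 3424 / 4096
    C2 = 2413 / 4096 * 32
    C3 = 2392 / 4096 * 32

    def pq_eotf(signal):
        """ST 2084 EOTF: normalized signal (0..1) -> luminance in cd/m^2 (nits)."""
        e = signal ** (1 / M2)
        return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)

    for v in (0.0, 0.25, 0.508, 0.75, 1.0):
        print(f"PQ signal {v:5.3f} -> {pq_eotf(v):8.2f} nits")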

Another approach, standardized as ARIB STD-B67 and developed by the BBC and NHK, uses a hybrid log-gamma (HLG). It extends log processing into the high-brightness peaks to mitigate blown-out or clipped whites, while in the blacks it seamlessly retains power-law (gamma) processing as in the BT.709 and BT.2020 standards, but without the linear segment. This standard allows the display’s EOTF to adjust system gamma to correct for surround illumination in the range of 10 to 500 nits.
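
And here is a minimal Python sketch of the HLG OETF in its normalized (BT.2100-style) form, with scene light and signal both scaled 0 to 1: a square-root, gamma-like segment handles the blacks and a logarithmic segment handles the highlights, with no linear segment.

    import math

    # Hybrid log-gamma (HLG) OETF constants.
    A = 0.17883277
    B = 1 - 4 * A                    # 0.28466892
    C = 0.5 - A * math.log(4 * A)    # 0.55991073

    def hlg_oetf(E):
        """HLG OETF: normalized scene light E (0..1) -> normalized signal (0..1)."""
        if E <= 1 / 12:
            return math.sqrt(3 * E)            # gamma-like square root in the blacks
        return A * math.log(12 * E - B) + C    # logarithmic in the highlights

    for E in (0.0, 1 / 12, 0.26, 1.0):
        print(f"scene {E:5.3f} -> HLG signal {hlg_oetf(E):.3f}")
    # One twelfth of peak scene light maps to signal 0.5; full scale maps to ~1.0.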

Additional proposed standards from Philips and Technicolor are in the process of being combined. The Philips proposal is parameter-based and embeds low-bit-rate HDR and SDR conversion parameters in the metadata; these parameters are extracted during decoding and used to tune the display for peak luminance. The Technicolor video mastering and distribution workflow allows for grading both an HDR and an SDR master, which is vital to maintaining the artistic intent of the image.

The Academy Color Encoding System (ACES), while not an HDR format per se, allows for wide dynamic range and a wide color gamut, preserving the color workflow with 16-bit floating-point encoding and a 10-bit proxy output that is logarithmic in stops.

At this time, these standards and other proposals continue to evolve to help define the HDR workflow from the camera to the home.

Adjusting HDR Gamma

Modern cameras are capable of capturing a wide dynamic range, but unfortunately SDR displays will clip or blow out the highlights in images. This has led to the use of non-linear processing such as S-Log2, ST 2084 PQ and HLG, which use the bits more efficiently to capture images and, in turn, allow HDR displays to use those bits more effectively. However, when capturing an image it is important that the camera’s white point and 18% gray level are set up correctly on a waveform monitor to ensure correct processing of the signal through the chain.
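
As a rough guide to those waveform targets for a plain BT.709 gamma camera setting (ignoring knee, log or HLG curves, which place these references at different levels per their own specifications), an 18% gray card sits near 41% signal and a 90%-reflectance white card near 95%. A small Python check:

    def bt709_oetf(L):
        """BT.709 OETF: linear segment near black, 0.45 power law above it."""
        return 4.5 * L if L < 0.018 else 1.099 * L ** 0.45 - 0.099

    for name, L in (("18% gray card", 0.18), ("90% white card", 0.90)):
        print(f"{name}: {bt709_oetf(L) * 100:.0f}% on the waveform")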

A waveform monitor was used to assist in capturing the camera RAW image for a 100-nit SDR display, as seen below. The image on the left was processed to approximate what the camera sees, except that the highlights in the sky are lost. The image on the right shows how a standard BT.709 gamma camera signal would appear on a calibrated BT.1886 SDR reference monitor. Note that in the right image the tree and bicycle handle appear darker, with more contrast than the actual scene at the camera, due to the stretching of the blacks, even though the sky is still blown out and limited to 100 nits.

Figure 2. Comparison of a scene using an HDR monitor and a reference BT.1886 monitor. Note that these images are simulated to show the differences.

In order to see the scene’s highlights in the sky above 100 nits, it would be necessary to deliver the content in HDR while maintaining roughly the same average picture level as the SDR image on a BT.1886 monitor. However, should the HDR gray scale below 100 nits match the camera scene, or should the blacks be stretched to look like the SDR BT.1886 display for compatibility?

Once the decision between a camera-side scene match and a compatible SDR BT.1886 display match is made, the next step is to determine the dynamic range of the target HDR monitor, as well as the level of reference white for the SDR image. These factors all need to be considered when calculating a table of conversion values or creating an HDR conversion LUT. A deeper look at this process is the topic for another article.
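
As a taste of that process, here is a minimal Python sketch of one such conversion table, assuming the display-match choice with SDR reference white at 100 nits, a simplified BT.1886 decode (pure 2.4 power law, zero black) and ST 2084 PQ output; a production LUT would also have to handle black level, color volume and any deliberate highlight expansion.

    import numpy as np

    # ST 2084 (PQ) constants.
    M1, M2 = 2610 / 16384, 2523 / 4096 * 128
    C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

    def pq_inverse_eotf(nits):
        """Luminance in nits -> normalized PQ signal (ST 2084 inverse EOTF)."""
        y = (np.asarray(nits, dtype=float) / 10000.0) ** M1
        return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

    sdr_signal = np.linspace(0.0, 1.0, 1024)   # normalized 10-bit SDR code values
    nits = 100.0 * sdr_signal ** 2.4           # simplified BT.1886 decode to light
    pq_lut = pq_inverse_eotf(nits)             # re-encode as PQ for the HDR display

    print(f"SDR 100% -> PQ {pq_lut[-1]:.3f} (about 100 nits), SDR 50% -> PQ {pq_lut[511]:.3f}")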

All it takes is one look at a true HDR video on an HDR-capable screen, and consumers are convinced it’s the way to go. But having that impact requires a good understanding of gamma and its role in capturing and monitoring HDR content. During the transition period from SDR to HDR, a waveform monitor will play an important role to correctly capture camera RAW footage and to correctly balance images for both SDR and HDR.

Educated in England where he received an Honors degree in Communications Engineering from the University of Kent, Mike Waidson started his career with a consumer television manufacturer as a research engineer in the digital video department, before moving into the broadcast industry. Mike has over 30 years of experience within the broadcast industry working for various video manufacturers. At Tektronix as an application engineer within the Video Business Division, Mike provides technical support on video measurement products.
