DV101: From WYSIWYG to Waveform Monitors: The Evolving Tools for Digital Exposure, Part I

12/20/2012 12:10 PM Eastern

Back in my day (yes, you can visualize me sitting in a rocking chair with a smoldering pipe in one hand and an afghan on my lap, peering out at you over the tops of half-moon spectacles, if you so wish) the light meter was a cinematographer’s best friend: a primary tool and an invaluable asset. Although there were exceptions—the late Douglas Slocombe, ASC, BSC, could famously call out the exposure based only on looking at the scene—they were extremely rare. Even the most experienced and revered cinematographers used a light meter to evaluate the exposure levels of their scenes.

The standard SMPTE 75 percent color bars should look like this on a waveform.

Enter the digital realm. For a long time (long being relative in the relatively recent digital revolution), as all cameras were shooting in an RGB-encoded format, we had the extraordinary benefit of being able to actually see the final image on the set. Digital became a WYSIWYG (What You See Is What You Get) medium and cinematographers could make instantaneous decisions based not on supposition from reading a light meter but by actually seeing the image, live. In the days of motion picture film, this was unheard of. Sure, we had video taps, but the video tap was really only a representation of the composition of the shot, not the exposure, color, latitude, etc. The cinematographer learned to balance all of these factors in his head through knowledge and experience.

In the world of WYSIWYG video, the light meter started to fade into obscurity—although it is still a very viable tool and many cinematographers still carry one—and many judgments were made based on the image on the on-set monitor.

In today’s production model, raw formats are preferred for scripted narrative, commercials, music videos and most anything outside of reality, news and sports programming. This trend toward raw shooting has pulled us away from WYSIWYG and back toward a characteristic of the world of film: the cinematographer has to know how to interpret the image and extrapolate how it will look after post processing.

Although the cinematographer can see an approximation of the final image on set—via the camera’s RGB-encoded output or a LUT generator—it isn’t the actual recorded image they’re looking at.

This doesn’t mean that the cinematographer is blind and has to make decisions based solely on the approximate picture he/she is seeing. Instead, there is a multitude of tools that enable the cinematographer to make better image-evaluation and exposure judgments on the set. These tools operate in-camera, in-monitor or as third-party add-ons.

Waveform Monitors
One of the most prevalent tools for exposure judgment is the waveform monitor. Waveforms have been around since the early days of analog video and have been used by engineers to monitor the values of an image for more than half a century. They used to be fairly bulky little monitors that you would carry to a set; you’d run your camera feed through it before outputting the feed to the monitor. Today, many monitors and even some cameras have built-in waveform displays.

The waveform is fairly simple to read and understand. There are three primary display modes on most waveforms: luminance, luminance with chrominance, and parade. We’ll start with luminance.

Waveforms are displayed on a grid marked out on the screen. On the Y axis, the primary section of the grid is marked with horizontal divisions from 0 to 100, representing signal intensity from 0 percent (black) to 100 percent (white/peak). These markings are IRE values (a unit named for the Institute of Radio Engineers), which translate to percentages of luminance in the image.
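
To make the scale concrete, here is a minimal sketch in Python (using NumPy) of turning an 8-bit RGB frame into 0-100 luminance percentages. The Rec. 709 luma weights and the luma_ire() name are my own choices for illustration; a real waveform monitor works on the actual video signal, not on a conversion like this.

```python
import numpy as np

def luma_ire(frame_rgb):
    """Approximate per-pixel luminance as a 0-100 percentage.

    Assumes `frame_rgb` is an 8-bit RGB image (H x W x 3) and uses the
    Rec. 709 luma weights. Illustration only, not a signal measurement.
    """
    rgb = frame_rgb.astype(np.float64) / 255.0            # normalize to 0.0-1.0
    luma = 0.2126 * rgb[..., 0] + 0.7152 * rgb[..., 1] + 0.0722 * rgb[..., 2]
    return luma * 100.0                                    # 0 = black, 100 = peak white
```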

Some waveforms have a scale that goes beyond the standard 0-100 to include values below 0 (generally to -40) and above 100 (generally to 120). The values below 0 are for the sync pulses in the blanking interval of an analog signal and are of no importance to us in judging exposure. The area above 100 is the “superwhite” area; any signal in this area generally exhibits complete loss of image detail. Some waveforms also have a marking at 7.5 IRE for “setup,” the black-level pedestal used in analog standard-definition video. You can completely ignore the 7.5 IRE line for HD and cinema signals.

Although waveforms do consider chrominance information, they are primarily for measuring luminance.

The scale of IRE represents a percentage of luminance of the image. When you see areas of the waveform above 100 IRE, you know those areas are “clipping.” They have become pure white and cannot be brought down. Likewise, areas on the waveform below 0 IRE are pure black and without detail.
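
As a rough illustration of spotting clipped areas, a sketch like this (assuming a NumPy array of 0-100 luminance values, such as the output of the luma_ire() sketch above) reports how much of the frame is pinned at either end of the scale; the thresholds and the clipping_report() name are mine, not anything built into a waveform.

```python
import numpy as np

def clipping_report(ire, high=100.0, low=0.0):
    """Report what fraction of the frame is clipped at either end.

    `ire` is a 2-D array of luminance values on the 0-100 scale. Values
    at or above `high` are blown-out whites; values at or below `low`
    are crushed blacks.
    """
    total = ire.size
    blown = np.count_nonzero(ire >= high) / total
    crushed = np.count_nonzero(ire <= low) / total
    return {"blown_whites_pct": blown * 100.0, "crushed_blacks_pct": crushed * 100.0}
```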

The whites in a well-exposed image will generally fall between 80 and 100 IRE. If you’re looking at a scene in a restaurant, the white tablecloths should generally read about 80-85 IRE so that they maintain detail but still look white.

Medium gray should be set between 45 and 55 IRE. Caucasian faces generally fall between 60 and 70 IRE.
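
If you want to capture those rules of thumb in code, a small lookup like the one below will do; the ranges come straight from the numbers above, while the TARGETS table and the check_region() helper are hypothetical names of my own.

```python
# Rule-of-thumb IRE ranges from the text; guidelines, not broadcast specs.
TARGETS = {
    "white with detail": (80, 100),
    "medium gray": (45, 55),
    "skin tone": (60, 70),
}

def check_region(ire_region, subject):
    """Compare a region's average level to the rule-of-thumb range above.

    `ire_region` is an array of 0-100 luminance values covering, say,
    a tablecloth, a gray card or a face.
    """
    low, high = TARGETS[subject]
    mean = float(ire_region.mean())
    if mean < low:
        return f"{subject}: {mean:.1f} IRE, underexposed"
    if mean > high:
        return f"{subject}: {mean:.1f} IRE, too hot"
    return f"{subject}: {mean:.1f} IRE, within range"
```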

A waveform can display luminance alone or luminance and chrominance combined. The combined view can get kind of messy and hard to read, so if your waveform has adjustments, it’s better to set it to IRE or luma only and ignore the chrominance (unless you’re in “parade” mode, which I’ll discuss below).

In addition to the seven bars at 75 percent intensity (white plus six colors), standard SMPTE color bars also have a pure white block, which should hit 100 on the waveform, and pure black, which should hit 0.

The X axis of the waveform (the horizontal scale) represents the image from left to right. However, since the Y axis (vertical) represents percentage of luminance, it cannot also represent the vertical dimension of the image. Instead, every pixel in a given vertical column of the image is plotted at the same horizontal position on the waveform. If we’re photographing a white piece of paper in the upper half of the frame and a black piece of paper in the lower half, each point along the waveform will register both white and black. That can make reading the waveform a little tricky in real-world situations. Likewise, if we have a grayscale chart positioned vertically in the frame, each horizontal position on the waveform will show a stack of values representing the steps of that scale.
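
In code terms, a luminance waveform is essentially one histogram per image column. Here is a minimal sketch, again assuming a NumPy array of 0-100 luminance values (for example, from the earlier luma_ire() sketch); luminance_waveform() is a name of my own.

```python
import numpy as np

def luminance_waveform(ire, bins=101):
    """Build a waveform-style image from a frame of 0-100 luminance values.

    Each column of the source frame becomes a column of the waveform:
    every pixel in that column is binned by its level, so vertical
    position in the image is discarded, exactly as described above.
    Returns a (bins x W) array of counts; row 0 is 0 IRE, the last row 100.
    """
    w = ire.shape[1]
    levels = np.clip(np.round(ire), 0, bins - 1).astype(np.int32)
    wfm = np.zeros((bins, w), dtype=np.int32)
    for x in range(w):                          # one histogram per image column
        wfm[:, x] = np.bincount(levels[:, x], minlength=bins)
    return wfm
```

To display it, you would flip the result vertically so that 0 IRE sits at the bottom, and map the counts to brightness.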

You can use the waveform monitor like a light meter. If you put a gray card in your scene where the talent will be, fill the screen with the gray card and then adjust your aperture until the waveform reads between 45 and 55 IRE, you will get a proper exposure for that area.
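
Here is how that gray-card check might look as a sketch; the 45-55 IRE window comes from the text, while meter_gray_card() and its messages are purely illustrative.

```python
import numpy as np

def meter_gray_card(ire, target=(45.0, 55.0)):
    """Gray-card metering check, assuming the card fills the frame.

    `ire` is a 2-D array of 0-100 luminance values. Compares the frame
    average to the mid-gray window suggested in the text and says which
    way to adjust the aperture.
    """
    mean = float(np.mean(ire))
    low, high = target
    if mean < low:
        return f"average {mean:.1f} IRE: under target, open up"
    if mean > high:
        return f"average {mean:.1f} IRE: over target, stop down"
    return f"average {mean:.1f} IRE: mid-gray is on target"
```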

The waveform is a solid tool for envisioning your overall exposure range. In a very low-light scene, you’ll notice that the signal is crowded to the bottom of the waveform. In this area, you’re likely to be picking up a lot of noise. It’s generally better to open up, expose a little higher on the waveform scale, and reduce the brightness in post if necessary. The waveform is a tool that helps you keep a more solid signal-to-noise ratio in your image.
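
A crude way to quantify that crowding is to measure how much of the frame sits near the bottom of the scale. The 10 IRE cutoff below is an arbitrary assumption for the sake of the sketch, not a standard.

```python
import numpy as np

def shadow_crowding(ire, floor=10.0):
    """Percentage of pixels sitting in the noisy bottom of the scale.

    `ire` is an array of 0-100 luminance values; `floor` is an arbitrary
    cutoff. A large percentage down there suggests opening up and bringing
    levels back down in post instead.
    """
    return np.count_nonzero(ire < floor) / ire.size * 100.0
```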

The third mode of a waveform is the parade mode. Here we separate red, green and blue into their individual components and can compare the levels of each channel side by side. It’s a quick way to see, for example, whether you’re pushing your skin tones too hot: is the red channel running hotter than green and blue? Keep in mind that blue is generally the noisiest channel in an RGB signal, so if you are low in blue information, you may want to adjust your lighting, or even add a blue filter to the lens and white balance again, to make sure the sensor is getting a healthy blue signal.
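
For a parade-style comparison in the same sketch framework (an 8-bit RGB frame as a NumPy array), something like this reports each channel’s average and near-peak level; the channel_levels() name and the 99th-percentile choice are my own assumptions.

```python
import numpy as np

def channel_levels(frame_rgb):
    """Per-channel levels for a quick parade-style comparison.

    Assumes an 8-bit RGB frame (H x W x 3). Reports each channel's mean
    and 99th-percentile level on a 0-100 scale, e.g. to see whether red
    is running hotter than green and blue, or whether blue is starved.
    """
    rgb = frame_rgb.astype(np.float64) / 255.0 * 100.0
    report = {}
    for i, name in enumerate(("red", "green", "blue")):
        chan = rgb[..., i]
        report[name] = {"mean": float(chan.mean()), "p99": float(np.percentile(chan, 99))}
    return report
```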

Many monitors now have waveforms built in: Panasonic, ikan, Marshall and SmallHD all offer field monitors with built-in waveforms, which make them very handy tools for production in the field.