Full-Res Display: HD's Dirty Little Secret
On a recent visit to a post-production company in New Zealand, my first-ever trip south of the equator, I was struck by the sight of several computer monitors with noticeable green "splotches" on the upper part of their displays. My hosts explained that these monitors had been purchased from countries above the equator and were awaiting something called the "southern hemisphere modification."
It turns out that the Earth's magnetic field is oriented in quite a different direction down there compared to Europe or North America. The difference is enough to seriously disturb the way the electron beam lands on a computer-monitor tube, resulting in what engineers term color "purity" errors.
By now, I'm sure, readers in Australia, New Zealand, South America, South Africa, and many other countries have thrown down this magazine in disgust at my ignorance. Of course the Earth's field is different in those places, and of course it affects a conventional monitor's tube. I'm sorry; it just never occurred to me. It is, however, a useful lead-in to an increasingly important topic now that the film and broadcast worlds are inching toward widespread transmission and reception of film-resolution and high-definition (HD) images. That topic is the problem of HD image display.
Time For a Change
As more and more material is mastered in an HD format (usually taken to mean a picture of 1920 pixels horizontally by 1080 lines vertically), the limitations of existing display technology become ever more obvious, even in demanding professional post or broadcast installations. The dirty little secret of HD is that very few people have ever seen a full-resolution HD picture: the $30,000 to $40,000-plus broadcast monitors currently used every day to critically view finished HD product can barely display half the available resolution of a 1920x1080 HD picture.
This is primarily the result of using a technology that originates in the nineteenth century: the cathode ray tube, or CRT for short. From displaying monochrome images of only a few hundred lines of resolution in the early days of television, the CRT has stayed with us into the color era and beyond. The feat of getting an electron beam, carrying no color information, to display three different colors was achieved by placing a fine, perforated metal mask between the beam and an equally fine grid of red, green, and blue phosphors on the tube faceplate. This "shadow-mask" system has been the way we have viewed color television images for over 50 years.
Herein lies one of the problems for HD. As the shadow-mask hole pattern becomes finer and finer to accommodate HD and higher film resolutions, the structural integrity of the mask begins to suffer; distortion due to heat and mechanical stress gives rise to shifts in color and purity errors, as the electron beam spills over to phosphors with the "wrong" colors. This problem only gets worse with the larger picture (and hence CRT) sizes that HD encourages; as the shadow-mask becomes larger, it is increasingly difficult to hold it rigidly in place.
Another difficulty was highlighted by my New Zealand trip: a finer shadow-mask means that the electron beam needs to be deflected less for a color change to happen. A change in the Earth's magnetic field is enough to do this, as is the magnetic field from a badly placed loudspeaker, as many Avid editors have discovered after placing their speakers next to their edit viewing monitors. The shadow-mask itself can become magnetized, causing a permanent splotch of color (a purity error) that can only be cured by demagnetizing, or "degaussing," the tube. Every modern color CRT monitor has a built-in degaussing circuit that activates briefly at power-up to avoid exactly this permanent magnetization problem.
This susceptibility to environmental conditions is just one of many drawbacks of conventional CRT monitoring, made more obvious by the more stringent requirements of HD and film. If we are to deliver full-resolution HD pictures to the professional world, let alone the consumer, then the venerable CRT is not the technology to do it. Among other things, the sheer weight of all that glass and metal becomes completely unworkable; any engineer who has helped lift one of the 32-inch CRT broadcast monitors, which weigh in at a meaty 300 lbs., will agree.
A maturing technology that holds great promise as a solution to the inherent CRT problems is the liquid crystal display, or LCD. After its introduction in the 1970s, it took several years to reach the usable high-resolution color stage, appearing first on notebook PCs. Initially, the market for these displays was concentrated in the 12- to 15-inch size range to suit the laptop industry. But when affordable, lightweight, and thin desktop-monitor-sized LCDs appeared in the mid-1990s, they started to become interesting as film and television display devices.
About two years ago, Apple Computer introduced a large (23-inch) desktop LCD panel with a pixel count that met or exceeded the HDTV standards, and with colorimetry that closely matched television requirements. For obscure reasons, the computer industry has standardized on a display aspect ratio of 16:10, rather than the 16:9 aspect of the agreed HDTV formats. This means that the Apple displays are 1920 pixels across, an exact match for the majority of HD pictures, by 1200 pixels vertically, leaving 120 lines to be blanked, split between the top and bottom of the picture. More importantly, the display interface moved into the digital domain with a DVI connector, which at a stroke removed the issue of sending analog red, green, and blue signals to the monitor. Longstanding problems such as high-frequency (and hence resolution) loss in the cable, and unwanted signal reflections or "ghosting" due to poor electronic termination of the signal at the monitor, could at last be completely bypassed.
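To spell out that arithmetic, here is a minimal sketch in Python. The variable names are purely illustrative and describe nothing about the actual firmware of the Apple display or any processor driving it; it simply shows how a 1080-line raster sits centered on a 1200-line panel.

# Sketch only: centering a 1920x1080 HD raster on a 1920x1200 (16:10) panel.
panel_width, panel_height = 1920, 1200   # Apple Cinema Display pixel grid
image_width, image_height = 1920, 1080   # HD raster

blank_total = panel_height - image_height   # 120 unused lines in total
blank_top = blank_total // 2                # 60 lines blanked above the picture
blank_bottom = blank_total - blank_top      # 60 lines blanked below it

print(f"{blank_total} blanked lines: {blank_top} top, {blank_bottom} bottom")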
The combination of high resolution with manageable size and weight is an attractive proposition for an HD monitor, and it has prompted several manufacturers to design devices that display digital HD signals on LCD panels. Almost all of these designs, however, are based on chips intended for home theatre applications that downsample the HD picture to a lower resolution. Among other problems, these low-cost chips add interlace to progressive HD images, producing an annoying (and incorrect!) "twitter" between lines on the display.
My colleague Martin Euredjian, at eCinema Systems, has designed circuitry from the ground up, using a customized chip, to produce a reference-grade native HD image that displays every pixel of the source as a single pixel on the panel. The resulting combination of the eCinema EDP100 digital display processor and an LCD monitor yields a full-resolution HD display for under $10,000. I have been present when Martin has demonstrated the new display to an audience of engineers and end-users, comparing it against a traditional CRT HD broadcast monitor. It is not unusual for members of the audience to ask if the broadcast monitor is broken, because of the astonishing difference in resolution between the CRT and the LCD.
Purchase costs always need to be considered together with cost-of-ownership issues. A very high-resolution color CRT display needs continuous fine-tuning by technical staff to offset the effects of aging and drift if it is to remain at peak performance. Returning once again to those $30,000 to $40,000 broadcast monitors, the combined effects of heat, aging, and mechanical stress mean that the CRT usually needs to be replaced after about three years, and in some cases in as little as a year, at a cost well in excess of $10,000, more than the entire cost of the EDP and LCD combination.
The "Right" Colors
We now turn to the matter of color accuracy and repeatability of the monitor, which is only partially dependent on how the electron beam lands on the color phosphors. Here, issues of linearity, thermal drift, and aging come into play. Even when new, it is difficult to get CRTs to "track" colors, in other words, to ensure that each color gives the same light output for the same input signal level. The net effect of this mismatch is that a perfectly white output takes on a slight color cast as it fades through grey to black, adding to the difficulty of making color adjustments for such critical applications as digital intermediate color correction.
The result of this lack of repeatability can be seen in almost every telecine suite in the world, an area where absolute and relative color matters a great deal, and where both colorists and clients are attuned to the slightest changes. Here it is still almost unheard of to have two similar broadcast color monitors within view of the clients. Why? Because they never match. Another of television's dirty little secrets is that even very expensive monitors that have just been painstakingly aligned to the utmost degree will start to drift apart within a day or two, with differences clearly visible to this critical audience. Better to standardize on a single monitor than to be drawn into the discussion of which one shows the "right" color.
The LCD panel transmits light to the viewer in a different way. In contrast to a CRT, where the electron beam excites the tube phosphors to emit light, LCD monitors "shutter" light from a constant backlight source, which is much easier to mass-produce at a given white-point colorimetry. So the problem of color repeatability effectively disappears at the white point, and, provided the LCD driver circuitry is linear, color tracking is equally repeatable.
Current LCD technology has both advantages and disadvantages compared to CRTs. At present, LCDs cannot achieve extremely dark blacks, so the contrast ratio (defined as peak white light output divided by the light output at the blackest black) is a relatively modest ~500:1, compared to CRTs, which can approach several thousand to one. On program material that has strong blacks, LCDs also tend to compress the black detail, making currently available panels not the best choice for highly critical applications such as telecine color correction.
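To put those ratios in concrete terms, here is a short worked example. It assumes the peak-white figures quoted later in this article (roughly 55 fl for the LCD and 30 fl for a broadcast CRT) and a purely illustrative 3000:1 figure to stand in for "several thousand to one."

# Illustrative arithmetic only: contrast ratio = peak white / light output at black.
def black_level(peak_white_fl, contrast_ratio):
    """Return the light output at black, in foot-lamberts."""
    return peak_white_fl / contrast_ratio

lcd_black = black_level(55.0, 500)    # ~0.11 fl for a 500:1 LCD panel
crt_black = black_level(30.0, 3000)   # ~0.01 fl for a several-thousand-to-one CRT
print(f"LCD black ~{lcd_black:.3f} fl, CRT black ~{crt_black:.3f} fl")

Even with a dimmer peak white, the CRT's black sits roughly ten times deeper, which is why shadow detail is the LCD's weak spot.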
Additionally, the color of the LCD backlighting cannot easily be changed from its nominal value (usually 6500 Kelvin, roughly the light from the north sky [in the northern hemisphere!]). This is a disadvantage for displaying film material, which uses a white point of about 5500 Kelvin, corresponding to the warmer, more reddish color of the film projector lamp. This last problem has been solved in the EDP100 by incorporating color look-up tables (LUTs), which can be programmed to mimic different display characteristics. The EDP also has a "black stretch" setting to make the darkest portions of a displayed scene visible for QC purposes.
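The article does not detail how the EDP100's LUTs are structured, so the following is only a hedged sketch of the general idea: a per-channel look-up table remaps each input code value to a new output value, which is one way a display processor can approximate a warmer white point. The gain figures here are invented for illustration and are not calibration data.

# Sketch of a per-channel 8-bit look-up table (LUT); the gains below are made up.
def build_lut(gain):
    """One 256-entry LUT: scale each code value and clamp to the 0-255 range."""
    return [min(255, round(value * gain)) for value in range(256)]

red_lut = build_lut(1.00)     # leave red untouched
green_lut = build_lut(0.96)   # trim green slightly
blue_lut = build_lut(0.88)    # pull blue down to warm the white point

def apply_luts(r, g, b):
    """Remap one pixel through the three channel LUTs."""
    return red_lut[r], green_lut[g], blue_lut[b]

print(apply_luts(255, 255, 255))  # peak white is remapped toward a warmer balance

A real display processor would typically work at higher bit depth, but the remapping principle is the same.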
LCD pixels also take a relatively long time to switch on or off, resulting in a "lag" that can be seen on very fast motion. Curiously, this disadvantage helps when displaying 24fps film material at the preferred HD standard of 24-frame progressive scan, since the otherwise objectionable flicker is "smoothed" by the display. On the other hand, there are severe limitations in the brightness that can be achieved on a CRT, again because of the shadow-mask technology. Most broadcast CRT monitors are hard-pressed to achieve a peak white brightness of 30 foot-lamberts (fl) without severe color purity errors, and higher-resolution computer displays are even dimmer, nearer 20 fl, with both figures falling significantly as the monitor ages. By comparison, a typical Apple Cinema Display has a much higher brightness, on the order of 55 fl.
The Age of LCD
Clearly, current LCD technology has specific applications for the time being, until the next generation of displays, with better black level and shorter "lag," becomes available. Typical applications make use of the exacting resolution of the EDP100, and include film restoration, film-grain management, critical focus setting, and quality control. Here at Rushes, in Los Angeles, we purchased an EDP100/Apple display combination for use as a high-definition and standard-definition machine-room QC monitor, primarily for reasons of resolution, but also because of initial cost and cost-of-ownership issues. The good news for purchasers is that the EDP100 can be used with any compatible DVI display, present or future, including, for example, HD-capable DVI-input projectors.
The future is bright for both LCD technology and the EDP100 pixel-accurate processor. Very large (42-inch and above) LCD panels are already being introduced for the home and semi-professional markets, and next-generation displays promise better black-level control for increased contrast, along with lag reduced to CRT levels. We can expect to see these panels appear in ever more critical applications for the display of both HDTV and film. With lower purchase costs and almost zero maintenance costs, LCDs and similar DVI displays should remove the final cost barrier to working and viewing in HD.
Images from testing done for this article can be seen at www.ecinemasys.com/products/edp100/edp100_lcd_vs_crt.htm.
Mike Orton is a technologist at Rushes, Los Angeles, and yU+co, a VFX company in Hollywood. BBC-trained, he holds an MA in Physics from Oxford University. His consulting company, Weird Science Inc., has specialized in film, data, and digital intermediate work since 1999.