In Europe, virtual sets have been used in production for three or four years. Recently, however, writers and producers in the United States have started to develop programs using the technology. ABC’s weekly children’s show Disney’s One Saturday Morning (currently airing Saturdays in the 7:30-9:30 a.m. slot) is a prime example of what is possible when creative and technical minds meet to push the envelope.
Peter Hastings is the creator and executive producer for One Saturday Morning. It was Peter who envisioned using a virtual set, although he readily admits at the time he had little idea of what was possible. Peter hired Prudence Fenton, a veteran TV producer with computer graphics experience. Together they surveyed the virtual-set demos at NAB 1997, and left the show backing the technology co-developed by Accom and ELSET.
Rutherford Bench Productions, Burbank, which does a lot of work with Disney, hired Santa Monica-based POP to create and produce the virtual set environment for One Saturday Morning. From there, what began as a drawing resembling New York’s Grand Central Station with a roller coaster traveling through the atrium evolved into 20 episodes of live action composited with a colorfully animated virtual building, where host Manny the Uncanny resides, manufacturing the popular cartoons Recess, Doug, and Pepper Ann.
The project was challenging on a number of levels. For example, this was the most aggressive use of virtual set technology to date, with three cameras, a pedestal, a dolly, a crane, and three Silicon Graphics Onyx Infinite Reality workstations generating the virtual sets; all the synchronization was to be handled by ELSET. Meanwhile, a personal computer collected the real-time measurements from the camera positions and performed all the calibration required to be sure the props and talent were solidly in the virtual environment when the cameras moved through the stage. The stage, over 4,000 square feet, held up to 30 people in the virtual set at any given moment.
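The core of that calibration step is registering the virtual set to the live camera: given the tracked position, pan, and tilt of a studio camera, compute where any virtual object lands in its image. The sketch below is a minimal pinhole-camera illustration of that idea only, not the actual ELSET pipeline; the focal length and image-center values are made-up examples, and a real system also models lens distortion, zoom, and nodal-point offsets.

```python
import math

def project(point_world, cam_pos, pan_deg, tilt_deg, focal_px, center_px):
    """Where a virtual point lands in the live camera's frame, given the
    tracked camera position and pan/tilt. Minimal pinhole model for
    illustration; parameters are hypothetical, not production values."""
    # Translate the point into the camera's frame
    x, y, z = (p - c for p, c in zip(point_world, cam_pos))
    pan, tilt = math.radians(pan_deg), math.radians(tilt_deg)
    # Pan: rotate about the vertical (y) axis
    x, z = (x * math.cos(pan) - z * math.sin(pan),
            x * math.sin(pan) + z * math.cos(pan))
    # Tilt: rotate about the horizontal (x) axis
    y, z = (y * math.cos(tilt) - z * math.sin(tilt),
            y * math.sin(tilt) + z * math.cos(tilt))
    # Perspective divide onto the image plane
    return (focal_px * x / z + center_px[0],
            focal_px * y / z + center_px[1])

# A virtual prop 10 m straight ahead of an unrotated camera at the origin
# lands at the image center:
print(project((0, 0, 10), (0, 0, 0), 0, 0, 1000, (360, 243)))  # (360.0, 243.0)
```

When the tracked camera moves, re-running this projection every field is what keeps props and talent "solidly" anchored in the virtual environment.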
The design process began by modeling the virtual set on an SGI computer, using Alias|Wavefront Power Animator, based upon a drawing by Disney artists. During the modeling process, textures were applied to the surfaces using ambient lighting. The lighting director and animator agreed on a lighting strategy that called for morning sunlight to shine through the east side of the building. This would allow for strong back-lighting in the cyclorama, and the shadows of the virtual set, props, and talent would be consistent. Lights were added to the model and rendered into the textures for final approval.
At this point the model was far too large to run in real time. For example, it took five minutes to render a single frame of NTSC video using the fastest available computers; rendering would need to be 18,000 times faster to draw the set at 60 fps (fields per second). To accomplish this, the model was divided into multiple rooms for different scenes or camera positions, and the geometry was converted into polygons and reduced to the point where the Onyx Infinite Reality was capable of drawing the set in real time, at 60 fps.
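The 18,000x figure follows directly from the numbers above. A quick back-of-envelope check (all figures from the article, nothing from the actual pipeline):

```python
# Five minutes per offline-rendered NTSC frame vs. a 60-fields-per-second
# real-time budget.
offline_seconds_per_frame = 5 * 60          # 300 seconds per frame
fields_per_second = 60                      # real-time NTSC field rate
realtime_budget = 1 / fields_per_second     # ~16.7 ms available per field

speedup_needed = offline_seconds_per_frame / realtime_budget
print(f"{speedup_needed:,.0f}x faster")     # 18,000x faster
```

Hence the aggressive polygon reduction and the split into per-room models, so only one room's geometry has to be drawn at a time.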
Accom ELSET manages the hardware resources very well. However, as with any tool, the designer must make the tradeoff decisions as to the best use of the resources. Accom MAPSET is the tool used to determine the size and placement of textures in the set. The process begins with storyboards, which determine camera location and approximate distance to virtual objects.
The basic principle? The closer the object gets to the camera, the larger the texture map needs to be. The current limit of the hardware is 64 Megabytes of real-time texture memory, which is one of the challenges in designing interesting and esthetically pleasing virtual sets.
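That principle can be sketched numerically: to avoid visible blur, an object's texture should offer roughly one texel per screen pixel at its closest storyboarded distance. The function below is an illustrative estimate only, not MAPSET's actual method; the 45-degree field of view and 720-pixel NTSC width are assumed example values.

```python
import math

def texels_needed(object_width_m, distance_m, hfov_deg=45.0, screen_px=720):
    """Rough texture width (in texels) for ~1 texel per screen pixel.
    hfov_deg and screen_px are illustrative assumptions, not production
    figures. Rounds up to a power of two, as texture hardware expects."""
    # Width of the world visible at this distance
    view_width_m = 2 * distance_m * math.tan(math.radians(hfov_deg) / 2)
    # How many screen pixels the object spans
    pixels_on_screen = screen_px * object_width_m / view_width_m
    return 2 ** math.ceil(math.log2(max(pixels_on_screen, 1)))

# The closer a 3 m-wide object gets, the larger its map must be:
for d in (20.0, 5.0, 1.0):
    print(f"{d:>5} m -> {texels_needed(3.0, d)} texels wide")
```

Summing such estimates across every object in a room is what has to fit inside the 64 MB real-time texture budget, which is why distant set pieces get small maps.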
After the virtual set was approved, we were in production. Producer Douglas Rask, already experienced with real-time graphics production, assembled the team for the three-week shoot. Load in, lighting, and calibration of camera positions took three days, with a fixed dolly track position and multiple crane and pedestal positions. Calibration of lenses and minor adjustments between camera setups were required. Green screen was chosen over blue screen to minimize costuming problems: Because blue is a more popular clothing color, particularly in blue jeans, which many of the children would be wearing, we needed to stay away from a matte color that would interact with it.
A three-wall cyclorama was built to get the coverage director Ron Andreassen was looking for. It was decided during preproduction that dynamic camera moves would be important for a few reasons. First and foremost, it is a children’s program and the action demanded camera movement; secondly, if we were going to build a 3-D virtual set, we needed to move the camera through the set, otherwise we could simply have used a 2-D background matte painting. During production Accom modified the crane to work on the dolly track. This gave us the ability to travel from one room to another with sweeping camera moves covering the action: children on bicycles, bumper cars, skateboards, and more.
It would not have been feasible to build this environment practically, but more importantly, it would not have been possible to levitate a 30-foot-diameter cereal bowl 15 feet off the ground without wires, or fly in a 1,200 square-foot video monitor made of stone. The prime time special that aired September 5 used many real-time special effects, and the program was recorded only a few days prior to air, unheard of with traditional computer graphics. On a daily basis we recorded over 100 minutes of animation composited with live action directly to Digital Betacam.
Accom ELSET performed even better than anticipated because of the support provided by Michael Bauer and Chris Merrick; we were able to deliver a significantly larger set with more dynamic camera moves than we planned for in preproduction. Michael regularly created animation for the director, minutes before shooting, and Chris’s quick adaptation of the crane for the dolly track was impressive.
The software has evolved to the point where 3-D animators can create environments using off-the-shelf software like Alias|Wavefront, Softimage, or 3D Studio MAX, and quickly import their models into ELSET. Critical to the success of their design is a strong knowledge of the hardware capabilities and good customer support. [Real-time character generation is produced by San Francisco-based Protozoa.]
Other features of the ELSET software include the ability to change virtual objects between foreground and background, and to remotely control the lighting environment using the Ultimatte. Both of these are necessary to create a real sense of 3-D space. ELSET also has the ability to control VTRs or digital disk recorders (DDRs) for playing video or synchronizing sound effects from hard disk. One of the most recent features is the ability to record the camera data for post rendering in Alias|Wavefront, allowing you to render larger, more detailed sets, useful for establishing or beauty shots, and opening the door to future film applications.
As with most software, there is always a version in beta release, and we look forward to new features on the horizon, like handheld cameras, depth of field, and retro-reflective keying systems. However, the existing technology has proven itself in production, and POP has other virtual set projects in development.
Author Paul Lacombe is a member of POP’s Virtual Division. He serves as CG supervisor on Disney’s One Saturday Morning. He can be reached via e-mail at: firstname.lastname@example.org. Santa Monica-based Pacific Ocean Post consists of five operating divisions (POP Television, POP Sound, POP Film, POP Animation, and the Cinram-POP-DVD Center). The company can be reached at (310) 393-4699, or via its Web site at: www.popstudios.com.