Has motion-capture technology matured enough to assume a prominent place in the daily animator's tool-box? Chris Walker chuckles at the question. After all, Walker's company, Modern Cartoons of Oxnard, California, recently proved the technology can produce animated television programming daily and in realtime. Modern Cartoons made the case with 26 episodes of the children's show Jay Jay The Jet Plane, executive produced by Walker.
Modern Cartoons creates each Jay Jay episode in realtime using Vermont-based Ascension Technologies' magnetic body-capture system and a proprietary facial-capture system that Walker and his colleagues developed in-house. After the show has been directed and "shot" in sequential order, and approved by producers, the show's animators tweak the episodes and render them over several days. This successful creation of real-time animation may seem a huge victory for Walker and a handful of others who have long believed motion capture represents animation's future. But how far into that future are we?
That depends on your frame of reference, Walker says.
"I guess being able to produce 26 half-hours in realtime, incorporating both body and facial motion capture for each episode, can be called a level of maturity for the technology," he says. "Then again, we've been working on how to do it for almost 10 years. I firmly believe in motion capture, but historically, there have been failures, and, therefore, many producers are hesitant to go for it. They want to rely on what they know works: key-frame movement. Look at all the CG movies out there. How many are even trying motion capture?"
In fact, one animated feature currently in production, Sinbad: Beyond the Veil of Mists, is relying on captured motion for its human characters. (Venice, California-based House of Moves, Hollywood's sole service facility dedicated exclusively to motion capture, is handling the project.) Still, Walker has a point. Although motion capture has helped particular sequences of particular projects in recent years (notably, the complicated stunt work on Titanic), it is hardly commonplace for feature films, let alone TV.
Though a niche technology now, motion capture is growing daily both in America and abroad. Witness the stateside appearance of European-animated programs such as Donkey Kong Country. Some American shops have also dived into motion-capture production head-first and are moving the technology's role in episodic television ahead with surprising speed.
The Magnetic Pull

Magnetic systems have been the tools of choice for television motion capture because of their real-time capabilities. Several facilities are currently using Ascension's Motion Star product, which is based on the DC approach (designed to lessen magnetic reverb problems caused by peripheral metal located near the capture stage). The other leading magnetic system is the more traditional, AC-based Star Trak product, manufactured by Polhemus, also of Vermont. The use of both systems has spawned a slew of virtual characters on television networks around the world.
Notably, Ascension's system has been bundled into the popular turn-key system created by Medialab of Paris and Los Angeles. Medialab's Clovis software, combined with Ascension's hardware, is responsible for some 50 characters animated in or near realtime for TV broadcasts. These include Donkey Kong and company, Tilde and Dash ("hosts" for programs on San Francisco-based cable network ZDTV), and Bill (the co-host of a live French game show called BigDil). Medialab also serves as a service bureau for clients who have created their own characters.
"We have lots of clients using a system that we created for our own productions," explains Francois Lelievre, Medialab's VP for TV performance animation. "By creating an entire system, the Ascension motion suit and receivers along with virtual puppeteering devices like joysticks, pedals, and sliders, we have a tool to instantly animate virtual characters with realistic motion. Daily usage is where this technology is going. The future will be live, interactive characters."
However, the need to air such characters in realtime is currently limited. Therefore, what most animation shops want from motion capture is the same thing they always want from technology: a way to speed up production and reduce costs. To meet this goal, the shops and production companies are incorporating motion capture, key frame, and compositing into unique and often ingenious workflows.
Tethered and wireless magnetic motion-capture technologies have come down in price and advanced to the point where important effects shops are installing permanent motion-capture stages. Among them: Modern Cartoons; Foundation Imaging of Valencia, California; and Pacific Title Mirage of Hollywood (see accompanying story), all of which use Ascension's system. Threshold Entertainment, Santa Monica, also recently installed Polhemus' system to create martial-arts effects for its syndicated series Mortal Kombat Conquest.
When Foundation Imaging moved into larger quarters early in 1998, it built a permanent 40 x 40 motion-capture stage. The company now uses the Ascension system extensively for sequences of the Saban-produced show Mystic Knights of Tir Na Nog, which airs on the Fox Kids network. Foundation also occasionally uses the system for creature work on UPN's Star Trek: Voyager. Paul Bryant, the company's co-founder, says Foundation Imaging built the stage in order to permanently unite motion-capture technology with its main production pipeline. He says the system has since become a "routine and easy-to-use" part of the company's daily operations.
"To be honest, it takes longer for an actor to put on the suit than for us to begin reading data," says Bryant. "We felt we had to have this technology and found a system that allowed us to quickly capture clean data. Therefore, it made more sense to integrate it into our own pipeline and develop expertise with it rather than try to bring it in for particular projects."
The Optical Advance

Despite TV's reliance on magnetic systems, several Hollywood facilities have recently installed permanent optical systems, even though camera-based optical technology is more expensive and less useful for real-time applications.
Last year, Netter Digital Entertainment, North Hollywood, installed a system manufactured by Motion Analysis, Santa Rosa, in a refurbished 20 x 30 room to churn out episodes of the new syndicated 3D cartoon Voltron: The Third Dimension. The cartoon is the first 3D CG show based primarily on captured movement rather than key-framing. What is particularly significant about Voltron is the speed with which Netter installed the motion-capture system, solved compatibility issues with its NT-based LightWave animation software, and established a production pipeline.
"We got this project so quickly and had such tight deadlines, we could not have made this show without motion capture," says Larry Stanton, Netter's VP of technology. "But we wanted clean, accurate data in great detail, and that meant an optical system was more suited to our needs. We checked out all of them, and the Motion Analysis system offered the most detailed data. With their help, we were able to install the system and create a production model in just six weeks. Now, we basically shoot the show like it was live action. We can capture movement from 75 to 100 scenes a day, and we are turning out an episode every eight to 10 days."
Netter is now incorporating the system into the production of the new syndicated sci-fi show Crusade, the spin-off from Babylon 5. While Netter did only limited animated creature work for B5, the company will use motion capture routinely for Crusade, specifically because of the new stage, says Stanton.
Major effects facilities have also taken on expensive motion-capture systems. ILM and Digital Domain, as well as House of Moves, have created stages built around the 24-camera optical system manufactured by Vicon Motion Systems of Tustin, California. That system, the Vicon8, is costly. But, according to users, it has eased the occlusion issue, one of the long-held concerns about optical technology, whereby time-consuming "compensation" is required when some motion data is not captured because a particular sensor falls out of view of the surrounding cameras.
"Having 24 cameras is probably the first really significant change in optical hardware in the last few years," says Tom Tolles, president of House of Moves. "That means companies like ours, who handle really big volumes of data with multiple captures from multiple performers simultaneously, can have more high-fidelity data than before."
Even if occlusion becomes less of an issue, optical systems still face other obstacles to becoming routine animation tools. They do not yet offer real-time views of captured motion because CPU render capabilities cannot keep pace with the data-collection process (although faster chips have recently reduced the lag to mere minutes). Optical systems cost more, and since sunlight includes infrared light, they do not work well outside during daylight hours.
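The "compensation" work described above can be pictured with a toy sketch. When an optical marker drops out of every camera's view for a few frames, the simplest repair is to interpolate across the gap between the last and next good samples. This is only an illustration of the idea, not any vendor's actual algorithm; every name below is hypothetical.

```python
# Illustrative sketch: linear gap-filling for occluded optical-mocap
# marker samples. Real systems use far more sophisticated compensation.

def fill_occlusions(samples):
    """samples: list of (x, y, z) tuples, with None where the marker
    was occluded. Returns a new list with the gaps filled in."""
    filled = list(samples)
    n = len(filled)
    i = 0
    while i < n:
        if filled[i] is None:
            start = i - 1                  # last visible frame (may be -1)
            j = i
            while j < n and filled[j] is None:
                j += 1                     # j = next visible frame (or n)
            if start >= 0 and j < n:
                a, b = filled[start], filled[j]
                span = j - start
                for k in range(i, j):      # interpolate each missing frame
                    t = (k - start) / span
                    filled[k] = tuple(a[d] + t * (b[d] - a[d])
                                      for d in range(3))
            elif j < n:                    # gap at the start: hold next value
                for k in range(i, j):
                    filled[k] = filled[j]
            elif start >= 0:               # gap at the end: hold last value
                for k in range(i, n):
                    filled[k] = filled[start]
            i = j
        else:
            i += 1
    return filled
```

A one-frame dropout between (0, 0, 0) and (2, 2, 2) would be filled with the midpoint (1, 1, 1); longer gaps are spread evenly along the line between the two visible samples.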
Bargain Capture

Peak Performance Technologies of Colorado claims to be addressing optical-system limitations for video with its Motus system. The company created the system on the principle that it should be PC-based and video-based, thus lowering its cost and permitting data capture from outdoor video shoots.
"Originally, the technology was developed for medical and sports applications, but we created Motus specifically for animation use," explains Steve Risenhoover, Peak's principal systems engineer. "It can function in-studio as a standard optical system, with the body suits, markers, infrared cameras, and receivers that are part of Motus, but it can also extract data from motion captured outdoors on videotape, as well."
To capture daylight motion, the system works with any standard video camera to shoot an actor's movement, even if the actor is not wearing sensors. In studio, artists convert the video data into any standard digital format (AVI, for example), and thanks to special software, the Peak system then extracts motion data directly from that file. Peak has also created a new plug-in for 3D Studio MAX that debuted in January. The plug-in, dubbed 'Kinecapture,' permits MAX users to track movement from AVI files directly to computer models created with that software. Bob Mulverhill, Peak's marketing director, says these types of technologies herald "the advent of optical motion capture for low-end use: affordable capture tools for boutiques."
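The video-based approach Peak describes can be imagined, in greatly simplified form, as locating a high-contrast marker in each decoded video frame and stringing the per-frame positions into a motion path. The sketch below is a minimal, hypothetical illustration of that idea, not Peak's actual tracking software; real trackers handle noise, multiple markers, and sub-pixel accuracy.

```python
# Toy illustration of video-based marker tracking: each "frame" is a 2D
# grid of brightness values, and the marker is taken to be the brightest
# pixel. All names here are hypothetical.

def track_marker(frames):
    """frames: list of 2D grids (lists of lists of numbers).
    Returns one (row, col) position per frame: the brightest pixel."""
    path = []
    for frame in frames:
        best = (0, 0)
        best_val = frame[0][0]
        for r, row in enumerate(frame):
            for c, val in enumerate(row):
                if val > best_val:     # remember the brightest pixel so far
                    best_val = val
                    best = (r, c)
        path.append(best)
    return path
```

Fed a sequence of frames, the function returns the marker's path over time, which is the kind of raw 2D trajectory a plug-in could then map onto a computer model.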
What's Next

Whether high-end or low-end, motion capture has clearly become a permanent part of the animation equation. The development of several software packages aimed at easing the extraction of motion-capture data and applying it to 3D characters has further helped the technology's transition into the TV world. Among the most popular of these is Kaydara's Filmbox, which several users call a highly efficient tool for making motion capture part of a streamlined production pipeline.
But even the technology's biggest supporters warn against thinking of motion capture as a panacea that might someday eliminate key-frame animators and other tools.
"It's a new arrow in the effects quiver, but you need lots of arrows in this business," explains George Johnsen, postproduction supervisor at Threshold Digital Research Labs, which uses the Polhemus system. Alison Savitch, president of TDRL, says that for most applications motion capture works best in conjunction with key-frame animation, not in place of it.
"For our show (Mortal Kombat), which has lots of effects and martial arts, we combine the two effectively, and I think that is how you will see it applied for episodic TV in the future," says Savitch. "The Polhemus system can very accurately replicate the complicated movements of martial artists, but sometimes we want those moves to exist in a creature or other being, or be different from a human's movement. That's why we have skilled animators who can play with, alter, or refine it."
As motion-capture technology plays a growing role in the quest to create photo-realistic digital humans, Hollywood's oldest visual effects company, 76-year-old Pacific Title Mirage, has taken up the challenge. The company recently decided to make motion capture the foundation of its expanding visual effects/animation program, with British motion-capture veteran William Plant as its new president of production.
Plant quickly set up a large, Ascension-based body-capture stage and supervised the creation of a proprietary performance-control system operated by a joystick for one hand and a unique, multi-access input device for the other. The tool allows artists to puppeteer simple facial movement for certain animated characters and is resolution-independent. The body stage and the performance-control system permit the company to direct animated TV and commercial sequences much like live action, in realtime and under one roof.
"The idea is to use pieces of different technologies, including some we build ourselves, to stage live performances of animated characters in a regular production cycle," says Plant. "We recently shot two Captain Crunch commercials in a single day with 320 performance-capture takes. We then applied that data to lower-resolution, cel-animated 2D characters, since Captain Crunch is not a 3D character."
For more complex characters, Pacific Title Mirage integrates performance capture with the company's proprietary LifeF/x digital animation technology, originally developed for medical use. A recent demo conducted for Millimeter showed an extremely convincing, moving digital replica of a famous actor's face created for an upcoming feature film. LifeF/x appears to be pushing closer to the coveted goal of creating lifelike, animated human faces.
According to Dr. Mark Sagar and Dr. Paul Charette, the company's co-directors of research and development, LifeF/x uses data taken from an optical facial motion-capture session conducted by filming the actor with several high-definition cameras. No sensors are required, since special software analyzes the high-def images and extracts data to digitally build geometric replicas of the complex movements of the actor's face. Animators build those movements into the computer model of that face at the same time the model itself is constructed. The CG facial model relies on data from several sources to construct a face that both looks and acts like the performer. Those sources include a Cyberscan of the face, scanned photos or maquettes of the individual, and data from a generic CG human facial model, all combined with the motion-capture data.
"The cool thing is that this system drives realistic tissue motion into the computer model," Charette explains. "It is possible to essentially fake the same look for very short viewing periods by utilizing low-resolution geometry and then doing real-time texture updates. But this method, instead of texturing a subtle facial movement, actually builds that movement with real geometry. The wrinkles you see on this CG face are actual geometry, not texture."
The technology also permits captured movements to be altered by animators, or even inserted into other computer models by altering the data or the model itself, as opposed to key-framing changes. In other words, one can direct the entire animated performance even after the motion-capture session has ended.
"We can age the actor, make him younger, turn him into an alien, and still have those subtle facial movements that make the animated face seem to be alive," Sagar says.