The creation of CGI characters and objects is no longer the exclusive domain of big-budget feature films. Packages affordable to even desktop animation houses are facilitating an explosion of 2-D and 3-D CGI effects in commercials, broadcast promotions, and long-form television.
Dean DeCarlo, president of commercial animation house Celefex, New York, has been encouraged over the past six months by the rate of advancement software developers are achieving in the area of character animation.
Celefex creates much of its CGI in Side Effects’ Prisms and ICE packages and refines the work in Amazon Paint. Lately, DeCarlo reports, his company has also used Mac- and Windows NT-based programs such as 3D Studio MAX, Photoshop, and Illustrator for creating CGI.
“A lot of this software is getting really amazing,” he says, hopeful that his amazement will continue through subsequent releases. “3D Studio is still on Version 1.1 of the software,” he allows. “It has great modeling and rendering, it’s fabulous with particles, and it’s got a lot of cool-looking effects. Unfortunately, the animation choreography program is still a little weak.”
Inverse kinematics (the ability to calculate muscle and joint movements without having to animate them frame by frame) is a function vital for creating efficient character and creature movement. DeCarlo expresses a definite eagerness about the inverse kinematics tools and the new user interface that are part of Side Effects’ new Houdini package. “It’s still in a very early cut,” he says of the software, “but I think it might be revolutionary. [Side Effects has] been talking about providing a specific character animation add-on for Houdini that would have comprehensive inverse kinematics. That would really take the work we can do to another level.”
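The appeal of inverse kinematics is easier to see with a toy example. The sketch below (not from any package mentioned in this article; the function name and two-segment setup are invented for illustration) solves a planar two-segment limb analytically: given where the tip of the limb should land, it returns the joint angles, so nobody has to pose the joints frame by frame.

```python
import math

def two_link_ik(x, y, l1, l2):
    """Analytic inverse kinematics for a planar two-segment limb.

    Given a target point (x, y) and segment lengths l1 and l2, return the
    (shoulder, elbow) angles in radians that place the limb tip on the
    target. Raises ValueError if the target is out of reach.
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow bend directly.
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)
    # Shoulder angle: direction to the target minus the offset the bend causes.
    shoulder = math.atan2(y, x) - math.atan2(
        l2 * math.sin(elbow), l1 + l2 * math.cos(elbow))
    return shoulder, elbow
```

Real character packages layer constraints, joint limits, and multi-segment chains on top of this idea, but the core (solve for the pose from the goal) is the same.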
Ron Thornton, founder of Foundation Imaging in Valencia, California, has found that recent developments at Newtek are finally facilitating efficient character animation even in his all Mac- and NT-based shop.
“We’re only beginning to get enough tools to be able to manipulate characters in a truly convenient way,” he says. “Lightwave’s architecture has been opened up to allow many more plug-ins. Morph Gizmo (distributed by Newtek) allows you to sync a character’s speech easily by morphing from one mouth position to the next.” This method, he says, is easier to use than the traditional morphing tool Elastic Reality.
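The morphing approach Thornton describes can be sketched in miniature. This hypothetical `blend_shapes` helper (not Morph Gizmo’s actual API) linearly blends a neutral mouth toward one or more target mouth shapes, one weight per target, which is the core idea behind morph-based lip sync.

```python
def blend_shapes(neutral, targets, weights):
    """Linear morph between mouth shapes.

    neutral and each entry of targets is a list of (x, y) vertices with
    identical ordering; weights holds one blend factor per target. The
    result is the neutral shape plus the weighted per-vertex offsets.
    """
    out = []
    for i, (nx, ny) in enumerate(neutral):
        dx = sum(w * (t[i][0] - nx) for t, w in zip(targets, weights))
        dy = sum(w * (t[i][1] - ny) for t, w in zip(targets, weights))
        out.append((nx + dx, ny + dy))
    return out
```

Animating lip sync then reduces to keyframing the weights over time (0 for closed, 1 for a full “oh,” and so on) instead of moving every vertex by hand.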
In the field of 2-D CGI, Thornton has been particularly pleased with Lightwave’s Cel Look, which can add to computer animation the definite edges and hard color traditionally associated with cel animation. “It’s given us the ability to synthesize the appealing look of cel animation without the work,” he says of Cel Look, the plug-in that is allowing Foundation to create the direct-to-video Batman movie for Warner Bros. Animation.
“This opens up a whole new market in 2-D,” says Thornton. “Until now, if you wanted the cel animation look, you had to print cels out and send them to Korea or Japan for inking. Then if you wanted to combine some 3-D moves, it was a very elaborate process. Only big-budget projects could do it, but now much smaller shops can do it at a reasonable cost.”
Thornton continues using Lightwave and After Effects for photo-real effects on Star Trek: Voyager and Deep Space Nine, and he recently completed several shots for the upcoming feature Jackal, which stars Bruce Willis. “Of course, packages like these are opening up possibilities in long-form television,” says Thornton, “but even more exciting is that it’s letting us compete in feature films.”
This is a common theme of late among CGI houses that, until very recently, were confined to video work. “We’re headed toward some intense film effects,” says Mark Kyle, digital effects supervisor of Disney i.d.e.a.s., Lake Buena Vista, Florida. This full-service facility, situated at Disney-MGM Studios, offers CGI for broadcast and commercial clients and plans to expand into features shortly.
“All the CGI tools we’ve invested in in the past have been geared to video resolution only,” says Kyle. “We want a future in feature film, so rendering speed is now very important to us.”
As part of a major hardware upgrade, Disney i.d.e.a.s. plans to install SGI’s new faster-than-an-Onyx Octane boxes as soon as they ship. Kyle has also ordered SGI’s recently released O2, declaring, “It’s really a good 2-D box.”
Kyle continues: “We’re not the only people going to NAB to see what will help us do feature film effects. I think it’s part of a lot of companies’ five-year plans. So we’re all going to compare Fire versus Flame and see what the next level is from Discreet Logic. For CGI I think we’ll stick with Alias|Wavefront PowerAnimator, a seat of Softimage, and some packages that run on NT like Electric Image and 3D Studio. These all seem to be going in the right direction.”
A particular area in which Kyle would like to see development take place is radiosity, the bounced, indirect lighting of virtual objects. “It would be nice to have real lighting scenarios as part of a CGI package,” he says, “something that could match the way lighting is discussed in the real world. If we could just say, ‘This object is being lit by a 2K tungsten spot light from over here,’ and have it happen with all the necessary bounce and reflection, it would make it a lot easier to collaborate with directors and DPs. None of the packages can do that with radiosity yet.
“Obviously, you can light CGI objects with a lot of detail with most of the software out there,” he continues, “but you can’t indicate the flavor of light (fluorescent, sunlight, or whatever), so those of us doing the computer work are speaking a different language from the director or the DP. It would be nice to have the ability to sit down with a traditional-minded director and have him be able to speak his own language about the lighting.”
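What Kyle is asking for amounts to a translation layer between set vocabulary and renderer parameters. Here is a minimal sketch of the idea, with invented class and field names and a deliberately crude intensity conversion; the color temperatures are standard photographic reference values, not tied to any package.

```python
from dataclasses import dataclass

# Approximate correlated color temperatures for common stage sources
# (standard photography reference values).
SOURCE_KELVIN = {
    "tungsten": 3200,
    "fluorescent": 4200,
    "sunlight": 5600,
}

@dataclass
class StageLight:
    """A light described the way a DP would call for it on set."""
    source: str       # "tungsten", "fluorescent", "sunlight"
    watts: float      # e.g. 2000.0 for a "2K"
    kind: str         # "spot", "flood", ...
    position: tuple   # world-space (x, y, z)

    def to_renderer(self):
        """Translate the on-set description into generic renderer parameters."""
        return {
            "color_temperature_k": SOURCE_KELVIN[self.source],
            "intensity": self.watts / 1000.0,  # crude: 1 unit per kilowatt
            "type": self.kind,
            "position": self.position,
        }
```

With something like this, “a 2K tungsten spot from over here” becomes `StageLight("tungsten", 2000.0, "spot", (5.0, 3.0, 0.0)).to_renderer()`, and the director and the TD are at least reading from the same vocabulary; the radiosity bounce Kyle wants would still be the renderer’s job.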
As has been the case traditionally, feature film effects set the bar for everybody else. Larger companies already specializing in CGI for features are generally ahead of companies relying on off-the-shelf software. These facilities generally have the staff and machine power either to splice together the best features of existing packages or to write their own project-specific code.
At least one feature film with a budget in the $100 million range will open every single weekend for 12 weeks this summer, and ground-breaking CGI will be a part of most of them. “ILM did about 75 CGI shots for Jurassic Park,” says Richard Hollander, president and senior visual effects supervisor of VIFX, Santa Monica (now a News Corp. subsidiary). “Now big effects movies have hundreds and hundreds of CGI shots.”
Hollander reports the company is engaged in ever more CGI production enhancement as opposed to just creating big-money creature shots. “We’re creating virtual sets and replacing live action with CGI,” he says. “We worked on Mission: Impossible creating the all-CGI helicopter in the tunnel, and we’ve got a lot more of that kind of thing coming up this summer.”
Bigger shops that have the ability to build objects (helicopters, planes, etc.) can learn each time and build up a cache of models and tricks for using them. “It’s all building on itself,” Hollander says.
R&D for VIFX is more a question of learning the ins and outs of pre-existing software than of creating its own. “We do much more adapting and gluing things to existing packages,” Hollander says, “than we do in writing our own code.” When pressed, he singles out Side Effects and Alias software as being the most amenable to such adaptation.
“Our internal R&D is in lip movements and modeling fabric,” he says. “We want to make things talk with facial animation, to be able to generate all the physics associated with making a virtual animal or person run down a hall, say, without having to animate every frame. How do we maintain the rules of physics? How do we automate the rules? These are the types of questions that need to be answered.”
Michael Rosenbaum, visual effects coordinator at Sony Pictures Imageworks, Culver City, previously worked with Imageworks’ president, Ken Ralston, at ILM on such watershed projects as The Abyss. Now at Imageworks, Rosenbaum is visual effects coordinator on Contact, the Robert Zemeckis-directed, big-budget adaptation of a Carl Sagan science fiction novel.
“I have a different perspective on CGI now that I’m an effects supervisor on the entire film,” says the former CGI specialist. “We can do some things in the computer, but it could mean much more time and money than if we do them as miniatures or some other kind of practical effect. I never used to think of that when all I was concerned about was computer graphics. The cost was never so much an issue.
“As studios compress their schedules,” he continues, “you get less and less time to try things a new way. Only ILM (with projects like Jumanji, when they did so much work on creating CGI hair, and Twister, with all those storms) can take nine months for R&D of a single effect.”
CGI water is particularly difficult to create, and Contact required a great deal of water effects. Rosenbaum’s first impulse was to try to go with CGI. “If we had the time, we could probably come up with ways to do the water entirely using CGI, and we’d probably have more control of it that way,” he says. “But we don’t have the time, so we’re mixing practical effects with CGI.
“One of the big problems shooting water in miniature,” he continues, “is that it doesn’t scale. In Contact, I started with water elements shot traditionally on a stage, and digitally enhanced and augmented the water’s shape and motion to maintain proper scale.”
Rosenbaum points out that Imageworks has committed to a program of R&D that will be ongoing and not project-specific. This division is headed by chief technology officer Lincoln Hu, who held the same title at ILM when Rosenbaum and Ralston were there.
H.B. Siegel, ILM’s current CTO, looks further into the future than do those limited by the constraints of existing software. Since ILM is one of a handful of companies with the wherewithal to write project-specific code in house, it has a lot of the above-mentioned trouble spots licked. It came away from Jumanji and 101 Dalmatians with techniques for rendering hair and fur that nobody else’s package can touch.
Hair and fur are created at ILM using proprietary in-house code specifically designed for modeling and rendering a great number of long thin cylinders that interact with one another yet have a life of their own. “It’s a lot of computing,” says Siegel, “trying to figure out where a hundred thousand objects will intersect, and having it all happen automatically so you don’t have to physically calculate the way light will pass through them all.”
CGI cloth is another area ILM keeps busy improving. “It’s still very difficult to get cloth to look perfect,” he says. “It’s much like hair, with all the individual strands.”
Siegel points to the area of inverse dynamics as a major focus of ILM’s R&D. “Say a character bounces a basketball and throws it in a hoop,” he explains. “The simple ballistics you learn in first-year physics is still done by an animator. If you just programmed in the physics having to do with pressure, direction, etc., you would have a realistic, interactive-type situation where your character would often miss the basket. With inverse dynamics, you could predetermine where the ball must end up, and the software would figure out exactly how it had to be thrown, and from where, and what all the surrounding environmental conditions would have to be.”
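Siegel’s basketball example can be reduced to its simplest case. Under plain projectile motion (ignoring air, spin, and the character’s body, and with an invented function name; this is an illustration, not ILM’s code), “work backward from where the ball must end up” is just solving the first-year ballistics equation for the initial velocity.

```python
def solve_throw(start, target, flight_time, g=9.81):
    """Given where the ball must end up, work backward to the throw.

    Projectile motion says p(t) = p0 + v0*t - 0.5*g*t^2 (vertically), so
    the required initial velocity is recovered by solving for v0. Takes
    2-D start and target points and a desired flight time in seconds;
    returns (vx, vy).
    """
    x0, y0 = start
    x1, y1 = target
    vx = (x1 - x0) / flight_time
    vy = (y1 - y0 + 0.5 * g * flight_time ** 2) / flight_time
    return vx, vy
```

The forward simulation launched with this velocity lands exactly on the target, which is the inverse-dynamics guarantee Siegel describes: the animator picks the outcome, the solver picks the throw.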
Looking even further ahead, how long before CGI actors replace the traditional kind? Though photo-real CGI actors would be desirable for all sorts of reasons (you can own them outright, they take direction well, and they never storm off to their trailer), Siegel insists that the practical creation of such a beast is still a far-off dream.
“There is a whole slew of things between where we are and creating CGI humans,” he says. “We need facial animation so that an animator is not forced to think how every dot on a character’s face will move or how bones interact, how hands and arms move with the shoulder, and on and on. Those things are still handled relatively crudely today as an animator sits there and plays with his thumb trying to observe how it works with the rest of his hand.
“We can get very close for animals and nonexistent creatures,” he says, “but when it comes to humans, that’s a whole different thing. We have very sophisticated hardware in our brains that can look at a person half a mile away and tell who it is without seeing their face by how they walk. Even with our most sophisticated techniques, there is still a long way to go before we could efficiently create a CGI human that would fool any six-year-old into believing it was real.”
Some things are still best done the natural way. Pyro (fire and explosions) in films and on TV is quite often the result of combining practical effects and CGI, rather than going the all-CGI route.
Peter Kuran, president of the 20-year-old Sylmar, California-based effects company VCE (Visual Concept Engineering), has stored some genuine fire effects on Pyromaniacs, a CD-ROM filled with fires and explosions that have been sampled on many budget-conscious television shows, including Babylon 5 and Hercules: The Legendary Journeys.
“The true nature of fire is almost impossible to get,” says Kuran. “You can get away with it if it’s a faraway city engulfed in flames, something off on the horizon, but close up it’s still pretty easy to tell real from CGI fire.”
The randomness of fire’s shape, speed, and movement, he maintains, would be so difficult to reproduce with any sort of efficient computer routines that it simply isn’t worth doing. “I’d say perfect, close-up CGI fires will happen about a day before actors are computerized,” Kuran predicts. “I’ve seen so-called CGI actors on TV, but they look like weird surrealistic dolls. The level of realism in CGI fires I’ve seen is about the same.”
Kuran is no Luddite, simply holding on to the traditional role of the “pyro guy” on the set. At VCE, he has created all- or partial-CGI fires when it was the only way to go. “CGI fire is great when you want fire to do something you can’t get real fire to do,” he says. “If a flame turns into a guy with a head, hands, and feet in an ad for charcoal, we’re going to do that in the computer.”
In the above-mentioned Kingsford Charcoal spot for Young & Rubicam, San Francisco, Kuran imported frames of a real explosion into Photoshop. Then, in Photoshop, he created key frames of the fire matched to the form and movements of a real performer. He then used Elastic Reality to morph between the newly created key frames to create the animation of the fire “performing.”
The task of creating believable computer-generated water is actually a series of very difficult tasks. To simulate the transparency, reflection, refraction, and movement of water requires specially designed code and lots of rendering power.
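One of those difficult tasks, refraction, follows directly from Snell’s law, which relates the bend of a light ray to the indices of refraction on either side of a surface (air is about 1.0, water about 1.33). A minimal 2-D sketch, with an invented function name, of the calculation every water renderer performs per ray:

```python
import math

def refract(incident, normal, n1, n2):
    """Snell's-law refraction of a 2-D unit direction vector.

    incident and normal are unit vectors (normal points toward the side
    the ray comes from); n1 and n2 are the indices of refraction on each
    side. Returns the refracted unit direction, or None when the ray
    undergoes total internal reflection.
    """
    ix, iy = incident
    nx, ny = normal
    cos_i = -(ix * nx + iy * ny)
    ratio = n1 / n2
    sin2_t = ratio * ratio * (1.0 - cos_i * cos_i)
    if sin2_t > 1.0:
        return None  # total internal reflection
    cos_t = math.sqrt(1.0 - sin2_t)
    return (ratio * ix + (ratio * cos_i - cos_t) * nx,
            ratio * iy + (ratio * cos_i - cos_t) * ny)
```

Tracing thousands of such rays per frame, each bending at a constantly moving surface, is a large part of why convincing CGI water demanded so much rendering power in this era.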
Richard “Dr.” Baily, principal at Image Savant, Hollywood, specializes in creating water for spots and features. “There are a whole lot of very subtle visual cues that make an effect read like water,” says Baily. “The animation, the surface character, the surface tension, reflection, and refraction. You need to have different layers of scale when water is doing its thing. And then you need very small layers of detail like droplets, subdroplets, and foam.”
In a recent spot for Lubriderm, Baily started out creating a clear geodesic sphere in Wavefront Explore, his package of choice because of its effectiveness with his own home-grown code. Then he used his proprietary program, which “turns rigid bodies into Jell-O-like objects.”
“If you looked carefully,” Baily says, “you could see that this new object was made of polygons. I then increased the resolution until the polygons were too small to detect.”
At this point, Baily used his software to infuse all the predesigned water-like properties into the object so it could pour, splash, and ripple, with anything in the background distorting appropriately.
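The “Jell-O-like” behavior Baily describes can be approximated with a damped spring per vertex: each point is pulled back toward its rest position while its velocity is bled off, so a displaced mesh wobbles and then settles. The toy step below (invented names and parameters, not Baily’s proprietary code) shows the idea.

```python
def jiggle_step(pos, vel, rest, dt, stiffness=40.0, damping=4.0):
    """One damped-spring timestep that pulls each vertex to its rest shape.

    pos, vel, and rest are equal-length lists of (x, y) tuples. Each
    vertex accelerates toward its rest position (stiffness term) while
    its velocity decays (damping term), giving a gel-like wobble.
    Returns the updated (pos, vel).
    """
    new_pos, new_vel = [], []
    for (px, py), (vx, vy), (rx, ry) in zip(pos, vel, rest):
        ax = stiffness * (rx - px) - damping * vx
        ay = stiffness * (ry - py) - damping * vy
        vx, vy = vx + ax * dt, vy + ay * dt          # semi-implicit Euler
        new_pos.append((px + vx * dt, py + vy * dt))
        new_vel.append((vx, vy))
    return new_pos, new_vel
```

Run over a whole mesh every frame, with the rest shape itself animated (poured, splashed), this kind of spring pass is one plausible way a rigid object starts behaving like a liquid mass.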
All the rendering was ray-traced using Wavefront’s Front End to create a convincing degree of refraction. “I used the Explore rendering tool to do the actual rendering,” Baily says. “It gives you the most realistic water, and the lighting quality is better for this type of job than it is with RenderMan.”
Says Baily: “I’ve wanted to generate refractive liquids in the computer for years. It’s still a frontier in CGI. I had the principles, but I really couldn’t do it before Wavefront Explore came along. Its truly open architecture lets you devise whatever plug-ins and utilities you want. With Wavefront Explore your only real limitation is your own cleverness.”