Marrying Stop Motion and CGI for "The Corpse Bride"

In Tim Burton's The Corpse Bride, the dead are more lively than the living. The film follows young Victor Van Dort (Johnny Depp) as he tries to make his way back to his fiancée in the world of the living after accidentally "marrying" the beautiful but half-rotted Corpse Bride (Helena Bonham Carter) and being dragged into her fun-but-deceased world. For this latest film foray, Burton relied on the talents of co-director Mike Johnson and director of photography/visual effects supervisor Pete Kozachik, both veterans of two earlier Burton-produced stop-motion features: The Nightmare Before Christmas and James and the Giant Peach.

At the start of Corpse Bride production in November 2003, Burton was completing Big Fish. He then moved on to his next live-action feature, Charlie and the Chocolate Factory, which was produced essentially in parallel with Corpse Bride.

"In a co-directing situation, one director usually handles one sequence while the other handles another," Johnson says. "Our approach was more organic. Tim knew where he wanted the film to go as far as the emotional tone and story points to hit. My job was to work with the crew on a daily basis and get the footage as close as possible to how I thought he wanted it." Burton had exclusive control over reviewing and approving storyboards, concepts and scenes; Burton and Johnson shared the responsibility of directing the performances of actors.

Corpse was originally to have been shot on film, though a last-minute change of heart by the studio helped introduce a different technology. In 1997, during pre-production on Henry Selick's feature, Monkeybone, Kozachik was looking for a method of image capture that would streamline the painstaking process of integrating stop-motion characters with pre-filmed live actors. "Even using the special ground glass optics we made to draw the image for the film camera's video tap, we wound up with a video composite put together on the stage that lacked the resolution needed by the animators. I thought there might be something out there, not exactly a video camera, that could produce enough resolution for that purpose, as well as for our production footage." After wrapping Monkeybone, Kozachik continued to test candidates for a practical means of shooting feature animation digitally.

In early 2003, the Corpse Bride production unit wasn't interested in digital capture for stop motion; the team was instead prepping the movie for a film shoot. "I think they had booked every known Mitchell camera—25 of them," recalls Johnson. "I was really pushing for digital but didn't know how to pull it off technically. Pete came on, and he was a big advocate of the digital idea."

Just two weeks before filming was to begin, Kozachik and visual effects consultant Chris Watts came up with a solution using digital still cameras that was deemed viable by WB senior vice president of physical production and visual effects Chris DeFaria. The production then went digital.

After experimenting with a dozen different models, Kozachik opted for a basic digital still camera, the Canon EOS-1D Mark II, an off-the-shelf model that was outfitted with adapters to allow the use of Nikon prime lenses (14mm-105mm). "One reason I went with this particular camera is that its image chip is just about the same size as Super 35 film negative, so we could use Nikon lenses and treat them like regular 35mm cine lenses and get the same effect—the same depth of field and angle of coverage. I knew that we were going to be fighting to make this look like a 'real' movie because we weren't shooting on film, so I wanted to at least have the optics look like movie optics."
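
To put Kozachik's optics point in concrete terms, here is a rough sketch (not part of the production's workflow) that compares horizontal angle of view for the two formats. The sensor widths are published figures used here as assumptions: roughly 28.7mm for the EOS-1D Mark II's APS-H chip and about 24.9mm for a Super 35 camera aperture.

```python
import math

# Assumed horizontal capture widths in mm (published figures, not from the article):
# Canon EOS-1D Mark II APS-H sensor ~28.7 mm; Super 35 film aperture ~24.9 mm.
SENSOR_WIDTHS_MM = {"EOS-1D Mark II (APS-H)": 28.7, "Super 35 film": 24.9}

def horizontal_angle_of_view(focal_length_mm: float, sensor_width_mm: float) -> float:
    """Horizontal angle of view in degrees for a simple rectilinear lens."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

if __name__ == "__main__":
    for focal in (14, 35, 50, 105):  # spans the Nikon prime range cited above
        views = ", ".join(
            f"{name}: {horizontal_angle_of_view(focal, width):.1f} deg"
            for name, width in SENSOR_WIDTHS_MM.items()
        )
        print(f"{focal:>3}mm lens -> {views}")
```

Running it shows the two formats tracking closely across the range, which is what let the crew treat the Nikon primes like familiar 35mm cine lenses.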

Making the Dead Come to Life
Animation for Corpse Bride took place in a large stage space at 3 Mills Studio in East London. A dozen animators/puppeteers were put to work when production began, but that number had nearly tripled by the end of production. The initial group spent time developing each puppet's unique characteristics. "Each of them has his own mannerisms and style of walking and talking," explains animation supervisor Anthony Scott. The team created a large library of shots, based on tests shot with the puppets, to use as future reference. "A lot of work went into portraying each character through its motion, which is what animation is," adds Johnson. "We had some animators working exclusively on Victor walks for weeks."

The puppets themselves, built by Mackinnon and Saunders, were typically about 17 inches tall and animated on sets built three to four feet off the ground with trap doors that allowed animators access to the sets' surfaces to manipulate the puppets. The three primary characters—Victor, Victoria and Corpse Bride—were fitted with heads the size of golf balls that contained special gearing to allow the animators to manipulate individual parts of the puppets' faces. "The gearing was accessed with an Allen wrench through either ear or through the top of the head," describes Scott. "Turning it one way would raise one corner of the mouth; turning it the other way would lower it. That's how we created a smile or a frown."

Groups of characters were animated together, for the most part. One elaborate musical scene contains a group of animated skeletons dancing together. Reminiscent in style of "Pink Elephants on Parade" from Disney's Dumbo, the scene also draws inspiration from stop-motion master Ray Harryhausen (and a similar scene from Jason and the Argonauts). Harryhausen happened to drop by for tea and a visit while the scene was in production. "I have always been intrigued by the stop-motion process," says Johnson. "We're all influenced by Ray Harryhausen. Production sort of ground to a halt that day as all the animators came rushing out to meet him. He had a look at the skeletons and gave us the thumbs up. It was one of the key moments of the whole experience."

The animators' work was actually spread over 25 to 35 individual setups/stages, each having its own Canon digital camera. (A total of 32 cameras were used on the film.) Each camera was outfitted with a "grabber" system that enabled the animators to capture frames and download them into a computer to assemble a short "reel" of the shot being produced to check their work. "Unlike the point-and-shoot digital cameras most people have, these digital SLRs can't actually provide a live image," explains Kozachik. "The mirror is always down for viewing, except when you shoot."
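
The article doesn't describe the grabber software itself, but the idea is easy to sketch: each captured still lands in a shot folder, and the animator loops the accumulated frames as a rough preview reel. The folder name, file pattern and playback code below are hypothetical, assuming Python with OpenCV.

```python
import glob
import sys

import cv2  # OpenCV (pip install opencv-python)

# Hypothetical layout: the capture software saves stills as shot_042/frame_0001.tif, etc.
FRAME_PATTERN = "shot_042/frame_*.tif"
PLAYBACK_FPS = 24

def play_reel(pattern: str, fps: int) -> None:
    """Load every captured still in order and loop them as a rough preview reel."""
    paths = sorted(glob.glob(pattern))
    if not paths:
        sys.exit(f"No frames match {pattern}")
    frames = [cv2.imread(path) for path in paths]
    delay_ms = max(1, int(1000 / fps))
    while True:
        for frame in frames:
            cv2.imshow("preview reel", frame)
            if cv2.waitKey(delay_ms) & 0xFF == ord("q"):  # press q to stop playback
                cv2.destroyAllWindows()
                return

if __name__ == "__main__":
    play_reel(FRAME_PATTERN, PLAYBACK_FPS)
```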

The solution lay in attaching a small, sensitive black-and-white surveillance camera to the camera's optical glass viewfinder. "For the most part it worked just fine for us, though you're still dealing with the glass optics of the viewfinder, so it didn't always provide us with the quality we would sometimes need," notes Scott. "The camera's stopped down and we had to use work lights, so details like cloth and hair were hard to see sometimes."

The actual images were stored on a 1GB image card that was capable of holding about 100 frames of animation. Eight roving camera teams—each team including a lighting cameraman, an assistant, a lighting electrician and a set dresser to deal with any art department issues—worked with the animators to set up shots. Each camera team had a "lighting station" workstation—comprising an Apple G4 computer and a monitor to assist in checking lighting and framing—to view TIFF file versions of the camera's images. (Additional computers operated motion control equipment.) Once a shot was approved, the computer was removed and the animator was left to shoot the scene using his still camera and "grabber" computer/camera system to check his work.
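
The "about 100 frames" figure squares with a quick back-of-envelope calculation, assuming a RAW file from the 8.2-megapixel camera runs roughly 8 to 9 megabytes (an assumed figure; the article doesn't give file sizes).

```python
# Rough check of the "about 100 frames per 1GB card" figure.
CARD_CAPACITY_MB = 1000      # 1GB card, counted in decimal megabytes
RAW_FRAME_SIZE_MB = 8.5      # assumed size of one 8.2-megapixel RAW file

frames_per_card = CARD_CAPACITY_MB / RAW_FRAME_SIZE_MB
seconds_of_animation = frames_per_card / 24  # finished animation runs at 24fps

print(f"~{frames_per_card:.0f} frames per card, "
      f"about {seconds_of_animation:.1f} seconds of finished animation")
```

That works out to roughly 118 frames, or just under five seconds of finished animation per card.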

Scenes were developed initially from storyboards created by a team led by story department head Jeffrey Lynch. "We shot as close to a 1:1 film ratio [one take per shot] as we could because there was no time for reshoots," explains Johnson. "We did most of our experimentation in the storyboard process—as many ways as needed—to get the scene how we wanted it. There was no coverage, as there would be for a live-action film."

Johnson would go over the scene with the animators, acting it out himself if necessary. The animators would create a "dope sheet"—in which a shot was broken down, frame by frame—to account for key "hits" (such as beginnings of words, action points, etc.). The animators would then shoot tests of the scene, often shooting on "2s" or "4s" (meaning shooting just every second or fourth frame of what would appear in the final animation). "The next day, when they'd finish their test/rehearsal, we'd cut it in and see how it played in the reel and fine-tune from there," Johnson says. "We might do some lighting tweaks, performance tweaks or have the art department get in and touch anything that needed it. Then we'd close the curtain and let the animator animate the shot."
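
Shooting "on 2s" or "4s" simply means each captured pose is held for two or four frames of the finished 24fps animation. A minimal sketch of that expansion, with hypothetical pose names:

```python
from typing import List

def expand_to_final_frames(poses: List[str], step: int) -> List[str]:
    """Hold each captured pose for `step` final frames, e.g. step=2 for shooting on 2s."""
    return [pose for pose in poses for _ in range(step)]

# Hypothetical test pass: 12 poses shot on 2s fill 24 final frames, one second at 24fps.
test_poses = [f"pose_{i:02d}" for i in range(12)]
final_frames = expand_to_final_frames(test_poses, step=2)
assert len(final_frames) == 24
print(final_frames[:6])  # ['pose_00', 'pose_00', 'pose_01', 'pose_01', 'pose_02', 'pose_02']
```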

The animators would sometimes make use of the voice and/or video recordings of the actors, a practice also common in cel animation.

Once photographed, the frames were manipulated by a team of "data wranglers." Using a workflow developed by Chris Watts, the frames were downloaded from the camera image cards as RAW files, converted to Cineon files and processed through a "color cube." "The color cube is a 3D lookup table created by FilmLight Ltd. that forces the image data into behaving like a particular Eastman Kodak film stock—in this case, 5248, one of my favorites," Kozachik explains. "With this film emulation, we could actually rate our cameras at ASA 100, then take our light meters and spot meters and, with great confidence, shoot as if we were using 5248. Sure enough, the footage would come back and look just like it." The frames could be processed further to generate a TIFF file for viewing on the lighting station computer monitors so lighting, composition and color could be previewed.
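
FilmLight's 5248-emulation cube itself is proprietary, but the mechanics of applying any 3D lookup table are straightforward: treat each pixel's RGB value as coordinates into the cube and interpolate. The sketch below uses NumPy and SciPy with a toy stand-in LUT; it illustrates the mechanism only and is not the production pipeline.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def apply_3d_lut(image: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Apply an N x N x N x 3 lookup table to a float RGB image in [0, 1]
    using trilinear interpolation."""
    n = lut.shape[0]
    coords = image.reshape(-1, 3).T * (n - 1)  # (3, num_pixels) indices into the cube
    out = np.stack(
        [map_coordinates(lut[..., c], coords, order=1, mode="nearest") for c in range(3)],
        axis=-1,
    )
    return out.reshape(image.shape)

# Toy stand-in cube: an identity LUT with a mild gamma tweak, NOT the real 5248 emulation.
N = 17
grid = np.linspace(0.0, 1.0, N)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
toy_lut = np.stack([r, g, b], axis=-1) ** 1.1

frame = np.random.rand(8, 8, 3)  # stand-in for one converted frame
graded = apply_3d_lut(frame, toy_lut)
print(graded.shape, float(graded.min()), float(graded.max()))
```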

A fair amount of visual effects work, delivered by London's Moving Picture Company (MPC), was applied to the 1,000 or so shots in the movie, though most of the effects simply painted out puppet supports and similar set equipment. Some visual effects elements—groups of birds and butterflies, for example—were created completely in CG, while others were composited from real-world elements. "We shot real candle flames with Mini DV cameras," explains Kozachik. "I was really trying to stay true to the nature of the puppets, which are made of rubber and foam. If we're shooting real character objects, then our effects should come from the real world, too."
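
The article doesn't say how MPC layered those candle-flame elements into the frames, but an element shot against black is commonly laid in with a screen blend, which needs no matte. A generic sketch of that operation, not MPC's actual compositing setup:

```python
import numpy as np

def screen_composite(background: np.ndarray, element: np.ndarray) -> np.ndarray:
    """Screen blend: a common way to lay a self-luminous element (like a flame
    shot against black) over a background. Float RGB images in [0, 1]."""
    return 1.0 - (1.0 - background) * (1.0 - element)

# Hypothetical frames: a dim background plate and a flame element on black.
background_plate = np.full((4, 4, 3), 0.2)
flame_element = np.zeros((4, 4, 3))
flame_element[1:3, 1:3] = [0.9, 0.6, 0.2]

comp = screen_composite(background_plate, flame_element)
print(comp[2, 2], comp[0, 0])  # lit pixel vs. untouched background pixel
```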

As for shooting the characters themselves, the trick, he says, was obtaining visually interesting shots that would dependably support the director's storytelling. "The challenge is keeping the action clear and simple with lighting and composition. There's a discipline to clear storytelling with these puppets. You want to be abstract, but one can easily go overboard with these critters because they aren't as familiar to the audience as real humans. The characters don't necessarily translate the same as if you're shooting a real person. You have to consciously balance arty atmosphere and graphic clarity so as to not confuse the audience about what it is they're looking at."

Also helpful, of course, was the digital intermediate (DI) process applied to the footage. "The DI was great because we had footage from 30-some different setups, and the footage didn't always look the same, naturally, from camera to camera."
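
A DI suite does far more than this, but the core idea of matching footage from one setup to a reference can be sketched as a simple per-channel gain adjustment. The frames and the matching method below are hypothetical stand-ins, not what the colorists actually did.

```python
import numpy as np

def match_channel_means(shot: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Crude match: scale each RGB channel of `shot` so its mean equals the
    corresponding channel mean of `reference` (float images in [0, 1])."""
    gains = reference.reshape(-1, 3).mean(axis=0) / (shot.reshape(-1, 3).mean(axis=0) + 1e-8)
    return np.clip(shot * gains, 0.0, 1.0)

# Hypothetical frames from two camera setups; nudge setup B toward the look of setup A.
rng = np.random.default_rng(0)
setup_a_frame = rng.random((4, 4, 3)) * 0.8
setup_b_frame = rng.random((4, 4, 3)) * 0.6 + 0.1
matched = match_channel_means(setup_b_frame, setup_a_frame)

print(setup_a_frame.reshape(-1, 3).mean(axis=0))
print(matched.reshape(-1, 3).mean(axis=0))
```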

The overall results are strikingly smooth and beautiful. The storytelling and well-defined characters carry the film, aided by some masterfully executed stop-motion animation. "We were able to get the performances out of the puppets from a combination of everyone's hard work," says Johnson. "The puppets were designed with mechanisms that were subtle enough to bring out emotion in the characters' faces, and the animators were experienced and talented enough to take them to the limit."