DreamWorks' Eagle Eye is all about the art of a nonstop chase. Director D.J. Caruso (Disturbia) keeps upping the ante as the film's protagonists try to escape all manner of pursuers. Worst among them is a missile-bearing drone aircraft that dogs the protagonists even after they drive their car into a tunnel. As they weave in and out of speeding traffic, a semi behind them loses control, flips over and is hit by a missile. The explosion that follows is fearsome, and it's especially scary when you consider that both the pyro and the flipped truck are not CG: they're real.
Caruso's idea was to do as much of Eagle Eye as possible with practical stunts rather than animation. So in this shot the CG element wasn't the truck; it was the environment, created in the computers at Sony Pictures Imageworks. SPI Digital Effects Supervisor Dave Smith says, “Usually, if people wanted to do an effect like this, they would shoot a tunnel plate and put in a burning CG truck. What we did in this case was to flip that around. We had the natural dynamics of an exploding truck captured on film, and we provided the environment, which used to be the hard part!”
On a safe outdoor location, a former airbase, the production photographed a full-size semi running up against pylons, flipping over, and exploding. “What's wonderful is that they can now do very elaborate stunts a little bit safer, with a lot more visible rigging in the photography, because we can remove that rigging from the plate,” Smith says. “In this case, we cleaned up a lot in the plate photography.”
For rotoscoping, Imageworks used the program Silhouette (from SilhouetteFX). “We also have our own suite of tools that wrap around the Silhouette curves,” Smith says. Imageworks' facility in Chennai, India, did the roto work for Eagle Eye and then sent the Silhouette curves back to Smith's team in Los Angeles. “Then, if we needed to, we made small adjustments within our proprietary compositing package, Bonsai.”
The cleaned-up photography of the truck then had to be seated convincingly inside the virtual tunnel, which was actually a CG extension of a real tunnel in downtown Los Angeles. It was there that the protagonists' car had been photographed dodging traffic and racing past camera. But Smith says, “The actual tunnel was just a couple of blocks long. We needed to extend it to about seven blocks for this chase sequence.” Since the shots containing the virtual background had to intercut with shots photographed in the actual tunnel, Smith says, “Our virtual tunnel had to contain accurate details, like signs and grates, at the right points so that it felt consistent with the shots around it.”
To enable Smith's team to replicate the real tunnel in CG, Lidar Services was employed to scan the location. “The Lidar Services people made a model to some degree for us out of those data points,” Smith says. “That was the basis for getting our model to be as accurate as possible.” Imageworks then built the virtual tunnel using Autodesk Maya and proprietary plug-ins. For the shots in this sequence where the camera was moving, Smith's team used 2d3 boujou and proprietary code to track the camera.
One problem that had to be solved was adjusting the scale of the virtual tunnel to accommodate the truck photography. “The truck was actually blowing up higher than the interior of the tunnel,” Smith says. “So we had to figure out how to cheat that and make it feel like the truck fit inside the tunnel. The width of the road was also different, so we had to go back and forth to tie it in with the plates of the real chase that were intercut around this shot.”
Lighting the virtual environment was especially challenging, since light from the exploding truck had to bounce off the tunnel walls in a convincing way. “Getting the explosion to feel like it was in that tunnel involved interactive lighting,” Smith says. “First, a normal render of our tunnel geometry would just have the tunnel's built-in fluorescent lights: 1,700 4-ft. fluorescent tubes. And then we had to have the full interaction of the light from the explosion, with the proper shadows that it throws. Our tunnel had a lot of geometry and a lot of lights falling on nooks and crannies. We wanted the lights to fall off in a way that matched what happened in the real tunnel. That's always been a challenge, but it's becoming less of a challenge.”
Smith credits Imageworks' raytracing renderer Arnold for helping them manage the virtual lighting in Eagle Eye. “This is the first visual-effects film where I've used Arnold, though it was used to render the animated film Monster House,” he says. “In the past, we'd think of how to do this with six or seven lights to keep our render times down. But with the Arnold software we're getting to a point where we can use it to get a photorealistic look and still make it easier for our lighters. They could light our virtual tunnel exactly the way that the production people lit the real tunnel. They stuck 1,700 little tubes all the way down the virtual tunnel and turned them on. They could tell the Arnold renderer that the tunnel was made of cement and then just let the light bounce around like it naturally would.”
As the protagonists' car speeds away from the burning truck, the brightness of the explosion can be seen illuminating the car's rear bumper, highlighting the narrowness of their escape. “Our new lighting package gives a lot more power to the lighters to spend time making that glow travel in the scene correctly and getting the shadows to move properly,” Smith says. The Arnold renderer was also programmed to factor in the properties of the plate photography of the cars and truck. The end result was render times of up to eight hours per frame.
Compositing the virtual tunnel with the stunt driving footage was done in Imageworks' Bonsai. “We worked interactively to get light flares to bounce around so that the shot didn't feel like a ‘comp.' We also added things like camera shake to give it a more dynamic quality,” Smith says.
The goal throughout Eagle Eye, he says, “was putting our CG in places where we're not used to seeing it. We're constantly blurring the line. If we keep bouncing around between CG and an effect that was shot in camera, it keeps everybody guessing.”
Director: D.J. Caruso
DP: Dariusz Wolski
Visual effects supervisor: Jim Rygiel
For Sony Imageworks:
Visual effects supervisor: Jim Berney
Digital effects supervisor: Dave Smith
CG supervisor: Bob Peitzman
Color and lighting and compositing: Brian Adams
Look development: Doug Smith
Modeling: Yun Kang
Rig removal: Kumar Selvaraj & Sam Vaidhyanathan
Roto: Sanket Devangkumar
Matchmove: Shaik Sadiq
Lidar scans: Lidar Services