The independently produced CG film movement takes another step forward with this month’s release from the Weinstein Company, Hoodwinked. Producers insist the movie is a true indie project — made for less than $20 million. The project’s production model saw animation created at a studio in the Philippines; compositing, lighting, and texturing done at a different facility in India; and all conceptualizing, storyboarding, editing, and finishing work done out of a small production office in Burbank, Calif.
A sequence from Hoodwinked as it appeared in the Final Cut Pro interface. FCP was used to first previsualize the movie from storyboards and then edit it as the project moved along.
The executive summary: Filmmakers wrote the story themselves, found a single investor to back them, made the movie on a shoestring overseas, edited and assembled an embryonic version themselves in Final Cut Pro, shopped it at the Cannes Film Festival, and lured the Weinstein brothers to buy it as the first animated release for their new distribution entity. That agreement came with enough cash to finish in a FotoKem DI suite and add star power to the voice cast (Glenn Close, Anne Hathaway, James Belushi, Andy Dick, and Patrick Warburton, among others).
Cory Edwards, who co-directed the movie with Todd Edwards (his brother) and Tony Leech (who also edited the piece), explains that the journey was more complicated than all that. For one thing, the Filipino studio, Digital Eyecandy, founded by producers Sue Bea Montgomery and David K. Lovegren specifically for the project, was originally meant to be the place where the entire movie would be made. Time and budget limitations, however, forced filmmakers to add Prana Studios in Mumbai, India, to the mix, while also making some hard decisions about limiting the number of lead characters, crowds, digital sets, and so forth. But mainly, Edwards suggests, the indie nature of the project impacted the final creative direction of the story and imagery “just as it would with a live-action independent film.”
“Like many indie films, we did not have the luxury of re-working scenes and going back to the drawing board — we had to get scenes right the first time,” says Cory Edwards. “That meant we needed a battle plan to figure out what shots we needed earlier than a major studio feature would require. That is one reason we came up with a stop-motion type of look to the CG animation. Our mantra was to do something small very well rather than trying to copy something huge and grand poorly. So we picked a style that made it a little more quaint and less realistic, but funnier.”
Filmmakers adopted a stop-motion style for the film's CG imagery in order to produce the animation within their tight budget and timeline.
The Edwards brothers drew all storyboards themselves; Leech cut them together with temp dialogue, music, and effects using Final Cut Pro 3 on a dual 1GHz G4 PowerMac computer in his apartment. That reel served as a basic template for the film, which allowed the animation process to get underway in the Philippines. As rough animatics for each scene were completed, the Edwards brothers and Leech continued building the evolving cut in Final Cut Pro (eventually upgrading from FCP 3 to FCP HD 4.5) on top of a 2.5TB Apple Xserve RAID, connected to a Power Mac G5 through Fibre Channel, at Burbank offices supplied by FotoKem, the facility that later handled digital color correction.
In the Philippines, artists created 3D images in Maya (4.0) as filmmakers spent much of their time at that studio personally supervising animators. Filmmakers also routinely collaborated remotely using a combination of Mac iSight teleconferencing technology on their laptops and Photoshop as a collaboration tool.
“I created about 200 Photoshop images representing changes in environments or characters, drawing arrows and colors and notes on the images, sort of like a John Madden telestrator [used on football telecasts],” says Edwards. “Sometimes, we could even fix frames ourselves in Photoshop, rather than sending it back across the world to India. We found that Photoshop was a great tool for drawing pictures to communicate visually with artists.”
Leech adds that the project’s upgrade to Final Cut Pro HD greatly smoothed the assembly and approval process during this period. “I down-rezzed the HD shots in Final Cut Pro by bringing them into 10-bit DV resolution timelines and rendering them out,” explains Leech. “When we moved up to Final Cut’s version 5, it really helped retain the image quality because version 5 has a much better scaler to bring images down to NTSC size without introducing artifacts like aliasing.”
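Leech's point about the scaler is the classic aliasing problem: simply discarding pixels when shrinking a 1920-wide HD frame to 720-wide NTSC folds fine detail into visible artifacts, while an area-averaging scaler suppresses it first. A minimal one-dimensional sketch of the difference (illustrative only, not Final Cut's actual algorithm):

```python
# Illustrative sketch of why the scaler matters when down-converting
# HD (1920 wide) to NTSC DV (720 wide) -- not Final Cut's algorithm,
# just the underlying aliasing problem shown in one dimension.

def decimate(pixels, factor):
    """Naive scaler: keep every Nth pixel and discard the rest.
    Detail finer than the output raster aliases into artifacts."""
    return pixels[::factor]

def box_filter(pixels, factor):
    """Area-average scaler: each output pixel averages a block of
    input pixels, removing frequencies the smaller raster can't hold."""
    return [sum(pixels[i:i + factor]) / factor
            for i in range(0, len(pixels) - factor + 1, factor)]

# A one-pixel alternating pattern -- the finest detail a frame can hold,
# like thin lines or hard texture edges:
row = [0, 255] * 8

print(decimate(row, 2))    # [0, 0, 0, 0, 0, 0, 0, 0] -- the pattern
                           # collapses to solid black: classic aliasing
print(box_filter(row, 2))  # [127.5, ...] -- detail averages to mid-grey,
                           # the correct low-resolution representation
```

The same logic in two dimensions is why a half-resolution decimating scaler turns fine texture into shimmering moiré, while a filtering scaler renders it as smooth grey.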
Leech says that as shot approvals trickled through, Prana artists moved original 16-bit TIFF frames along the pipeline, rendering them out through the MayaMan plug-in from Animal Logic Research and compositing them in Digital Fusion. Frames were then copied from Prana's servers to ATA drives, which were shipped to Burbank.
There, filmmakers opened each shot as an image sequence in QuickTime Pro. They then exported shots at an uncompressed 1920×1080 frame size. Would Leech opt for the same approach in the future?
“If we were to do this again, I would research converting the still image sequences to HD shots using a 10-bit codec that might provide some lossless compression,” Leech says. “A single shot from Hoodwinked is often in the neighborhood of 1GB to 2GB. With nearly 1,400 shots in the film, storage space fills up quickly.”
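Leech's storage math is easy to sanity-check. The shot count and per-shot sizes below come from the article; the per-frame figure is an assumption for illustration (an uncompressed 1920×1080 frame at 16 bits per channel, three channels):

```python
# Back-of-envelope storage estimate for the pipeline described in the
# article (1,400 shots at roughly 1-2 GB each). The per-frame size is
# an assumption: 1920x1080, 3 channels, 2 bytes per channel, uncompressed.

BYTES_PER_FRAME = 1920 * 1080 * 3 * 2   # ~12.4 MB per frame
FRAMES_PER_SECOND = 24

def shot_size_gb(seconds):
    """Uncompressed size of a shot of the given duration, in GB."""
    return seconds * FRAMES_PER_SECOND * BYTES_PER_FRAME / 1e9

# A typical 4-8 second shot lands squarely in the 1-2 GB range
# Leech describes:
print(round(shot_size_gb(4), 2), "GB")   # prints 1.19 GB
print(round(shot_size_gb(8), 2), "GB")   # prints 2.39 GB

# Across ~1,400 shots at 1-2 GB each, the film needs 1.4-2.8 TB --
# enough to strain the production's 2.5 TB Xserve RAID.
total_low_tb, total_high_tb = 1400 * 1 / 1000, 1400 * 2 / 1000
print(f"{total_low_tb}-{total_high_tb} TB")
```

The totals make clear why Leech would look for a 10-bit codec with lossless compression next time: even modest compression ratios would bring the whole film comfortably inside a single array.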
Shots were next brought into Final Cut Pro and scaled down in the 24fps edit timeline to letterboxed 720×480 versions. “This allowed us to create material that would go out to DVCAM [with a pulldown] or DVD [as 23.98 material] for screening purposes,” says Leech. “DVCAM tapes with foot/frame and timecode burn-ins were provided to Skywalker Sound for the final sound mix.
“Once we were ready to go to the iQ room at FotoKem to do our digital color correction and create a digital intermediate, we created HD timelines in Final Cut Pro, and did a simple copy-paste from the 720×480 timeline to the HD timeline,” he continues. “We then exported the HD reels as QuickTime movies using the same attributes we used when converting the still images into HD shots.”
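The pulldown Leech mentions for the DVCAM screeners is the standard 2:3 cadence, which maps 23.98fps film frames onto 29.97fps interlaced NTSC video. A sketch of the cadence (hypothetical helper, not from the production's tools):

```python
# Sketch of the 2:3 pulldown cadence used when mastering 23.98 fps
# material to an NTSC format like DVCAM: every four film frames
# (A, B, C, D) become ten video fields, i.e. five interlaced frames.

def pulldown_fields(film_frames):
    """Expand film frames into video fields using the 2:3 pattern."""
    pattern = [2, 3]                      # fields contributed per frame
    fields = []
    for i, frame in enumerate(film_frames):
        fields.extend([frame] * pattern[i % 2])
    return fields

fields = pulldown_fields(["A", "B", "C", "D"])
print(fields)        # ['A','A','B','B','B','C','C','D','D','D']

# Paired into interlaced video frames, the cadence reads AA BB BC CD DD;
# the two mixed frames (BC, CD) are what a reverse pulldown must undo.
video_frames = [fields[i] + fields[i + 1] for i in range(0, len(fields), 2)]
print(video_frames)  # ['AA', 'BB', 'BC', 'CD', 'DD']
```

Four film frames in, five video frames out: that 4:5 ratio is exactly the 23.976-to-29.97 rate change, with no frames invented or dropped.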
Reels and EDLs were then brought on FireWire drives to FotoKem. The FotoKem team brought the HD QuickTimes into Shake and parsed them into Cineon frames. From there, frames were loaded into the iQ, where FotoKem colorist Walter Volpatto color-timed the film, and also made certain visual fixes that money and time limitations had prevented filmmakers from addressing earlier in the process. Cory Edwards describes this stage as “one more chance to direct the movie.”
“There were a handful of places where we had single-frame problems, and Walter was very helpful in painting things in or out,” he says. “There was one shot where a character’s head was not properly connected to his body — it was floating above his body. In the iQ, Walter was able to paint in the part of his neck that was missing frame by frame. That can be costly if you do too much of it, but on the other hand, in selected places, it’s a lot more cost-effective to have Walter do it over a few minutes rather than sending the shot back across the world to be revised.”
Filmmakers say the experience taught them several things they would do differently in crafting a future indie CG pipeline.
“It all depends on whether the story is any good, obviously, but assuming so, the technology is now there to make these kinds of films independently,” says Cory Edwards. “I think a boom will happen now like the kind that happened in the 1990s with live-action indie films — you will see a lot of them.”