DNxHD Marvel

Photos: Zade Rosenthal. TM & © 2007 Marvel. © 2007 MVLFFLLC. All rights reserved.

Although Iron Man arrives in cinemas largely stereotyped as another big-budget popcorn movie, the character affectionately known to comic-book fans as “Shell Head” is bringing some important new technical developments with him to the big screen. Chief among them is an extraordinarily complicated visual-effects effort: Industrial Light & Magic (ILM) — along with a handful of other vendors, including The Orphanage, Los Angeles; The Embassy Visual Effects, Vancouver; and CaféFX, Santa Maria, Calif. — built around 1,000 digital shots that advanced techniques ILM used on the Pirates of the Caribbean movies and Transformers, and improved on the art and science of digitally costuming live actors.

But potentially the most important development of the production was the decision to edit the picture in a tapeless workflow using Avid's new DNxHD 36 codec — making Iron Man the first large-budget studio picture to be cut in this fashion. This move impacted virtually the entire editorial process, much of the visual-effects process, and the method for viewing dailies during production.

HD edit


Editor Dan Lebental, A.C.E., says this decision was the inevitable outcome of the move by Marvel Entertainment to produce Iron Man, the upcoming The Incredible Hulk, and all future Marvel-related feature films itself. Marvel, therefore, built a temporary editing facility in Los Angeles and purchased several Avid Media Composer Adrenaline systems to cut its first two movies. (The company is in the process of building a permanent production facility, but that facility wasn't ready in time for Iron Man's and Hulk's production cycles.)

“The new generation [of DNxHD 36-powered Avid Media Composer Adrenalines] had just arrived,” Lebental says. “So, even though we knew there could be some bugs since it was so early, we urged them to go in this direction. I was excited about editing in HD. Since the advent of computer editing, we have never had particularly clear pictures to work with, and now that has changed. They were crystal-clear on this project — it was amazing.”

Lebental insists that cutting in HD “significantly changes things” for editors, and even though he freely admits to periodic technical hold-ups, for him, there is no going back.

“I was looking at an HD picture the whole time,” he says, adding that FotoKem sent the team data after an HD transfer from the original film negative, which they then loaded into the Avid Unity system. “There were some film dailies for the DP [Matthew Libatique — see sidebar] to check certain things, but by and large, for editorial purposes, we were projecting and viewing the same HD footage that was in the Avid — the same media we were cutting with. That is an important change. As the editor, it let me look at things like hair and makeup and give a more informed opinion right away, rather than having to go check it in film before giving my thoughts. As recently as four years ago, we still had a parallel film-cutting process, and now, that is all gone. It's a very different workflow.”

For Iron Man, Lebental's team relied on eight Media Composer Adrenalines (quad-core Apple Mac Pro workstations with 24in. Dell LCD monitors and one with a 30in. Sony LCD monitor) networked into 16TB of Unity storage. During production, dailies were usually played directly out of an Avid using a 2K projector projecting imagery onto a 10ft. screen in the dailies trailer. Director Jon Favreau viewed the action on a 50in. Panasonic plasma monitor in Lebental's edit bay, and the team also used this setup for small, informal screenings at the production editorial office, which was located near the main shooting stage at the old Hughes Aircraft facility in Playa Vista, Calif. (Cutting moved to the temporary Marvel facility in Santa Monica after shooting wrapped.)
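One reason the all-HD approach was practical is DNxHD 36's modest data rate. As a rough illustration (assuming the codec's nominal rate of roughly 36 Mbit/s for 1080p picture, and ignoring audio and overhead, figures not taken from the article), 16TB of Unity storage holds an enormous amount of material:

```python
# Back-of-the-envelope estimate of how much DNxHD 36 picture fits on 16 TB.
# Assumptions (illustrative, not from the article): ~36 Mbit/s nominal video
# rate for 1080p, decimal terabytes, no allowance for audio or overhead.

DATA_RATE_MBIT_S = 36      # DNxHD 36 nominal video data rate
STORAGE_TB = 16            # Avid Unity capacity cited in the article

bytes_per_second = DATA_RATE_MBIT_S * 1e6 / 8
hours = STORAGE_TB * 1e12 / bytes_per_second / 3600

print(f"~{hours:,.0f} hours of DNxHD 36 picture on {STORAGE_TB} TB")   # roughly 990 hours
```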

Lebental says a lot of patience was needed given that the production was editing on infant technology. But even with “quite a few crashes,” it was worth it.

“It's a lot of data to be moving around — and this is brand-new technology, so, of course, there were some technical problems,” he says. “But the problems were minor compared to getting to cut in HD and view crystal-clear images on large monitors. It also greatly helped the preview process to screen dailies off the same media and not have to conform them in any way, or send them out to anyone. We literally finish here [in the editorial office], go output to a Sony HDCAM SR tape, and project it in a large theater, and it looks great.”

Although Stan Winston Studios built three physical suits for Iron Man (played by Robert Downey Jr., pictured here with Director Jon Favreau), Industrial Light & Magic used technology it had developed for Transformers to enhance the suit's complexity—including the wire-heavy “under-skeleton.”

The project was such an early application of the Adrenaline HD system that one of its major bugs involved matching sound to picture during the editorial process. Assistant Editor Dawn King says the editorial team had periodic trouble rebuilding Open Media Framework (OMF) databases in Adrenaline after receiving production sound on hard drives. They also had problems with audio timecode in the databases dropping out late in takes.

“For the OMF database problem, we used the AIFF audio format for our sound, and when copying media onto our drives, the media would not relink, and the databases would not rebuild properly in the Avid [much of the time],” King says. “The workaround we used was to import the audio media, rather than copying it onto drives. This can be problematic if you want to rename clips, because when you import audio linked to a master clip, the master clip will revert to what it was named when it was digitized. It's not a big issue when you first bring media into the Avid to work with, but if you have to re-import media, it can cause extra work. With the [timecode bug], we had to re-enter the sound timecode in the database for every take in the show.”

An offshoot of the HD editorial process was that it also benefited ILM's artists on the massive visual-effects portion of the project: it allowed the facility to search more efficiently for alternative background plates, run tests, and provide temp effects shots for preview screenings, while significantly reducing the number of filmouts required to accomplish all of those things.

Ben Snow, ILM's visual effects supervisor on the project, working in collaboration with Senior Visual Effects Supervisor John Nelson, says he wholeheartedly agrees with Lebental that editing in the DNxHD 36 codec was a major boon to the project, despite the bugs.

“It really changes things,” Snow says. “It's easy to forget that a couple of years ago, we would have had to film out great swaths of unfinished work so they could have temp screenings. This [HD format] gives you a very good-looking temp screening that is digitally projected at basically 2K, and that lets you know you are in good shape. We got the same [Adrenaline] Avid system here [at ILM], and yes, there were some bugs, like with any new technology release. We were also limited because we only had one of those systems up and running during production on this movie, instead of several like we had with older Avid systems. But they telecined the entire movie to HD resolution, and that's a big thing. It wasn't quite as high quality as a film scan, but it gave us really good material to work with when seeking alternative background plates and running our tests.”


Digital costume


Realism was a constant topic of discussion on the visual-effects front, considering there were multiple versions of the Iron Man suit in the movie and each of them had to evoke realistic and heroic movement. In the majority of the shots in the film, the suit was called upon to display fantastical mechanical abilities or movement that simply wasn't possible with Stan Winston Studios' practical suit worn by actor Robert Downey Jr.

Therefore, the team developed a highly sophisticated digital costuming solution. The end result, according to Snow, was the development of digital pieces of costume that evoked a sports-car mindset. “It was fully featured, tricked, and pimped out,” he says with a chuckle. “The actor would wear our specialized mo-cap bands and whatever parts of the suit he could comfortably wear, and then we'd fill in the rest with CG — in most cases, doing almost the entire suit and having it move very exactly on his body.”

ILM took techniques and proprietary tools developed for its in-house iMocap performance-capture system, enhanced for the last two Pirates of the Caribbean movies, and further advanced them for Iron Man.

Snow says the Iron Man team used iMocap tools on set, but with a lower profile than on Pirates — using a single data-capture camera rather than multiple camera setups. That motion was then pumped into the various CG pieces of the Iron Man suit to make them move as Downey Jr. did. Another recent film to pass through ILM, Transformers, directly impacted lighting and texturing: the Iron Man team built on the improved high-dynamic-range, image-based lighting tools and metallic surfaces created for that film, extending them to achieve the distinctive red and gold of Iron Man's final suit (dubbed Mark 3) and the brushed-chrome look of the interim silver version (Mark 2). (Early Mark 1 suit shots in the movie were created at The Embassy in Vancouver.) Unlike on Transformers, the Iron Man suits had to cut directly back and forth with the practical suit, and in many cases, CG suit parts joined with practical suit parts within a given shot.

“That's why it really helped us to have better tools for capturing and seeing the fidelity of the real set,” Snow says. “We can better take high-dynamic-range images and use them to light our work. We developed ambient occlusion and things back when I worked on Pearl Harbor so that we could approximate self-shadowing from environmental lighting. But since then, we have tools to make it possible to do more accurate indirect lighting and self-reflection. On this film, that's important because [Iron Man] wears a metal suit — we could better reproduce the way the metal would reflect itself. If you have shining lights on one surface and those lights bounce to another surface — we can now capture all that with a high-dynamic-range that we can use for lighting our CG object. And instead of faking reflected light with bounce lights, we can calculate them directly. It's very computationally expensive, but we had to ramp up our processor pool [at ILM] to get through Pirates of the Caribbean 3 and Transformers anyway — so we had that available to us.”
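Snow is describing standard image-based-lighting ideas: capture the real set as a high-dynamic-range environment, then account for how the suit shadows and reflects itself. The sketch below is a generic Monte Carlo ambient-occlusion estimate lit by such an environment, a minimal illustration of the approximation he mentions rather than ILM's pipeline; the `environment` and `is_occluded` callables are hypothetical stand-ins for a real renderer's machinery.

```python
import math
import random

def sample_hemisphere(normal):
    """Pick a uniformly random direction in the hemisphere around `normal`."""
    while True:
        d = (random.uniform(-1, 1), random.uniform(-1, 1), random.uniform(-1, 1))
        length = math.sqrt(sum(c * c for c in d))
        if 0.0 < length <= 1.0:                       # reject points outside the unit sphere
            d = tuple(c / length for c in d)
            if sum(a * b for a, b in zip(d, normal)) < 0:
                d = tuple(-c for c in d)              # flip into the normal's hemisphere
            return d

def ambient_light(point, normal, environment, is_occluded, samples=256):
    """Monte Carlo estimate of environment light reaching `point`, with self-shadowing.

    `environment(direction)` returns HDR radiance from that direction (for example,
    looked up in an on-set light probe). `is_occluded(point, direction)` reports
    whether the object's own geometry blocks that direction. Both are hypothetical
    stand-ins used only for this sketch.
    """
    total = 0.0
    for _ in range(samples):
        d = sample_hemisphere(normal)
        if is_occluded(point, d):
            continue                                  # this direction is self-shadowed
        cosine = sum(a * b for a, b in zip(d, normal))
        total += environment(d) * cosine
    return total * (2.0 * math.pi) / samples          # uniform-hemisphere estimator
```

The more expensive steps Snow mentions, accurate indirect lighting and self-reflection, replace the simple occlusion test with further bounces into the scene, which is where the extra processor pool came in.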

Snow says that Favreau wanted the movie to more closely resemble a gritty, realistic war film than a typical superhero movie. Therefore, the practical suit built by the Winston team was used extensively to capture real textures and the foundation of certain plates.

“We also went out and photographed damaged cars and things to reference the suit as it got banged up,” Snow says. “Then, we put layers and layers of texture onto the thing. At one point, there were about 20 layers of maps to create different levels of smudge and damage as [the suit] gets beat up.”

Cinematographer Matthew Libatique says he opted against using the Panavision Genesis HD camera system to shoot Iron Man because the movie largely relies on daytime exteriors, limiting that camera's advantages in low-light situations. Instead, he used Panavision Millennium and Millennium XL film cameras.

Shooting Shell Head


As he got ready to enter the digital-intermediate phase of the Iron Man project at EFilm, Hollywood, alongside Director Jon Favreau and Colorist Steve Scott, Cinematographer Matthew Libatique mused about the effort behind his first gigantic visual-effects-laden feature film.

“It's a difficult film to talk about because it's unlike any film I've been a part of,” Libatique says. “Because of the visual effects, in a strange way, it's more collaborative than other films. You are so reliant on other departments. You have to exercise different muscles because you are taking preventative measures, and yet trying to be aggressive with the film. It was a learning curve for me.”

In particular, Libatique says he felt he had to create a look, choose stocks, and light in a way that took the film's extensive visual-effects requirements into account.

“The primary mission to get my head around was reverse-engineering the look of the film based on what was required for visual effects,” he says. “I didn't want to do a variety of processes that would make it difficult for them to do composites. Luckily, [Visual Effects Supervisor] John Nelson was willing to stay away from lots of greenscreens and [accepted] handheld or moving cameras without the benefit of motion control quite a bit, so I had to take that into account and not create issues [for the visual-effects team]. Also, Jon Favreau was bothered in past films by [heavy grain]. I'm a big fan of grain, but for him, I wanted to create a thick negative with fine grain. Beyond that, though, I stayed in my wheelhouse — mixing color temperatures, measuring what language I could, and using the character of Tony Stark as motivation for creating an emotional language of light for the character.”

Libatique nixed Favreau's initial idea to shoot the movie using Panavision Genesis HD cameras, successfully arguing that much of the movie takes place during the day, thus preventing the production from taking advantage of Genesis' strengths in low-light situations. Instead, he used Panavision Millennium and Millennium XL film cameras — outfitted with Panavision Primo lenses, Angenieux Optimo zooms, and a brand-new Cooke 15mm-40mm lens that served as his primary lens on the Technocrane camera.

He shot virtually the entire movie, except for some effects plates, using Kodak Vision2 500T Color Negative Film 5218, rated at 320 and 400. Although he had the benefit of HD dailies throughout the project, he periodically filmed out sequences to judge color and light. “I love the digital intermediate, but I need a guide to see where I am, and I like doing that with printer lights,” he says. “For this film, it was consistently in the low 40s, and that helped me judge the grain. Seeing the effects shots that [were ready at press time], this process helped me stay consistent.”
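Rating the 500-speed 5218 stock at 320 and 400 is what builds the “thick negative” with fine grain that Libatique mentions: a lower exposure index means the film is exposed hotter than its nominal speed, yielding a denser negative and tighter apparent grain. The arithmetic (a general calculation, not production figures):

```python
import math

NOMINAL_EI = 500                 # Kodak Vision2 5218 nominal speed
for rated_ei in (320, 400):      # exposure indexes Libatique used
    stops_over = math.log2(NOMINAL_EI / rated_ei)
    print(f"Rated at EI {rated_ei}: ~{stops_over:.2f} stops of overexposure")
# EI 320 -> ~0.64 stops hot; EI 400 -> ~0.32 stops hot,
# i.e. a denser ("thicker") negative with finer apparent grain
```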

Suit light


Libatique also faced the challenge of lighting pieces of the various iterations of the Iron Man suit — the original steel suit, the silver second version, and the advanced red version.

“The beauty of the first suit — a stunning creation by Stan Winston — is the interesting way it accepted light because of its dull sort of armor,” Libatique says. “Imagine something made out of old missile parts. You don't really have to light it, per se. You just create a naturalistic setting for the environment, and you don't have to keylight it in the traditional sense. The first iteration of the modern suit is completely silver and shiny, though. This one accepted light even further, so I worked at very low light levels when we shot that suit. Otherwise, it would scream. And then, the final red suit — the big problem with that was I have an aversion where skin tones can easily turn magenta. To combat that, you normally add green to the image, but that can take away from the red of the suit. So that was problematic — I had to keep faces from going magenta without taking away from the redness of the suit. Generally, if we keylit somebody, I would be conscious of what the color temperature was under the keylight. It was about balancing the color of keylight, and not letting the suit be affected by the lights for people's faces in the scenes. In the end, much of the final suit [was] created by ILM. This gave us some breathing room to cheat color.

“But, really, I didn't contend with letting the effects dictate the look in terms of color, other than that one maintenance issue. That and giving it a thick negative to combat grain are really the only two things I did differently. But in terms of color and contrast, I mainly worked like I always do.”

For a handful of visual-effects greenscreen plates, Libatique made the unorthodox decision to shoot with a Panaflex 5-perf 65mm camera. The move was made for specific inner-helmet shots in which viewers see parts of Tony Stark's face as he views computerized data inside his helmet — an attempt to let viewers connect with the Stark character during an action beat.

The DP says he felt that shooting this way let him avoid the distortion a wide lens would otherwise produce in the cramped space.

“The problem was, when you push in to get physically close, you are on a wide lens,” he says. “If I went to a longer lens, I felt too far away, so I had a dilemma. I thought, ‘What if I went with a bigger format?' I could come in with the wider lens, and therefore, flatten it out a bit, and optically, I wouldn't get that distortion. A 40mm lens in a 65mm format would be different than a 40mm lens in 35mm. I got that physical proximity to his face, and I wanted that proximity to be apparent to [viewers].”
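His reasoning follows from the geometry of angle of view: how much a lens sees depends on both focal length and the width of the film gate, so a 40mm lens on a 5-perf 65mm gate frames far wider than the same lens on 35mm, letting him get physically close without a distorting wide-angle. A quick worked example (the gate widths are approximate nominal figures, not production specs):

```python
import math

def horizontal_fov_degrees(focal_length_mm, gate_width_mm):
    """Horizontal angle of view for a given focal length and film-gate width."""
    return math.degrees(2 * math.atan(gate_width_mm / (2 * focal_length_mm)))

# Approximate nominal gate widths (assumptions for illustration, not production specs)
GATES_MM = {"Super 35": 24.9, "5-perf 65mm": 52.5}

for name, width in GATES_MM.items():
    fov = horizontal_fov_degrees(40, width)
    print(f"40mm lens on {name}: ~{fov:.0f} degrees horizontal")
# Super 35    -> roughly 35 degrees
# 5-perf 65mm -> roughly 67 degrees, about what a ~19mm lens covers on Super 35
```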
— M.G.