Like other pioneering HD feature directors such as Michael Mann, George Lucas, James Cameron and Robert Rodriguez, David Fincher has said goodbye to film and embraced high-definition video as his capture medium of choice. But Fincher has taken it one step further, abandoning recording to tape as well. There’s a reason, he states simply: “Tape is stupid.”
His latest film, Zodiac, about the so-called Zodiac serial killer who terrorized the San Francisco Bay Area in the late ’60s and early ’70s, is the first feature film with an entirely uncompressed, tapeless HD workflow through production, post and delivery. The film was shot and posted as data, with no tape media involved except backup on vaulted LTO data tapes. Instead of being recorded to videotape, footage was captured to S.two D.MAG hard drives, transferred to an Apple Xsan storage network for post, uploaded to a Web site for dailies and review and finally conformed, graded and printed.
The tapeless, uncompressed HD workflow performed so well on Zodiac that Fincher is currently employing it on his next film, The Curious Case of Benjamin Button, starring Brad Pitt as a man who ages in reverse.
This tapeless workflow was conceived about three years ago while Fincher was creating commercials for clients including Nike, HP and Motorola. Repped by Anonymous Content, commercials are where the director first built his name. While cutting spots at West Hollywood-based Rock Paper Scissors with his longtime (and Zodiac) editor Angus Wall, the two began thinking about the benefits of cutting their projects in HD. At the time, Fincher says, Avid technology didn’t allow editing in HD, but with the arrival of HD support in Apple’s Final Cut Pro, the time had come. “Angus called me and said, ‘I think we’re now finally in a place where we’re going to be able to cut in HD,’” Fincher recalls. “‘The question is, How do we want to acquire?’”
Fincher had seen Thomson’s Viper FilmStream camera and quickly put it to work on his commercial projects. “We could shoot, select a take, and go, ‘That’s the one we want.’ We would then put it on a drive and send it to Digital Domain in Venice, [Calif.,] and they could ingest it and be working on the thing that day. I really liked that workflow.” Fincher and Wall began scheming about creating a similar system that could be used for feature production. “We wanted to build a system that would allow us to never have to go to the lab, never have to wait for pull takes and film scanning. We could have it all in one house.”
The director turned to Rock Paper Scissors software engineer Andreas Wacker, who, with Wall and his team, developed the system used on Zodiac. The first link in the chain, of course, is the Viper camera, which was operated on Zodiac in FilmStream mode, outputting an unprocessed, uncompressed RGB 4:4:4 10-bit log signal. “You need the [4:4:4] color space,” Fincher says. “I’d much rather throw information away than not have it there to work with.”
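The appeal of 4:4:4 comes at a price in bandwidth. As a rough sanity check (assuming 1920x1080 frames at 23.98 fps, matching the format described above), the uncompressed FilmStream signal works out to roughly 1.5 Gbit/s:

```python
# Back-of-the-envelope data rate for an uncompressed RGB 4:4:4,
# 10-bit, 1920x1080 signal at 23.98 fps.
WIDTH, HEIGHT = 1920, 1080
CHANNELS, BITS_PER_CHANNEL = 3, 10
FPS = 23.98

bits_per_frame = WIDTH * HEIGHT * CHANNELS * BITS_PER_CHANNEL  # 62,208,000 bits
gbits_per_sec = bits_per_frame * FPS / 1e9

print(f"{gbits_per_sec:.2f} Gbit/s")  # 1.49 Gbit/s
```

That rate is an order of magnitude beyond what contemporary HD videotape formats could record, which is why tape-based HD workflows compress the signal first.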
Prior to choosing the Viper, Fincher considered two other digital cinematography cameras, both of which he thought quite capable, but he stuck with the Viper to take advantage of his already-tested workflow.
“I chose the Viper because I wanted to see what it could do if it was properly nurtured,” says Fincher. “I had shot commercials with it but never a feature. I felt it was time to try it. I liked the process of working digitally, and I didn’t like waiting until the next day to see what I had shot.”
Zodiac director of photography Harris Savides, ASC, explains the benefits of an IT-based workflow and Thomson’s Viper. “The Viper is a high-definition video camera that captures raw data, meaning the camera outputs the image data off the sensor chips without modification. HD sensor chips generate tremendous amounts of data. Initially, HD cameras recorded to tape–a medium that cannot support HD’s high data rate. Camera manufacturers addressed this drawback by compressing the data, reducing the data rate to one the tape mechanism can handle. When the data is compressed, decisions about color balance, contrast, brightness, etc. must be made during the compression process. Once these decisions are made and the results compressed, any subsequent modifications degrade image quality. In effect, the filmmakers must live with what was recorded to tape.”
Savides continues, “The Viper represents a drastic departure from this paradigm. Rather than making image processing decisions and then compressing the data, the Viper only captures the data and outputs it unmodified and unprocessed. Without an onboard recording device, the Viper depends on an external recorder. Filmmakers can opt to record to a tape recorder, like the Sony HDCAM tape system. In this instance, image decisions would be made and the data would be compressed. However, with the availability of S.two digital field recorders and similar hard-disk data recorders, filmmakers can choose to capture the raw data. These are high-capacity, high-speed recorders able to handle the HD data uncompressed. As a result, filmmakers can modify the data as much as they wish without degrading the image. Plus, filmmakers have access to the full range of image controls available from postproduction tools rather than being limited by in-camera image controls.”
Fincher had a hand in some modifications to the Viper, he says–input Thomson certainly welcomed. “I think people are finally seeing that [shooting HD] is the way things are headed and are getting responsive to filmmakers’ needs. There have been some odd choices that have been made, and now you’re starting to see [manufacturers] actually talk to filmmakers instead of to engineers.”
In initial meetings with the company, for example, Fincher recalls questioning the system’s ability to allow the camera operator to play back and view footage at the camera. “I said, ‘That doesn’t work. I don’t want the guy in his own little banana republic. I want him to get off the dolly and get over here. I want him standing next to me at the big monitor because I’ve got to tell him what I think’s funky and what has to get fixed.'”
“David has been instrumental in developing this whole process,” adds Savides. “He took it and shook it and said, ‘Don’t do it this way; do it this way. You don’t need this. We’re making a movie; this isn’t a football game. This camera isn’t being attached to a 40-foot trailer with 900 monitors and tons of technicians. We’re capturing the images onto a drive and sending it off to start editing.'”
The Integrated Workflow
The overall production pipeline of Viper-to-S.two was provided by Rufus Burnham and his staff at The Camera House, which also rented out and supported the production’s S.two hardware. “Rufus has wagered that the future in image capture is made up of ones and zeroes,” notes Postproduction Supervisor Peter Mavromates, another Fincher veteran. “The Camera House has worked hard to deliver high-quality digital images with stable and reliable tapeless data capture.”
The footage and associated on-set metadata was recorded via S.two DFRs (digital field recorders), one per camera, to S.two D.MAG (digital film magazine) hard disks. The D.MAGs hold the equivalent of 400GB of DPX file data–actually about 372GB after accounting for headroom, according to Mavromates.
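Those figures line up with the size of a 10-bit DPX frame. A quick estimate (assuming packed 10-bit RGB at 32 bits per pixel plus a nominal 8KB header; exact DPX sizes vary slightly with header padding) shows why each D.MAG holds roughly half an hour of footage:

```python
# Approximate minutes of 1920x1080 10-bit DPX footage per 372GB D.MAG.
# Assumes packed 10-bit RGB (32 bits/pixel) and a nominal 8KB DPX header.
USABLE_BYTES = 372e9                   # usable capacity, per the article
FRAME_BYTES = 1920 * 1080 * 4 + 8192   # ~8.3MB per DPX frame
frames = USABLE_BYTES / FRAME_BYTES
minutes = frames / 24 / 60             # at 24 fps

print(f"~{minutes:.0f} minutes")  # ~31 minutes
```

The same arithmetic explains why the 400GB LTO-3 tapes used for archiving hold about half an hour of material apiece.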
Due to the capital cost of the drives, only a fixed number could be acquired for the production. Each drive, when full, was brought to the editorial suite at Rock Paper Scissors, where the data was transferred from the drive, allowing the units to be recycled to the set.
The typical number of D.MAGs in use at any given time ranged from 15 to 25 (25 are currently at work on Benjamin Button). When principal photography began in September 2005 on location in the Bay Area, only 18 were in use, and, as Mavromates notes, “Things got a little touchy. We almost ran out of D.MAGs because we couldn’t recycle them fast enough.”
One reason was that, for protection, each full D.MAG’s contents were copied to a second D.MAG drive before the primary drive was sent to editorial–making two drives per camera “load”–doubled when one considers Fincher typically shoots at least two-camera throughout. Once shooting relocated to the Los Angeles area (at The Lot–the former Warner Hollywood Studios–and the Downey Stages), drives were brought directly to editorial for transfer and returned to the set, without that additional cloning step.
Once at editorial, the D.MAG drives were loaded into two different S.two docks (F.DOCK and A.DOCK) so the media on the drives could be downconverted for offline editing and stored securely and redundantly on LTO-3 tape.
S.two F.DOCK Fibre Channel docks were used to ingest the material from each D.MAG at faster-than-real-time speeds into the post team’s storage area network, a 40TB Apple Xsan. The Viper’s DPX files and associated metadata were then downresed to 1080 23.98 DVCPRO HD QuickTime files for Final Cut Pro offline work by Editor Angus Wall and his team. (The editors began the project with five Apple Power Mac G5s and later added several Mac Minis to help with the rendering workload.)
These compressed QuickTime clips were also placed on a secure online dailies viewing system for review. Because the production and post teams were rarely in the same place during shooting, Fincher previewed footage online via a technology called PIX (Project Information eXchange). The editorial team would upload QuickTimes of dailies and edits to the secure PIX Web site, which acted much like an enhanced FTP site for remote collaboration. “We not only use it for dailies but also for location, notes and edits, costumes and casting,” Mavromates says. Via PIX, team members were able to view footage and add comments linked to timecode. Project managers set access privileges per user to keep the footage and documents secure.
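PIX itself is proprietary, but its core idea — review notes keyed to timecode on each uploaded clip — can be sketched in a few lines. The class and field names below are hypothetical, not PIX’s actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Note:
    timecode: str   # "HH:MM:SS:FF"
    author: str
    text: str

@dataclass
class DailiesClip:
    name: str
    notes: list = field(default_factory=list)

    def add_note(self, timecode, author, text):
        """Attach a review comment to a specific timecode in the clip."""
        self.notes.append(Note(timecode, author, text))

    def notes_at(self, timecode):
        """Return all comments linked to the given timecode."""
        return [n for n in self.notes if n.timecode == timecode]

clip = DailiesClip("scene42_take07.mov")
clip.add_note("01:02:14:11", "fincher", "flag the lens flare here")
print(len(clip.notes_at("01:02:14:11")))  # 1
```

Access control and uploads sit on top of this in the real service; the point is simply that comments travel with a timecode, not with the whole file.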
In addition to the F.DOCK, the D.MAG drives with Viper footage were loaded into S.two A.DOCK stations. The A.DOCK is an archiving docking station with an internal disk array serving as a buffer for fast offloads of D.MAGs to an attached data tape library. The A.DOCK is connected to an LTO library “robot” containing 400GB LTO-3 tapes, each holding about half an hour of material. The data stored on the LTO tapes was the original, uncompressed HD Viper footage that would later be used for the final conform.
While the data was verified at the capture stage on set, on Zodiac the LTO-3 tapes themselves were verified a second time against the original DPX data before the D.MAGs were erased and returned to the set. “That was really an extra step,” notes Mavromates. “We’ve now gained enough confidence with the system to eliminate that extra step on Benjamin Button.”
After the images went through quality control and inspection, two LTO-3 clones were automatically made as backup uncompressed archive and delivery masters.
The system has, indeed, proved safe, Mavromates adds. “There were bugs now and then that needed to be solved, and S.two was very supportive in solving them. After all was said and done, to the best of our knowledge, the only thing we lost in the whole project is about half a scan line on one frame. When you consider that we shot the equivalent of 1.5 million feet of film, half a scan line is a pretty good record.”
Fincher gave the system a “road test” on a handful of commercials before jumping in with both feet on Zodiac. “He really treated them as 30- or 60-second visual effects shots,” notes Mavromates. “He could shoot the DPX files to the S.two equipment and then hand them over to Digital Domain” for the creation of visual effects.
Break the Camera
DP Harris Savides was no stranger to Fincher, the two having shot a great many commercials and music videos together, as well as Fincher’s 1997 film The Game. But Savides had used the Viper/S.two system just once with Fincher, on a Motorola spot in July 2005, just prior to starting Zodiac. So the cinematographer put the camera through its paces–both to develop a look for the film with the director and to see just what could be done with the Viper. “As Harris likes to say,” recalls Mavromates, “to ‘break the camera.’”
“They wanted me to play with the Viper, to see how it responded to bright light, low light–the normal things you’d do when you test for film,” Savides says. “They let me experiment for about three or four days. I just tried to do as much as I could wrong with it.”
Savides worked closely with Technicolor Digital Intermediates (TDI) colorist Stephen Nakamura during the testing period. “Harris has a natural feel for light,” Nakamura comments. “He’s one of the best cinematographers I’ve ever worked with because his instincts come through at every point. For him, an HD camera is just another recording device for his ideas of lighting. The testing was just a way for him to find out, ‘How much can I push this image until it falls apart and I’m unable to achieve what I want to achieve?’ In fact, Harris probably did more testing than anyone I’ve ever worked with on any film.”
“I wanted to see what extremes it could tolerate: how much backlight it could take, how much underexposure it could take,” Savides explains.
Adds Nakamura, “He needed to see how the highlights respond to a certain type of light he puts on an actor. How do the blacks respond when I don’t have light on them? And how much light do I need to put on them? So he would shoot stuff that was too dark and I would try to do whatever I could to save it. He could see, then, how grainy it could get and then know, ‘I can’t push it that hard.'”
Savides adds, “With lighting, it is all about placement of light and shadow. There are certain tolerances with film, and I kind of know where it is going to go. What’s interesting about this is that I still don’t know in some ways how it reacted. What’s cool about the Viper is that the camera doesn’t have any compression issues. The one disadvantage is shooting in high light. When you are shooting against the sun or light in windows, it can’t handle a backlit situation as well as film. It’s just one of the problems inherent in a digital camera.”
After the thorough testing process, Savides and Nakamura came up with a look that would work with the system. “We accumulated a bunch of little rules,” says Savides. “I’d try to work wide open as much as possible and use muted colors as much as possible. It’s a period movie, so we never wanted anything to stand out too much. The movie’s not a pedestal for the cinematography.”
Nakamura explains that, for Savides’ use, he referenced looks he had developed for an earlier film shot with a digital cinematography camera in the HDCAM SR 4:4:4 format. “I had certain adjustments to the gammas and saturation for that film, so I just set a look similar to those for Harris for indoor and outdoor situations,” he says. “Then it was just a matter of him finding the ‘sweet spot’ of the exposure for any particular scene, creating the best look for what the camera does in certain lighting conditions and holding that as a standard.”
No Viper look would be complete without compensation for the green bias inherent in the RGB 4:4:4 log FilmStream output. Savides adjusted for most of it using a CC Magenta 30 compensating filter, enabling Fincher to view images looking somewhat normal on set. The remainder of the discrepancy was handled by LUTs created by Editor Angus Wall and Fincher. “I made nine LUTs under David’s supervision using a demo version of a color-grading application. We used printer light settings and worked with David and Harris’ test footage.” Once the LUTs were created, Andreas Wacker pulled the LUT files and sent them to production, where they were inserted into the S.two DFRs to enable on-set preview of footage. One of the LUTs ended up doing 95 percent of the work, according to Wall.
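The mechanics of such a LUT are simple: each channel’s 10-bit code value indexes a lookup table. The sketch below is illustrative only — the gain values are invented to show how a green-bias correction is applied and bear no relation to the actual nine LUTs built for the film:

```python
# Illustrative per-channel 1D LUT for 10-bit code values.
# The 0.92 green gain is a made-up figure for demonstration.
def build_gain_lut(gain, size=1024):
    """Build a 1D LUT that scales code values, clipped to the 10-bit range."""
    return [min(round(i * gain), size - 1) for i in range(size)]

# Hypothetical correction: pull green down slightly, leave red/blue alone.
luts = {"r": build_gain_lut(1.00), "g": build_gain_lut(0.92), "b": build_gain_lut(1.00)}

def apply_lut(pixel, luts):
    """Map an (r, g, b) tuple of 10-bit values through the per-channel LUTs."""
    r, g, b = pixel
    return (luts["r"][r], luts["g"][g], luts["b"][b])

print(apply_lut((512, 600, 480), luts))  # (512, 552, 480) -- green reduced
```

Loading a table like this into the recorder lets the on-set monitor show corrected images while the recorded data stays untouched — which is exactly why the LUT approach suits an uncompressed workflow.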
Once the look was established, Savides had the tests printed to film and projected on the largest screen he could find: at the DGA Theater in Los Angeles. “I didn’t want to be fooled by just seeing it in the ideal situation. I was adamant that we take it and have the lab make a release print,” he says. In addition, partway through the shoot, the post team assembled a five-minute set of clips representing a variety of lighting situations, which were filmed out and shown to Savides at the Ross Theater at Warner Bros. “Harris had only seen tests that were in artificial conditions,” says Mavromates. “He hadn’t seen any real ‘war footage.’ He seemed quite pleased.”
On the set, once a series of takes for a particular setup was completed, Fincher would simply delete those that were unsatisfactory, the remaining takes being “keepers,” and some among those identified as circle takes, Mavromates explains. “He might shoot 20 takes, of which editorial might only get 12, of which only four might be circled.” The circle takes had the LUTs applied to them and were returned to Fincher via PIX as QuickTime dailies for review. (Before production began, Fincher worked with S.two engineers to add the ability to delete takes on set to the D.MAG system. He also worked with engineers to add an automatic slate function.)
In the Dark
As mentioned, photography for Zodiac took place on location in the San Francisco Bay Area and at stages in the Los Angeles region. The shoot was a long one, stretching from early September 2005 to March 2006–a stretch exceeded only by Fincher’s Benjamin Button schedule, which began filming in New Orleans in November 2006 and isn’t due to stop shooting until May of this year.
For the most part, Fincher prefers shooting two-camera. “We tried to go one-camera on most of the show, but I just felt like I couldn’t get enough coverage for what I was trying to do,” Fincher explains. A multi-camera shoot meant lighting challenges for Savides.
For municipal locations, which occur throughout the film, the cinematographer went with overhead Kino Flos, as well as light bounced in from outdoors through windows. “It facilitates quick setups with multiple camera positions,” he says. “And it prevents any lights from shining directly into the lens.”
Savides employed an extensive array of Zeiss DigiPrime lenses on the Viper cameras, including 5mm T1.9, 7mm T1.6, 10mm T1.6, 14mm T1.6, 20mm T1.6, 28mm T1.6, 40mm T1.6 and 70mm T1.6 close focus primes, plus two 6-24mm T1.9 zooms.
The movie is filled with realistic-looking room interiors, often shot with very low light levels. Savides explains, “David and I discussed how we wanted the movie to look. My philosophy is always to base the look in reality. Always try to screen reality for the audience, not overlight or take them out of the movie unless it calls for it. I always ask, ‘What’s real? What motivates this scene?’ People live in rooms and work in existing spaces that are lit. People aren’t lit. They’re in rooms that are lit. So I always light a room first, and then let people exist in that room.”
For a crucial sequence that depicted the nighttime crime scene of victim cabbie Paul Stine at the intersection of Washington and Cherry streets in San Francisco, Savides focused on re-creating a streetlight source. But don’t be fooled into thinking the incredible nightscape depicted in the scene is the effortless work of the trusty Viper (as one might think, given the camera’s inherent abilities in that department).
With the exception of two shots–actor Mark Ruffalo leaning in to look at the cab and a cutaway where he finds a shell casing–the entire sequence is a fine visual effects segment created by Digital Domain. “They wouldn’t let us shoot at the actual location,” Fincher explains, due to the exclusivity of the high-rent neighborhood.
Instead, Digital Domain pieced together matte paintings based on photographs from the era of the crime and contemporary shots. The actors, however, were photographed on a minimalist setting created as an exterior at the Downey Stages in Downey, Calif. In order to allow insertion of the background plates later, 20×25-foot bluescreens were placed on dollies and rolled about, following the actors as they moved around the cab scene. Fincher credits this technique to Camera Operator Ken Marks. “It was a very odd shoot,” the director says.
The workflow facilitated the creation and insertion of visual effects shots, both by Digital Domain and, for the film’s fine aerial cityscapes, by Matte World Digital. Matte World delivered an impressive opening shot depicting the now-extinct Embarcadero Freeway and another showing the gradual construction of the Transamerica Pyramid, which Fincher uses as a timeline.
“Shooting DPX files, in terms of VFX, is heaven,” says Mavromates. “Delivering a shot to the VFX house is just a matter of me going to my data ninja, Wyatt Jones, and saying, ‘I need this background plate,’ then throwing the files onto a hard drive or DVD data disc and delivering them. I don’t have to call somewhere and move my negative to a cutter, and then have it moved somewhere else to be scanned. It’s within my current overhead to be able to deliver a plate to a VFX vendor.”
Shots could also be viewed at TDI in the DI screening room, and such screenings took place frequently. Nakamura credits them with adding production value. “David could make comments and say, ‘Hey, I see too much detail there,’ or ‘I don’t see enough detail there; that looks fake.’ They’d redo it, and we’d come back the next week and look at it again. The process started early enough that everybody had a chance to do their best work.”
Another effects vendor that received DPX files from the production team was Los Angeles-based A52. At one point in the film, the narrative takes the form of a hypnotic montage, artistically drawing together the letters and clues the serial killer provided to investigators. After receiving input from Wall on Fincher’s vision for the sequence, high-resolution scans were taken of the actual letters from the Zodiac killer. “It seemed early on that the process required a strong collaboration between 2D and 3D to really capture the level of detail that David was looking for,” explains A52 VFX co-supervisor Andy Hall.
According to A52’s Sarah Haynes, the raw live-action Viper footage for the sequence was provided to A52 on a portable hard drive in the form of DPX files. After A52’s work was completed, the sequence was output as DPX files and sent to Technicolor, where it was colored by Stephen Nakamura and mastered as part of the film’s digital intermediate.
Zodiac, with the exception of fades and dissolves, was conformed in-house, taking full advantage of the system built by Andreas Wacker. Based on technology from S.two, Wacker created an automated system in which up to 24 LTO-3 tapes are loaded into a “robot,” to which the editing team can send instructions to retrieve the DPX files needed for the full-res conform. This virtual conform process takes XML files from Final Cut Pro and pulls the relevant files from the LTO tape library.
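Wacker’s system is proprietary, but the shape of a virtual conform is easy to sketch: walk the edit list exported from the offline and turn each event into the range of DPX frames to restore from tape. The XML below is a simplified stand-in, not Final Cut Pro’s actual XML schema, and the reel/frame naming is hypothetical:

```python
# Sketch of a "virtual conform" pull list built from a simplified edit list.
import xml.etree.ElementTree as ET

EDL_XML = """
<sequence>
  <clipitem><reel>A034</reel><in>1200</in><out>1288</out></clipitem>
  <clipitem><reel>A035</reel><in>96</in><out>240</out></clipitem>
</sequence>
"""

def pull_list(xml_text):
    """Map each clip item to the DPX files needed for the full-res conform."""
    pulls = []
    for clip in ET.fromstring(xml_text).iter("clipitem"):
        reel = clip.findtext("reel")
        start, end = int(clip.findtext("in")), int(clip.findtext("out"))
        # One DPX file per frame, named by reel and frame number.
        pulls.append((reel, [f"{reel}_{f:07d}.dpx" for f in range(start, end)]))
    return pulls

for reel, frames in pull_list(EDL_XML):
    print(reel, len(frames), "frames")
```

A real implementation would also resolve which LTO tape holds each reel and queue the robot’s restores, but the core translation — edit events in, frame-accurate file pulls out — is what made the automated conform possible.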
“We walked into the DI house with our picture, each reel, already conformed, with the exception of dissolves or fades,” explains Mavromates. It is preferable to color correct the A and B segments of such transitions separately–particularly for dissolves from, say, a night shot to a day shot. “If you have the dissolve pre-built, it’s hard to make a smooth transition from one color correction to the other. So we like to pre-color correct those elements and then marry them in the DI.”
The production also took advantage of the skills of DTS Digital Images (the former Lowry Digital Images), known in the industry for its abilities in film restoration, to produce a uniform level of grain throughout the picture. “We have some scenes shot with almost no light in which the grain seems more apparent than in shots captured with more light,” Mavromates says. “So we did some testing with DTS and liked the way their process evened out any noise issues and helped sharpen the image. They noise-reduce everything and then put in a base level of grain throughout. So the movie’s very, very consistent looking.”
DTS also processed some visual effects plate shots prior to compositing at Digital Domain. “The bluescreens in some shots had grain structure, so putting them through this process gave DD cleaner plates to work with.”
Conforming with DPX files made the handoff to TDI that much simpler, since the company’s own pipeline is based on DPX files, though a workaround was required to make adjustments for file header inconsistencies.
Having well-captured images also made Stephen Nakamura’s job easier. “When I work with David and his cinematographers, I usually don’t have to struggle to color correct his movies,” says Nakamura, who has worked with the director for more than 10 years. Testing with Savides prior to the film’s shoot, Nakamura says, certainly paid off. “When I work with good cinematographers, it makes my life easier. We can be creative instead of trying to fix something that looks bad to begin with. That gives us creative freedom, when we have good cinematographers who can expose shots correctly, like Harris did.”
DPs often think using the Viper means a free-for-all on exposure–something Nakamura warns against. “Everyone thinks, ‘Oh, my God, these cameras are so great. I can just shoot with no light,’ which is not necessarily true. If they shoot in low-light situations, they have to make sure their exposure is correct. They can’t just run-and-gun these cameras and expect it to look the way they want.”
In certain situations on Zodiac, shadows were intentionally left dark, even though the data existed to open them up. In a scene in which Robert Graysmith (Jake Gyllenhaal) is shown around a basement by the creepy Charles Fleischer, the shadows were left as shadows. “A lot of times, it’s not detail that was important to David. He never cared to see some of the shadow detail that might have been down there,” Nakamura says.
Working with a DI seems an instinctive way to finish a film, Fincher says. “The DI is such a natural extension of what we’ve been doing for 20 years in commercial postproduction. On my first movie, it was so painful to go into a room and kind of approximate what the image should be and say, ‘Well, that’s about as close as we can get it.’ That just used to freak me out.”
The same goes for working in HD with a tapeless workflow. “We have to be thinking this way because this is the technology that will help give us our freedom. If we can make movies cheaper, we’re going to be able to do things the way we want to do them, as opposed to the way they’ve been done a million times before.” Though viewing captured images instantaneously may not be for everyone. “There are a lot of people who really like the mystery of ‘I wonder if I got it? I’ll find out tomorrow.'”