Director David Fincher insists he has deleted videotape from his professional life for good. “I've hated tape for a long time — all the nonsense that comes with tape,” he declares. “I wanted to shoot HD since we made Panic Room, but at the time, Sony Pictures warned us that HD wasn't reliable enough to shoot a major feature — which I thought was ironic, since their parent company makes HD cameras used to shoot motion pictures. We couldn't get them to provide HD projectors to do dailies [at the time], because we shot three-perf and telecined everything, and ended up getting our dailies on HD VHS tape. That's how far back that was. But it's probably just as well, because now we can do it without tape altogether, which is far better anyway.”
A couple of years ago, Fincher concocted a strategic plan to R&D a tapeless workflow built around Thomson Viper FilmStream cameras recording to D.Mag Digital Film Magazines (from S.two of Reno, Nev.) on a series of five commercials. He has now applied lessons from those projects to a full-length feature film for the first time with Zodiac, a look at the men and the media circus surrounding the hunt for the famous San Francisco Bay Area serial killer in the 1970s. The movie, shot by Harris Savides, ASC, is slated for release in the next few months from Warner Bros. and Paramount. It's believed to be the first full-length studio feature film shot and produced entirely as data from start to finish, with no physical media involved beyond backing up all raw imagery to 500 vaulted LTO data tapes during postproduction.
“This is a literal tapeless process, because we record to hard drives, shuttle them back and forth to the edit room, where we load the data, back it up, and convert it to editable media,” says Peter Mavromates, one of the film's producers. “No tape is involved anywhere in that process. Furthermore, when we deliver images to visual effects companies, we pull computer files and put them onto hard drives, and send them to vendors that way. The only tape is data tape, which we use to back up original camera files, but there was no safety net when we were shooting on set in that regard, because even that backup happens in the editing room, after the shoot day — not during production. For a feature film, this is all very new.”
Fincher insists the whole process was fairly simple and straightforward in most respects, even though many of the tools his team used were still evolving and being tailored to his specific needs as the project got underway. The biggest challenge, he says, involved grappling with a studio and industry culture that tends to see the removal of physical media as an impediment to their security and long-term archiving goals.
“It's about getting people to wrap their minds around change,” he says. “The studios, for instance, often had what, for me, was a surreal response early on. They were trying to understand who, exactly, would take the digital media from the set and get it cloned and archived safely for them. I said, ‘The same, totally underpaid PAs who normally take your film from the set, in the middle of the night, to the lab. Now, instead, they will be taking an anvil case with a D.Mag in it back to the editing room.'
“When we started working on it, a lot of people had trouble understanding what we were doing in that sense. People were worried when I told them we could do this with fewer people, which is great from my point of view. They were freaking out, saying, ‘Fewer people will be employed,' and all that. I was saying, ‘No, it means more movies can be made.' The same people can be employed, but working on more productions. We can bring the cost of movies down, and give more people, not fewer, rolls of the dice. But some people get it and some don't, at this point. From our point of view, though, we're very happy with how things turned out.”
The movie's basic workflow started with shooting in uncompressed, 10-bit, 4:4:4 color space, in the 1920/1080p HD format, with six Viper cameras outfitted with Carl Zeiss DigiPrime lenses. The cameras were cabled directly to a revolving group of 20 D.Mag units, which ingested the imagery and corresponding metadata as DPX files. Those D.Mags were then routed back and forth to the project's editorial offices, where they were loaded through a chassis-like connector called the F.Dock into a massive SAN. Data was then backed up to two separate sets of LTO tapes, which were then quality-checked before cloned data was down-rezzed to the DVCPRO HD format and loaded as QuickTime files via FireWire into editor Angus Wall's Apple Final Cut Studio system for editing.
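That dailies path can be sketched as a handful of stages. The following Python sketch is purely illustrative; every function name and file name is hypothetical, and the real pipeline ran on S.two hardware, LTO drives, and Shake:

```python
# Illustrative model of the Zodiac dailies path described above.
# All names here are hypothetical stand-ins for the actual tooling.

def ingest_dmag(dmag_files):
    """Load DPX files from a D.Mag into the SAN (modeled as a list copy)."""
    return list(dmag_files)

def backup_to_lto(san_files):
    """Write two independent LTO tape sets and return both for checking."""
    return list(san_files), list(san_files)

def verify(san_files, tape):
    """Quality-check one tape set against the working copy on the SAN."""
    return tape == san_files

def downconvert(san_files):
    """Down-rez the 1080p DPX frames to a DVCPRO HD QuickTime dailies clip."""
    return {"codec": "DVCPRO HD", "frames": len(san_files)}

# One shoot day's turnaround:
san = ingest_dmag(["A001.000001.dpx", "A001.000002.dpx", "A001.000003.dpx"])
tape_a, tape_b = backup_to_lto(san)
assert verify(san, tape_a) and verify(san, tape_b)
dailies = downconvert(san)  # handed to Final Cut Pro as QuickTime media
```

Only after both tape sets pass verification does the source D.Mag go back to the set for erasure, which is the step the next paragraph describes.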
Once a stringent verification process was completed on data from each D.Mag, the hard drives were returned to the set, erased, and re-used. The editorial team relied on close to 40TB of Apple Xsan storage during the cutting process, and Wall's team distributed digital dailies each day using the Internet-based PIX (Project Information Exchange) system for remote collaboration on the evolving pieces of the movie. (See p. 21 for an in-depth look at how PIX was used on the project, and on the issue of remote collaboration generally.)
Filmmakers add that, at press time, offline color correction and an initial conform were underway during the editorial process, conducted by Wall using Final Cut Pro, applying a special LUT to layer the movie's basic color scheme into the images for dailies viewing. Editors used Apple Shake compositing software to render the imagery during the editing process for that purpose. Data from the LTO tapes, at press time, was slated to be used next for a re-conform and final color pass on the eventual digital negative. (For more details on this workflow process from Andreas Wacker, a consultant on the project, go to digitalcontentproducer.com.)
From there, finally, film prints will be struck. “We live in a world where we still have to exhibit on film, at least for now,” Fincher laments.
There were, obviously, hiccups along the way, and at press time, Fincher and his colleagues did not have their final product in hand, but the director says developing this process was more than worth it in terms of letting him work the way he has wanted to work for many years, sans videotape.
“We had seen the D.Mag hard drives work with our commercials, and we decided that was the way to go, because of random access and the ability to constantly review it,” Fincher says. “It never made sense to me to have a 4:4:4-capable camera that records to any kind of compression on a tape format that you can't immediately play back. Then you need a BNC cable going to an NTSC deck to record your videotape just so you can have VTR playback. That always seemed insane to me. If you have a stable and reliable platform on which to record, and immediate replay of what you just did in 1920/1080p, 4:4:4, why would you even consider putting a tape deck on top or on back of your camera? A 35lb. camera, like [Panavision's] Genesis, with a tape deck stuck on the top, just wasn't something I was going to embrace.
“Plus, we wanted this to be a widescreen movie, and Viper has a nice way of dealing with anamorphizing the 16×9 pixel array to give us full use of 1920/1080p across the 2.37:1 anamorphic aspect ratio. We really wanted to get as much resolution out of it as we could, since I did not want to crop the frame top to bottom. Why start out with a 2K image, and then throw a third of the frame away?”
Still, Fincher's workflow was so new that many of the tools plugged into his pipeline were not designed specifically for the way he wanted to work. Thus, his team worked with S.two engineers to upgrade the technology to meet Fincher's specifications. For instance, Fincher wanted to have the option to delete takes on set, despite the fact that the hard drive system he was using was designed specifically to prevent image deletion during production. Fincher also wanted to add an automatic slate function to the technology.
“Initially, the hard drives were a little cumbersome, and we had to make it work the way we wanted it to work,” he says. “Over time, we designed a lot of stuff into it. They designed the system to protect you by preventing deletion of images on set, for instance. But I like being able to delete stuff. If a take is a disaster, I want to be able to get rid of it and start again. So, they built that capability into the system for us.”
Fincher adds, “We saved about 30 minutes a day by not having [physical] slates; plus, you almost never have to stop and reload. We probably reloaded about 30 times over the course of 120 days, at the most. [Actor] Robert Downey, Jr., said to me that he had never been on his feet so long on a set, because we rarely had to stop cameras.”
Mavromates elaborates that the addition of the self-slating capability to the system was actually a crucial development.
“David asked for the ability to automatically slate footage,” Mavromates says. “We did not have a clapper on the set. So what they added was a capability for the data-capture person who throws the hard drives into record mode to enter basic information for a scene and whatever is recorded. The first frame is a framing chart and the next five frames are a slate, automatically inserted, and then it goes to what he filmed. And the feature also means the data-capture person does not have to do anything from the second take onward. The last frame of the take provides a window to record other information, like lens height, lens length, aperture, and other conditions — those get burned into the very last frame of the take. That's important for tracking purposes in editorial, and it functions the same way a traditional clapper would. It's just that our slate is digital, not physical.”
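As a rough model of that layout, the sketch below maps a take's frame count to the regions Mavromates describes: a framing chart on the first frame, five auto-inserted slate frames, then picture, with metadata burned into the last frame. The function name and return format are assumptions for illustration:

```python
# Illustrative model of the automatic slate layout described above.
# Frame indices are zero-based; ranges are inclusive (start, end) pairs.

def slate_layout(total_frames):
    """Return the role of each frame region in a recorded take."""
    if total_frames < 8:
        raise ValueError("take too short to hold chart, slate, and picture")
    return {
        "framing_chart": (0, 0),            # first frame
        "slate": (1, 5),                    # five auto-inserted slate frames
        "picture": (6, total_frames - 2),   # what was actually filmed
        "metadata_burn_in": (total_frames - 1, total_frames - 1),
    }

layout = slate_layout(240)  # e.g. a 10-second take at 24fps
```

Because the regions sit at fixed offsets, editorial tools can find the slate and burned-in metadata of any take without a human marking it, which is what replaces the physical clapper.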
As for the archival issue, Mavromates points out that when the project is finished, Zodiac will have both a pristine digital negative and a film negative, and it will also exist in a variety of other formats. “[This method of filmmaking is] a pretty safe way to ensure there will be data to use in the future,” he says.
“There is the danger, of course, of technology getting obsolete, but we're better off, certainly, than movies done many years ago because, at the end of the day, we are creating so many high-quality masters,” Mavromates says. “We will have about six original digital negatives for this movie, the original data, different HD versions, a film master, and all those HD DVDs out there. They'll be able to re-master this movie some day, if they want to — they won't have to piece it all together from scratch. To me, this is the future of how movies will be made. Digital cinematography is still in its adolescence, and it will mature very quickly, so our workflow will only improve along with it.”
That is certainly Fincher's firm conviction.
“We could lose data some day, but let's be honest — that's always been the case with film, as well,” Fincher says. “Somebody find me a good print of Lawrence of Arabia, or a decent restored print of Rear Window. Everyone says we won't have the resolution of 35mm, but the truth is, 35mm is maybe 4K, and that's before they do things to it. You have all this color space with film, but you don't ever use all that color space. As soon as you drop an orange filter over [the lens], you have suddenly limited your blue and green color space, for instance. And by the time you dupe it to inter-positive, then to inter-negative, and go to three dupe negs or six dupe negs, and make 3,000 release prints, then you are looking at something, in most cases, just over 1K. So I think it's silly to get attached to [film] like that.”
Postproduction: the word implies a separation where there is first one phase and then another. In between the two, traditionally, there has been a film lab. That, of course, is often no longer the case. The transition into various current digital workflows has, in my view, happened in a much less binary and dramatic way than some might suggest. Years ago, for instance, 3D technology—a traditional postproduction toolset—jumped ahead in the typical production timeline of the movie-making process and called itself “previz.”
In other words, the movie-making workflow equation has always been evolving and, now, merely continues to do so. Thus, for Zodiac, we tried to do what you can do in 2006. Therefore, I don't see our work on this movie as new or revolutionary. Rather, it is the natural extension of the simple fact that this movie never existed in any form of analog media or even videotape, for that matter. Zodiac is a digital movie shot with Viper cameras and stored on hard drives all the way through the movie-making process. Although it all sounds complicated, at the end of the day, this approach worked extremely well for us and for the creative needs of the project as delineated by David Fincher.
Still, there is one main reason that Zodiac's workflow worked as efficiently as it did—the team of people Fincher assembled. If you do something for the first time, your choice of tools and programs might appear to be crucial on a certain level. But, in reality, your only chance is to assemble a dedicated team. Editor Angus Wall's company, Rock Paper Scissors (RPS), was extremely lucky in this respect. Dedicated professionals such as assistant editors Wyatt Jones, Pete Warren, Brian Ufberg, and Brad Waskewich saved the day—several shoot days, actually. While it might look fancy to draw boxes with arrows to illustrate how the bits have been flowing, Zodiac showed again that people matter above all.
RPS not only picked up the editorial work on the movie, but also took on the job of handling the digital negative. Technically, of course, this “negative” is just a collection of 18,220,156 DPX files.
Since any negative—digital or film—contains the final product, it needs to be made secure along the way. For me, personally, this was the scariest part of the job. With a can of film, you know what you can and cannot do. Countless movies have been stored on those thin plastic strips called film over the last 100 years. Those negatives, of course, can get scratched every time you handle them, and yet, you have to work very hard to create a total loss in the analog world. In contrast, it can take only one simple command to erase terabytes of data, which could represent an entire movie.
To make sure that didn't happen to us, we created two sets of data tapes and stored them in secure locations. Before the tapes were vaulted, we verified each one against the working copy of the DPX files. You might think working with redundant files all the way through is much safer than working with a traditional camera negative, but in fact, the potential for human error is much greater.
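One common way to implement that kind of tape-versus-working-copy verification is a per-file checksum comparison. The sketch below (Python; the directory layout and function names are hypothetical) shows the general idea, not the actual tool RPS used:

```python
import hashlib
import os

def file_sha256(path, chunk_size=1 << 20):
    """Hash a file in 1MB chunks so large DPX frames never sit fully in RAM."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_copy(working_dir, backup_dir):
    """Compare every file in the working copy against its backup twin.

    Returns the relative paths that are missing or mismatched; an empty
    list means the backup is byte-for-byte identical.
    """
    bad = []
    for root, _dirs, files in os.walk(working_dir):
        for name in files:
            src = os.path.join(root, name)
            rel = os.path.relpath(src, working_dir)
            dst = os.path.join(backup_dir, rel)
            if not os.path.exists(dst) or file_sha256(src) != file_sha256(dst):
                bad.append(rel)
    return bad
```

Run against a restored tape copy, an empty result is what clears a D.Mag to be erased and sent back to the set.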
Since the Zodiac data setup essentially replaced the film lab, it had to work at the same pace. Each day's data load had to be processed by the end of that day—regardless of how much material had been shot on any given day. Downtime was simply not an option. Since we had a limited supply of the S.two D.Mag digital field-recording devices, a delay in processing data would not only prevent people from viewing dailies in a timely fashion; it would slow virtually the entire project to a crawl. After the exhaustion of a short safety buffer, the shoot would simply have stopped if the RPS data stream became blocked for whatever reason. In postproduction, there are normally a few deadlines on any given project. Some can be moved a little, some cannot. On this movie, there was a daily deadline, and it was a deadline that could not be moved or extended.
Therefore, in parallel to the backup process, we triaged data and corrected, flagged, and tagged various errors that occasionally happened during the shoot and acquisition phase. Once data had passed this gate of digital purification, we could depend 100 percent on naming conventions and other standards that we established for the show. The mechanical analogy would be to keep dirt out of the gearbox. Those digital seals might look like a lot of work, but the absence of grinding noises was well worth it. Once you can depend on the consistency of data, you can freely rearrange it to automatically cater to any upcoming needs.
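The naming-convention gate might look something like the following. The article does not spell out Zodiac's actual conventions, so the pattern here (camera letter, load number, scene, take, frame counter) is purely an assumption for illustration:

```python
import re

# Hypothetical DPX naming convention of the kind described above.
# The real Zodiac standards are not public; this pattern is illustrative.
DPX_NAME = re.compile(
    r"^(?P<camera>[A-F])"   # one of the six Viper cameras, A-F
    r"(?P<reel>\d{3})_"     # D.Mag load number
    r"(?P<scene>\d{3})"     # scene
    r"(?P<take>\d{2})\."    # take
    r"(?P<frame>\d{6})"     # frame counter
    r"\.dpx$"
)

def triage(filenames):
    """Split incoming files into a clean list and a flagged-for-correction list."""
    clean, flagged = [], []
    for name in filenames:
        (clean if DPX_NAME.match(name) else flagged).append(name)
    return clean, flagged

clean, flagged = triage(["A001_01203.000042.dpx", "scan_final_v2.dpx"])
```

Anything in the flagged list gets corrected before it passes the gate, so every downstream tool can parse camera, scene, and take straight out of the filename.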
Since we had data management and editorial running in parallel to the shoot, we created QuickTime editing files directly from DPX files. Our code for this generated command-line Shake scripts that we rendered on all available computers. The main reason for this design choice is that Shake has a high-quality resampling algorithm. When you compress material, a decent-looking source image without any resizing artifacts allows modern codecs to allocate bandwidth for picture content more wisely. Another compelling argument for Shake was the simple fact that its command-line renderer is free.
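Farming those renders across all available computers amounts to splitting each day's frame range into per-machine chunks. In the sketch below, the chunking logic is the point; the exact `shake -exec ... -t ...` invocation is an assumption about the renderer's command line, not a confirmed detail of the RPS scripts:

```python
# Sketch of distributing a DPX-to-QuickTime render across machines.
# The chunking is generic; the Shake command template is an assumption.

def chunk_frames(first, last, workers):
    """Split an inclusive frame range into near-equal per-machine ranges."""
    total = last - first + 1
    base, extra = divmod(total, workers)
    ranges, start = [], first
    for i in range(workers):
        size = base + (1 if i < extra else 0)
        if size == 0:
            break  # more workers than frames
        ranges.append((start, start + size - 1))
        start += size
    return ranges

def render_commands(script, first, last, workers):
    """One render command per machine, each covering its own frame range."""
    return [
        f"shake -exec {script} -t {a}-{b}"
        for a, b in chunk_frames(first, last, workers)
    ]

cmds = render_commands("dailies.shk", 1, 1000, 3)
```

Each command can then be dispatched to a different box, which is how a heavy shoot day still finishes before the daily deadline.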
Since the Viper's digital negative has no “look” applied in-camera, we had to create a system to apply lookup tables (LUTs) that would give everybody a consistent preview of the actual material. Angus Wall set David Fincher up with a couple of basic looks that we then converted and distributed for the on-set preview. The same looks were used in all edit media and online reviews.
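In general terms, a 1D preview LUT is just a table mapping input code values to output code values, applied per channel. The sketch below uses an arbitrary gamma-style curve as a placeholder; it is not the look Wall actually built:

```python
# Generic 1D LUT illustration for the preview step described above.
# The gamma curve is an arbitrary placeholder, not the Zodiac look.

def build_lut(gamma=0.45, size=1024):
    """A simple gamma-style 10-bit LUT: code value in, code value out."""
    top = size - 1
    return [round(top * (i / top) ** gamma) for i in range(size)]

def apply_lut(lut, pixels):
    """Map each 10-bit code value through the table."""
    return [lut[p] for p in pixels]

lut = build_lut()
preview = apply_lut(lut, [0, 256, 512, 1023])  # brightened for viewing
```

Because the same table can be loaded on set, baked into edit media, and used for web review, everyone judges the flat camera negative through an identical preview transform.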
While we processed the actual audio and video data each day, we also built a system of metadata for each shot. Script notes and meta information from the digital acquisition system were transferred into a web-accessible, editable database, developed specially for this movie, that tracked every shot. Keeping the director's comments physically close to the content was an important goal, and it allowed for a smooth editorial workflow.
Dailies had to be delivered to the studio, of course, so the PIX (Project Information Exchange) system was used as a web-based solution for this. Since our data sets had been purified, so to speak, in the triage process, PIX simply scraped the RPS shots database to gain access to the same set of meta information. “Web 2.0” is an overused buzzword for a wide range of new web-based services and capabilities, but in our RPS/PIX collaboration process, we definitely used Web 2.0 techniques, not because they are cool or hip, but simply because it was the simplest and most efficient way of accomplishing what we needed to do.
The working storage for the movie is an Apple Xsan installation—about 40TB. It did a great job, yet terabytes are not impressive to me. You basically just buy more storage and have a larger disk system. That's as exciting as a stretch limo, in my view. Xsan performance can be good or mediocre, depending on how you write your code. Much like Shake, Xsan is not an Apple product in the way that an iPod is an Apple product. It's professional, yet Apple is a bigger company than your average professional software or hardware manufacturer, which has positive and negative aspects. On the one hand, the company is always going to be there for us, but on the other hand, in a small boutique, it might be easier to locate and collaborate with the actual person who wrote some slightly misaligned feature that needs to be corrected.
At press time, our monitoring setup was a 30in. and 23in. Apple dual-screen LCD combo. While a 6.5-megapixel screen configuration sounds ample, we actually felt that an even bigger, dual 30in. setup would benefit us the most. The machines do lots of things simultaneously, all the time. More information is better, and therefore, it was worth the money. I believe the people you work with directly are much more important than whether you have racks full of shiny equipment. Basically, it's all scrap metal in three years anyway, while happy people can still be around—and productive—for many years.
In a broader perspective, our Zodiac workflow is based on simple and reliable ingredients: DPX files and their corresponding metadata, a web-accessible database, a decent amount of Xsan storage, and so forth. While the movie was shooting, we optimized the pipeline constantly where we could, without risking any workflow issues. Since the individual steps in the data chain worked reliably, we were able to scale it up in an efficient way.
The following might serve as an example of this applied attitude. At one point, we ran out of render power because the shooting pace picked up significantly. After a quick test evaluation, we scrapped a plan to deploy Xserves and picked up some Mac Minis instead. Those boxes ended up being twice as efficient per dollar as their fancy “real computer” counterparts.
I believe that equipment should be as cheap as possible, but no less capable than what you need. With commodity hardware becoming so powerful these days, the range of tools and machines that could be deployed expanded significantly on this project.
Constantly adapting our software to the needs of this specific production also worked out well. We simply created features as the need for them came up during production. The underlying structure was sound enough to cope with all these revisions, up to a point anyway. Many of the broader changes and features that we figured out would be good for our workflow, we will only deploy on our next movie. We didn't need them this time around, or we figured them out too late to fully implement them. Still, Zodiac was a great start in all this, and I admit I'm a bit amazed that it all worked—and happy, too.
Andreas Wacker is a postproduction consultant and technology expert, specializing in data flow, and he helped put together the methodology employed on Zodiac. He has worked on dozens of movies and visual effects projects over the years, and was previously a co-founder of the visual effects facility, Method Studios.