Pro Tools on Wheels
HD Production: Then and Now
A major concern in using HD cameras to produce Rescue Me was capturing practical fire footage, which can create exposure problems. But DP Tom Houghton uses a variety of techniques to mitigate that issue. Here, actor James McCaffrey copes with a typical blaze on the show.
All Rescue Me Photos by Craig Blankenhorn. Courtesy FX and Sony Pictures Television
Denis Leary joined the ranks of high-definition acquisition aficionados long before his award-winning FX Network series about the lives of firefighters, Rescue Me, now in its second season, went into production. His first experience with the HD format took place in 1999 when he co-starred in a low-budget feature thriller, Final, directed by Campbell Scott and released in 2001.
“Campbell had a very small budget,” Leary recently told Millimeter during a break from filming one of this season’s final Rescue Me episodes. “He told the cast he wanted to shoot HD and warned us going in that we would need two full weeks of rehearsal because, with the high-def format, he was planning to shoot on quick turnarounds, pausing only about 15 minutes between lighting setups. He said he would shoot about 10 to 12 pages of dialogue a day. I thought he was out of his mind. But that’s what we did. As an actor, it killed me — it was very grueling. But as a producer, I realized this was great, and I decided that if it looked as good as he said it would, forget about film and just shoot entire movies in HD. We shot that movie in about 20 days, and I thought the final product was terrific. I knew then the cameras would have to get better and there were some problem areas, but I figured this would be the future.
“When I got to the questions of budget and shooting schedules for Rescue Me, I was not only eager to get into HD — I was convinced that if we got the right DP and the right approach, we could make it work. I worked with Clint Eastwood a long time ago, and I used the occasion to pick his brain, and one thing I never forgot is something he told me. He said that if people are shooting 16 hours a day and doing 20 to 30 takes on the same shot, that’s because they haven’t done their homework, or they are jerking around on set. If you adopt that mentality, then HD becomes your best friend. You do lose some things going without film, but you also gain things, and it all balances out.”
Kerry Orent, one of Leary’s producing partners on the show, is even more straightforward about the show’s use of HD. In fact, he makes no bones about it: The show relies on HD primarily for budgetary reasons.
“We investigated other alternatives, like 16mm, but the truth is, we went with HD because it was cost-effective, and because of our limited budget, we addressed that even before creative issues,” says Orent. “At that point [two years ago], enough people were shooting HD that we were confident we could pull it off, and that we wouldn’t be starting from scratch. And, of course, it has proved to be a substantial savings for us. That was our thinking right from the show’s inception.”
The switch to HD is rampant these days in broadcast television on the multi-camera sitcom front, but for single-camera, hour-long episodic dramas, the commitment to HD remains the exception rather than the rule (see “Breaking the Comfort Zone,” Millimeter, March 2005, for some other examples). Leary, Orent, and all of their colleagues at Rescue Me, however, expect the paradigm to shift away from film for such shows — and toward HD workflows similar to theirs — sooner rather than later.
“Based on our experience on this show, including pushing the limits on how we shoot — mainly handheld, and very fast — I would have to say I don’t see a good reason not to shoot HD for television,” says Orent.
Leary, meanwhile, doesn’t even think of it in terms of television production. To him, it’s all filmmaking, and ironically, that now means digital filmmaking as far as he is concerned.
“Whether it is TV or a movie, this is how I want to shoot,” he says. “I’m producing a movie in the fall that will be shot in HD. I’m now used to it. I have a crew that is used to it. We have great speed shooting this way — easier lighting, all that stuff. Coming from an acting background, I think that keeps actors fresh, and as a producer, it allows us to do more with our money and time.”
Series creator/producer/star Denis Leary (center) and co-star James McCaffrey (leaning on truck) are captured on tape by camera operator Jamie Silverstein (far left) and operator Gabor Kover (far right), during taping of a scene from the show. Typically, action is captured handheld, using Sony HDW-F900 cameras.
Fast and Furious
Because Rescue Me is a Sony Pictures Television production, it was decided early on that the show would rely on Sony camera technology, although producers say they were permitted to test other cameras as well. They settled on Sony’s HDW-F900, which has become something of a workhorse in the TV world in recent years, relying on Canon HDXS JF22ex6.6B IRSE and Canon HDXS HD22ex4.7B lenses, supplied by Video Equipment Rentals, New York.
DP Tom Houghton, who took over the show in its first season after Jonathan Freeman shot the pilot and the first two episodes, says that the show’s look is not meant to be pretty, and, given how they shoot, its production setup is not that complex.
“We often have two cameras, mostly handheld, and generally shooting as fast as possible, since that is how Denis envisioned the show,” says Houghton. “It’s not glamorous, except maybe for occasional romantic scenes. It’s about a firehouse and the firemen and the fires themselves, so we wanted it to look spontaneous and rough. That’s a look you can get in 16mm, which, I believe, is how they shoot [FX’s police drama] The Shield. But what is interesting is that look also lends itself to the HD medium, and on our show, I think it fits together nicely. In the two years we’ve been doing the show, we have gotten a nice rhythm going. And since 90 percent of it is handheld, that is kind of an evolution for HD production — it used to be complicated to do lots of handheld work with this technology.”
Orent adds that issues that might have bogged down HD technology in the past on a show of this nature — bulky cabling, the need for an extensive video village, on-set color correction, and a digital imaging technician — are no longer obstacles to making Rescue Me fit Leary’s vision.
“Denis really wanted us to shoot quick and look good, without a lot of technical things going on,” says Orent. “So we ended up very freewheeling. We only have one cable, running to the [30in. Sony HD monitor]. We don’t have a color correction guy on set, none of that stuff. It’s a lot like a film shoot actually. Right off the bat, we threw out the window the notion of having a complicated cabling situation with tons of equipment. It just wouldn’t work for this show.”
Instead, Houghton simply sets the look in camera. He says, “[We] rarely [mess] with the menus other than for global color correction, tungsten, daylight, or whatever.”
Operator Gabor Kover lines up a shot as writer/producer/director Peter Tolan looks on.
“Our method,” he says, “is to do modest color correction in the camera, somewhere between tungsten and daylight usually, and mix the lights quite a bit. I do most of the colorizing with gels and lighting selections, to be honest. And then, the whole thing gets sent to Modern Video Film [Burbank, Calif.], where our colorist [Kim Schneider] adjusts it based on our discussions, his knowledge of the show, and input from the producers.”
Leslie Tolan, the show’s postproduction producer, adds that the show’s color palette has evolved a bit in the past year. “We have established that, generally, we don’t like warm scenes — it’s a very cool show in terms of colors,” she says. “But once we get inside the firehouse, we do saturate it a bit and put more blue into it, since that is the firefighters’ own little world, and that makes it a more sterile environment. Our colorist is well acquainted with our approach and does a great job, so the process is not that much different than if we were doing it on film. The only difference is we get our dailies faster, and we can see the images from different locations and give input from wherever we happen to be.”
Tolan supervises the offline editing process on Avid Media Composer systems (v. 7.1) at her office in Pasadena, Calif., and once again, she says the HD pipeline has become so efficient and commonplace that the approach hardly deviates from how things would be done on a film show.
“The big change is that, last year, we had two editors, but we found that was a really grueling schedule for two guys considering the amount of footage we shoot with the HD cameras,” she says. “So this year we have three editors, and each editor, at all times, is working on two shows. But when the material comes to the cutting room in a DVCAM format for editing, the process is pretty much like doing it with film-transferred material, except that we don’t have dirt or scratches to fix. Really, a master is a master, and whether it comes from a telecine and a DAT or from HDCAM, we are still cutting the same basic thing during editing. We load it into the Avid, edit it, and send [Modern Video Film] our EDL for the mastering process. We end up with HD masters, and they look beautiful, but the steps we take to get there are not that unusual.”
Grip Scott Eberle (left) helps set up a car rig, as DP Tom Houghton (seated) and gaffer Scott Ramsey (right) discuss the sequence.
The bigger challenges usually happen on set, according to the filmmakers. In particular, given the show’s subject matter, there was some initial consternation a couple of years ago about capturing practical fire footage convincingly with the HD cameras, as well as about adding visual effects where necessary.
Houghton says, however, that he has been pleasantly surprised by how well fire sequences have turned out on the show. He adds that he has learned a great deal about shooting such effects with HD cameras.
“You really learn a lot about what this format can take in terms of exposure and latitude shooting stuff like this,” says Houghton. “For fires, we have very few CGI embellishments most of the time. Denis wanted to show the experiences firemen really go through in real fires. He wanted to show dark, smoky environments, as real firefighters have described it to us. They are disoriented. They are crawling around. It’s dark, smoky, and confusing. So that approach keeps us away, most of the time, from having to shoot real bright fires. We do shoot real fires, under control, of course. We had a fear of exposure issues relating to fires, and that is still valid. If you are shooting a big explosion or something, you could overexpose at a certain point if you aren’t careful. But generally, you just learn the best ways to do it.
“For instance, if you zoom into the middle of the fire, it will white out on you. But when you are shooting a fire scene in daylight, which is what we usually do, your base light level is the daylight itself, and therefore, the fires do not overexpose easily. One trick I learned was to evaluate how much the fire is dominating the frame overall. Then, you stop down with it. Plus, with HD, you have the advantage of looking at the fire on a high-end monitor, and so you can adjust and make judgments as you go.”
Houghton adds that HD permits him and his crew to work at a near-frenetic pace, which matches both the creative design for the show and the limitations of its schedule and budget.
“You can shoot quickly with film, too, obviously, and with HD you do have to set up the cables and the monitor, but the fact that I can get 30 to 40 minutes on a cassette and still see what I’m getting on the monitor right away is a blessing,” he explains. “I talk to the assistants by walkie-talkie so that I can stay at my monitor, and that allows us to quickly communicate about adjustments on the fly. It’s true that I don’t feel quite as close to the work as I would on a film job, but the camera viewfinders and monitors can’t help me much with my decision-making, so I need to use the big HD monitor as my reference. It has become my viewfinder in a sense.
“Another thing I’ve learned, though, is that we can get away with a very high lighting ratio for effects shots. We have periodic dream sequences that we shoot that way, and I was surprised by what kind of ratios we got away with. I thought, going in, that I would not be able to use such a strong range of lighting ratios. Usually, except for some effects, I don’t use too many filters because this is a realism-based show. But the filtration I do use for effects — to blow some windows or get halos around things — can come in real handy. Using light and filters this way, I can get a range of effects on set, like a shaft of light or a halo around someone’s head, very easily now. HD works great for that stuff.”
Leary, Houghton’s boss on the project, thinks HD works great for all kinds of stuff. He concedes, “There is still the occasional shot or stunt or slo-mo effect that works best with a film camera.” But, that said, he thinks that film cameras will eventually become specialty tools on HD projects, rather than the other way around.
“That’s the future,” he says. “People can resist it. But I think most people resisting it just don’t want to do their homework on HD. In the future, I predict, you’ll keep a film camera in your truck for a special shot or slo-mo stunt, but it will be strictly a backup. HD will be the main production tool.”
Pro Tools on Wheels
For all its state-of-the-art HD pipeline, an equally important technical innovation coming out of Rescue Me might well be the show’s liberal use of remote collaboration techniques and tools — particularly on the audio side. Leslie Tolan, the show’s producer in charge of postproduction, reports she no longer needs to be tied to a Todd-AO, Burbank, Calif., sound stage to supervise and approve the show’s final mix. That’s because her colleague, composer Chris Tyng, developed a remote, web-based audio review and playback system for the show — sort of Pro Tools on wheels — that permits Tolan to sit in on mix sessions in realtime and collaborate with Todd-AO artists, all from the comfort of her summer home on Cape Cod, Mass.
“It’s amazing that I can be connected to Burbank from the Cape, or when I’m in some other location, and hear and see everything they hear and see on the mixing stage and give them instant notes, just as I would if I was in Burbank,” Tolan says. “We usually have a two-day mix for each episode, and around the middle of the second day, Burbank time, near the end of my day [on Cape Cod], we’ll have a live run-through, which I can fully participate in by connecting to the Internet and using the system Chris designed. I instant-message notes right to the stage, and they can rewind, make my requested changes, and let me hear the changes the same time they hear them. It’s pretty cool.”
Tyng hasn’t gotten around to naming the system — he and Tolan each have one — which consists entirely of existing tools that he has configured into a small, mobile system that can easily travel wherever its user goes. He packaged all the components into a specially configured, shock-mounted Stanley toolbox on wheels, and the entire system can be operated right out of that case.
He says the system consists of two Apple Mac Mini computers, one of which runs a version of Pro Tools and one of which operates a nonlinear video playback software tool called VVTR, made by Gallery Software, London. Also part of the package: a couple of FireWire drives and a Canopus ADVC-100 FireWire digital video converter.
“All these pieces are small but powerful, and the whole thing lets us roll the Stanley toolbox into any room with a good broadband connection — and within a half hour, we have a remote connection to the dub stage,” Tyng explains. “It’s basically an ISDN connection over IP. Last summer, we had an ISDN connection at a local inn [on Cape Cod], but this way, we can do it remotely with a regular broadband connection. The key for the whole thing is a [new, recently released] plug-in for Pro Tools called Source Connect [from a company called Source Elements]. That kind of plug-in, the small Mac Mini computers, and other things are all examples of recent technological developments which are freeing professionals from all having to be in the same place to collaborate on projects like this one.”
Tyng adds that the Source Connect plug-in shakes hands with an identical system at Todd-AO, allowing the company to stream stereo audio directly from the stage, with SMPTE timecode carried over a separate stream.
“We can easily synch up sound and picture so that we can view and hear the exact same thing as the people at Todd-AO,” he adds. “Usually on DVD, we get a copy of the online edited version of the show sent to us, and our video software then lets us digitize that copy of the online picture into our remote system. The Source Connect plug-in lets us synchronize the streaming audio with that picture at our remote location. Essentially, the two [picture and sound] are able to instantly synchronize because the software streams us stereo audio and, separately, the timecode stream. Pro Tools has a SMPTE reader, and when we lock Pro Tools to the incoming SMPTE signal, it, in turn, locks the virtual VCR on the Mac Mini computer to Pro Tools. In other words, from thousands of miles away, we are able to slave to the master audio coming from the dub stage in Burbank.”
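The timecode lock Tyng describes — a remote playback system slaving to the master audio coming off the dub stage — can be sketched in a few lines. This is only an illustrative sketch, not code from the actual system: it assumes non-drop-frame 30fps timecode, and the `LocalPlayer` class and its `chase` method are hypothetical names standing in for the virtual VCR.

```python
# Sketch of chasing incoming SMPTE timecode with a local playhead.
# Assumes non-drop-frame timecode at a fixed frame rate.

FPS = 30  # illustrative frame rate; real shows may use 24, 25, or 29.97DF

def tc_to_frames(tc: str, fps: int = FPS) -> int:
    """Convert an 'HH:MM:SS:FF' SMPTE timecode string to a frame count."""
    hh, mm, ss, ff = (int(p) for p in tc.split(":"))
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

def frames_to_tc(frames: int, fps: int = FPS) -> str:
    """Convert a frame count back to 'HH:MM:SS:FF'."""
    ff = frames % fps
    ss = frames // fps
    return f"{ss // 3600:02d}:{ss // 60 % 60:02d}:{ss % 60:02d}:{ff:02d}"

class LocalPlayer:
    """Hypothetical stand-in for the virtual VCR: tracks a playhead in frames."""
    def __init__(self):
        self.playhead = 0

    def chase(self, incoming_tc: str, tolerance: int = 2) -> bool:
        """Follow the master timecode; return True only when drift
        exceeds `tolerance` frames and a hard re-seek was needed."""
        target = tc_to_frames(incoming_tc)
        seeked = abs(self.playhead - target) > tolerance
        self.playhead = target
        return seeked
```

The point of the tolerance check is the same as in any chase-lock design: follow the incoming stream continuously, but only force a disruptive re-seek when the local picture has drifted audibly out of sync.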
For speakers, Tyng designed the system with a pair of small Genelec 1029 units, which travel separately in their own Stanley Tool case. The streamed sound is played through those speakers and a high-end Sony television at Tolan’s home on Cape Cod, giving her a good idea of how the mix will sound to home viewers.
Both Tolan and Tyng emphasize that the system allows senior-level industry professionals to, in Tyng’s words, “escape Hollywood” and still get their work done.
“In the past, one common thing in the industry has been this need for executive producers to be present on the stage when you do a final sound dub of a show,” Tyng says. “For men and women at the top tier, with lots of other things to do, parking themselves at the dub stage for a couple of days is not preferable. So this system gives us the ability to bring the playback and review system to the producer’s environment, with no loss of quality or efficiency.”
According to Arrested Development producer John Amodeo, the industry has come to accept the HD production model initially used for Titus in 2000.
HD Production: Then and Now
Back in 2001, John Amodeo told Millimeter, “[I’m] a bit surprised that more TV shows aren’t doing HDCAM production.” At the time, Amodeo was on the bleeding edge of the HDTV revolution — serving as producer for the now-canceled Fox sitcom Titus, the first major episodic television series to be acquired in HD, after switching from a film production model in 2000 (see Millimeter, February 2001). For that show, Amodeo was responsible for designing and implementing a new HD production pipeline and training an HD crew.
Now, four years later, Amodeo isn’t surprised anymore, except perhaps at how long it took to evolve traditional TV production models into the HD world. HD production is now nearly ubiquitous for sitcoms, and it is slowly but surely oozing its way into the hour-long dramatic genre (see the March 2005 issue of Millimeter).
And once again, Amodeo is standing on the front lines. He’s still working in the half-hour format, this time on the single-camera, Emmy-winning Fox comedy Arrested Development, for which he is currently serving in his second season as the producer in charge of all things technical after taking over for first-season supervising producer Victor Hsu. He sees HD continuing to mature into all TV genres, however, just as he expected four years ago.
“The technology has freed us, and the acceptance is now here,” he says. “The argument is no longer about limits of the technology. For television, HD workflows can be adapted for just about any kind of show. That wasn’t quite the situation, or at least it wasn’t perceived to be, back when we did Titus.”
Amodeo recently chatted with Millimeter about Arrested Development’s production approach, and how things have changed since his early experiences on Titus.
Millimeter: Besides obvious improvements in the technology, what’s the biggest change you see in high-def television production from your Titus days?
Amodeo: The funny thing is, when we set the system up on Titus, it worked perfectly right away. Everyone was skeptical at the time, and even when other producers and production executives visited our set, they said it looked good, but they would never want to use it themselves. But the truth is, the system worked right away.
In a sense, little has changed technologically — it just works even better today. But what has changed is the industry’s acceptance of this workflow. It’s extremely well accepted today. In the multi-camera world, everyone is shooting HD now, except in specialty situations or if a powerful star or director demands film.
Also, the crews are trained now. On Titus, we all had to learn how to best use the medium. Directors of photography, production designers, set decorators, and makeup and hair artists all had to experiment and learn what to adjust to get the best look from the digital cameras. Now, all the departments have been through their period of experimentation and have figured out what to do to make a great-looking show.
We used to have conversations about whether it was worth making the change to digital. What were the benefits? Cost savings? We no longer have those conversations. If the show is multi-camera, it will most likely be shot on HD. These days, what is unusual [on TV sitcoms] is if someone does not want to shoot HD.
Millimeter: But what about HD gear? Don’t you have a lot more options these days?
Amodeo: That’s certainly true, of course. When we pitched Fox on the idea of shooting Titus, a multi-camera show, in HD, the first question was: Are there enough monitors and cameras around to even do it? At that time, Panavision was about the only company set up to serve our needs. But now, there are so many vendors, and they all have this technology. It used to be that we had to collect ancillary pieces of our system from different places. Now, most vendors can rent you the entire package.
Millimeter: What is your basic workflow for Arrested Development, and what changes have you made on the show since you took over at the start of the second season?
Amodeo: The first year, before I came on the show, they were shooting using [Panasonic’s] Varicam. They were shooting picture-only on the cameras — no sound, no timecode. Sound was all recorded separately, and then they integrated sound and picture later in postproduction at Laser Pacific.
The Panasonic cameras are excellent, both mechanically and in terms of image quality, but that workflow seemed inefficient and costly, although Laser did the job brilliantly.
But Laser Pacific is primarily an HDCAM facility, and so, working with the Panasonic format involved bumping the 720p camera masters back up to 1080 — the equivalent of a digital telecine. Picture from 720p and sound from a digital source were then synched up and upconverted to 1080 for editing. It just seemed to me like a long trip to marry all those elements, and then reconvert them for editing. And because Fox broadcasts in 720p on its digital channel, we eventually had to downconvert back to 720p for delivery.
Since Laser Pacific is one of the top post houses in town, and because they worked so hard during the show’s first season, I didn’t want to change facilities. Given that I had experience with Sony’s HD system, I instead decided to switch to HDCAM, and make that our format.
Now we shoot on two older Sony camcorders — the HDW-F900 — and capture picture and sound and synch timecode in the same place. The change has saved us a lot of time by avoiding the delays of that whole synching and upconversion process before we could digitize.
For editing, we downconvert HDCAM to DVCAM, and we then bring the DVCAM to our offline facility on the Fox lot. An assistant editor then digitizes it all into the Avid overnight, so the editors can start working in the morning as soon as they come in. Once the cut is locked, the EDL goes to Laser Pacific and they perform a conventional assembly using their Supercomputer system.
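The hand-off Amodeo describes — a locked offline cut expressed as an edit decision list that the online facility conforms from the camera masters — can be illustrated with a short sketch. The CMX3600-style event-line layout is a long-standing convention, but the parser below is only a simplified illustration (it ignores dissolve durations, audio channels, and the comment lines a real list carries), and the `Event` and `parse_edl` names are hypothetical.

```python
# Minimal sketch of reading CMX3600-style EDL event lines, the kind of
# list a cutting room sends back for the online conform.
from dataclasses import dataclass

@dataclass
class Event:
    number: int
    reel: str
    track: str        # V = video, A/A2 = audio channels
    transition: str   # C = cut, D = dissolve, W = wipe
    src_in: str       # source in/out: where to pull from the camera master
    src_out: str
    rec_in: str       # record in/out: where it lands on the program timeline
    rec_out: str

def parse_edl(text: str) -> list[Event]:
    """Collect cut events, skipping titles, notes, and blank lines."""
    events = []
    for line in text.splitlines():
        parts = line.split()
        # A simple cut event line has exactly 8 whitespace-separated
        # fields and starts with a numeric event id.
        if len(parts) == 8 and parts[0].isdigit():
            events.append(Event(int(parts[0]), parts[1], parts[2],
                                parts[3], *parts[4:8]))
    return events
```

Each event pairs a source range on a tape reel with a record range on the program timeline, which is exactly the information the online system needs to pull the matching shots from the HDCAM masters and assemble the finished show.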
Millimeter: So now that this kind of HD workflow is spreading through the television world, what’s next?
Amodeo: There are two things. First, the next step is to get rid of tape — to go completely tapeless. Whatever that means — disc-based or some other kind of media. There would be a tremendous benefit for us to get data out of camera as fast as we can, and directly into the editing system. That is obviously the way we want to go eventually.
The second thing is, we’d like to have smaller, lighter, high-quality versions of the cameras. We would love to have true 24p, 1080 MiniDV cameras. We have experimented with small DVCAMs, but they don’t intercut well with our format right now. It’s something we are looking forward to, and I know all the manufacturers are working on this right now.