
Breaking the Comfort Zone

Marvin Rush points with pride to twin Apple Cinema HD Display monitors ablaze with colorful imagery from a scene he’s directing for an upcoming episode of Star Trek: Enterprise. Rush, a longtime Star Trek franchise veteran, has served as the show’s DP since its inception four years ago. Today, however, he’s directing an episode titled “In a Mirror, Darkly, Part 2” while his A camera operator, Doug Knapp, takes over his DP duties. Even so, Rush still monitors his precious HD imagery closely.

A visual effects shot from the first season of Battlestar Galactica.

Rush points to the image on the monitor, cabled from a handheld Sony HDW-F900/3 camera run by operator Joe Chess, who stands on a catwalk filming actress Jolene Blalock. Rush explains how the image runs through an eCinema Systems HD scan converter box to the monitors, giving him a pristine HD view to tweak color himself in realtime during production. Using a Sony RCP-7930N paint box, Rush brings the scene very close to final color long before it enters the postproduction world.

This, Rush explains, represents a crucial shift in the production of hour-long episodic television programming as such shows move into the world of high-definition acquisition: the need to color-time dailies has gone away. According to Rush, the final look is now largely designed on set, and usually requires only modest tweaking on a da Vinci 2k-Plus system during the online process that happens later at Level 3, Hollywood. Also, special protocols have been developed to make sure visual effects shots match that look seamlessly.

Discussing the show’s conversion this year to an HD production model — dictated by Paramount Studios as a budget-cutting measure — excites Rush. He has become a vocal HD convert for this type of high-end television production. Not even the news (delivered at the time of our interview) that UPN has officially cancelled Enterprise restrains his enthusiasm for, and pride in, the HD workflow model his team has developed. Rush expects the show’s HD transition to be influential in the industry even as production on the show itself permanently winds down.

While not surprised by the show’s cancellation because of low ratings, Rush is confident that the decision had nothing to do with the show’s look this season. Quite the opposite. “I think Enterprise looks as good as, or better than, it ever did when we were shooting film,” he insists. “We screened the season premiere this year [last October] at Paramount digitally on a big screen, and that screening proved it — it represented material shot under budget, on time, our first seven days shooting HD. I’ll put that footage up against anything we ever shot on film. I think we demonstrated an efficient way to do HD for television this year, and I’m proud of that.”

Meanwhile, elsewhere in Hollywood, director Michael Rymer pauses briefly from a round of meetings regarding development of an independent film he hopes to direct, possibly shot in HD, to discuss another high-profile, science-fiction franchise show now airing on the Sci Fi Channel: Battlestar Galactica. Rymer directed the two-part Battlestar miniseries in 2003, which was shot on 35mm film, then the series pilot in 2004 and this year’s season-ending two-part episode — all of which were shot using Panavised versions of Sony’s HDW-F900/3 system.

Rymer freely concedes that he initially opposed, with fervor, shooting the series in HD. He recalls being opposed to the concept “in an emotional, obstinate way.” Harvey Frand, the show’s producer, says the switch to HD from film was required in order to make the show affordable to produce. But despite that reality, he says that Rymer resisted and constantly fought to return to film.

Rymer, however, now raves about the show’s HD production model. He promises to direct more episodes for season two (the show was renewed shortly before press time), and he says his Battlestar experience has convinced him to seriously consider shooting his new movie in HD.

What changed his mind? How did both shows, which essentially represent the state-of-the-art for producing high-end science fiction for television, transition into HD production? What have been the advantages, disadvantages, lessons, and visual effects and post consequences of these two moves into the HD acquisition universe? To answer these questions, Millimeter recently chatted with key production and post players intimately involved in both shows.

Actor Callum Keith Rennie taped in front of a greenscreen. The show’s visual effects team initially had concerns about pulling clean mattes from greenscreen using HD, but extensive tests and innovative use of tracking markers put their fears to rest. (Battlestar photos: ©Carole Segal/SCI FI Channel)

Battlestar Goes HD

Rymer explains his initial HD reticence as a mixture of past experience, misconceptions, and miscommunication. “During the miniseries, we established the aesthetic for Battlestar Galactica, and that was done on film,” he says. “We wanted a documentary-style quality and a gritty look in keeping with that. So we were pushing grain and making depth of field as shallow as we could and doing lots of handheld, long-lens work. So, going into the series, from what I knew about HD, which was admittedly limited, I had a concern that it would not be conducive to applying this sort of documentary aesthetic.

“I had previously done some HD tests for a micro-budget feature in which exterior daylight work was less than satisfactory — probably that is the worst environment for HD. In applying that experience in my mind to Battlestar Galactica, I couldn’t see how we could make it work. That was my initial reaction. Then, we went through our first phase of testing with [DP Steve McNutt], who had experience with HD. Steve was primarily doing tests for himself, to test lighting and things. My first impression of those tests was not favorable either because it wasn’t grungy enough for the look of our show — it was mainly clean, bright lines. The video was picking up the fluorescent bars of various consoles on our [spaceship interior] set. I saw a slick, high-tech environment, which might be great for some sci-fi shows, but not for ours. That made me even grumpier. By then, I had a pretty bad attitude about HD.”

Grumpy or not, Rymer was informed that HD was the only option. For the next round of tests, McNutt was given more direction about Rymer’s concerns, and Rymer started to cheer up.

“Steve pushed gain, desaturated, and tweaked things using his state-of-the-art video village,” Rymer recalls. “What I realized is that the video village, which I called the crab shack, had technology in it that someone with experience could use to transform the imagery into the look we wanted. He really pushed the gain; he brought up digital noise, which approximated film grain nicely, and he showed me the control he could have. It wasn’t perfect — daylight exteriors are still a limitation — but it became clear we could maintain, and even improve, the gritty look we had previously established.”

McNutt’s “crab shack” is, in reality, a Sony MSU 750 (Master Setup Unit) system for realtime image painting, combined with a Sony 20in. BVM-D20 F1E HD monitor and a Sony 14in. BVM-14 H5E HD monitor, encased in a portable tent that McNutt designed himself. There McNutt toils, with assistance from video engineer Michael Sankey, to finalize the look of each episode, manipulating color as needed on set. This kind of setup, he explains, is the entire reason that shooting digitally now makes sense on shows like Battlestar Galactica.

“I wouldn’t consider shooting HD on this show without this kind of system,” says McNutt. “With this system, we don’t have to rely on controlling blacks and changing skin tones during the online process. Seeing dailies without those adjustments makes everyone nervous anyway. Since [executives] are aware of the potential of what the final product can look like, they expect to see it during dailies. With the MSU, I can make blacks as black as I want, but with more detail, on set, and I can do all sorts of effects. I can easily pull my stop on the go, as well as change the emulsions, contrast curve, and color sensitivity from shot to shot, if desired. In other words, I can do most of my own color timing on set, and that is what they see when they get dailies. It’s not a final color correction, but it’s close to final color adjustment. I’m not promoting massive manipulation on-set — most shows don’t require that approach. I’m just promoting good, strong images that will cut your online time way down. By the time the product gets to the online people, it can be about 80 percent done.”
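The on-set adjustments McNutt describes (crushing blacks, clipping highlights, changing the contrast curve) are all variations on a per-channel transfer function. The sketch below shows the standard lift/gamma/gain model as an illustration only; it is not the MSU’s actual internal processing, and the parameter values are made up:

```python
def lift_gamma_gain(value, lift=0.0, gamma=1.0, gain=1.0):
    """Apply a basic lift/gamma/gain transfer curve to a normalized
    pixel value in [0.0, 1.0]. Lift raises or crushes the blacks,
    gain scales toward white, and gamma bends the midtones."""
    v = value * (1.0 - lift) + lift   # lift: offset the black point
    v = v * gain                      # gain: scale the highlights
    v = max(0.0, min(1.0, v))         # clip, as a camera/paint box would
    return v ** (1.0 / gamma)         # gamma: midtone contrast

# "Crushing blacks" and "clipping highlights" with hypothetical settings:
crushed = [lift_gamma_gain(v, lift=-0.05, gamma=0.9, gain=1.1)
           for v in (0.0, 0.25, 0.5, 0.75, 1.0)]
```

Applied per color channel, the same three controls also shift skin tones and overall balance, which is why a setup like this can get dailies close to the final grade.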

Battlestar’s workflow involves using the HDW-F900/3 system with Panavision Primo lenses (supplied by Panavision Canada) to shoot live action on stage in Vancouver and in surrounding locations. Northwest Imaging, Vancouver, then dupes those tapes for dailies, and sends them to the show’s editorial office at Universal Studios, where down-rezzed SD versions are used to offline the episode on an Avid Media Composer Adrenaline system. After approvals are finalized, the episode is conformed via a final tape-to-tape color-correction pass at Modern VideoFilm, Burbank, and eventually mastered to D5 (a DigiBeta SD master is also created for domestic broadcast). As much as possible, though, the online does not take place until after visual effects are finished at Zoic Studios, Culver City, Calif., with a small percentage of shots normally completed at boutique shops Atmosphere and Enigma, both in Vancouver.

Enterprise DP Marvin Rush examines a shot on the twin Apple Cinema Display HD monitors he uses on set during production.

Impact on Effects

Battlestar visual effects supervisor Gary Hutzel (a Star Trek alumnus) jokes that this notion of waiting for final visual effects shots rather than onlining first and then cutting in final effects weeks later (as most shows do, including Enterprise) is, at face value, “a crazy scheme hatched by Paul Leonard,” the show’s associate producer in charge of post.

“That creates a certain amount of pressure, and it’s certainly not what I’m used to,” says Hutzel. “But I can see Paul’s point. This is a show where changes are commonplace right to the last minute — the executive producers are very active in cutting the show. Paul is reacting to that need. He needs to be able to react to creative tweaks at the last minute. Therefore, we completely previsualize the entire show in time for the director’s cut, while post waits to online until our shots are finalized, whenever possible. The creative process that begins with the freedom that HD acquisition provides, in the form of extra takes and coverage that would be cost-prohibitive with film acquisition, must continue in the post environment, and spontaneity in the editing process must be respected. A rigid approach to visual effects design and delivery would undermine that.”

Leonard adds that this scheme doesn’t always work, and the online process often commences before effects shots are final, but he pursues this goal whenever possible. “That’s a major hurdle,” he says. “We try not to online [before effects are done], but we often need to start on color and can’t always wait around. But the reason I try to do it is that I’m most afraid of visual effects coming in after you have onlined the picture and then realizing that the motion is different than in the temp version. You then realize you have to make a change, and you don’t have the frames to make up for the visual effect change, so you have to go back into the online bay anyway to change your picture. That becomes tiresome and expensive, plus you constantly have to give the sound crew a change list for synching the show. Most of the time, when things are going right, it’s actually more efficient to do it this way.”

In terms of building those visual effects (each episode averages 30 to 40 effects shots), the commitment to HD acquisition created concern early on about how best to capture clean greenscreen plates. It also eliminated bluescreen from the mix altogether.

Zoic’s Lee Stringer, the show’s departing CG supervisor, says, “We were worried going in about being able to pull greenscreen mattes with no issues. We originate from the HDCAM tape through the Sony 900 deck and bump up to D5 tape, so there is always some compression there from the original. Therefore, we were worried about compression artifacting. We were also worried about [greenscreen] tracking shots. Motion blur and its resulting artifacting problems would impact tracking 3D elements onto what was shot. But early on, they shot reference footage with the camera moving around with tracking markers on the greenscreen, and we pulled it all in. A big reason for this was the fact that Gary Hutzel came up with super-bright blue [button-cell battery operated] LED units that we could attach directly to the greenscreens as tracking markers.”

Hutzel adds that, because tracking across a greenscreen would often be required for the show, he had to “lay the groundwork for a full 3D tracking solution.”

Enterprise DP Marvin Rush lines up a shot with an HDW-F900/3 camera supplied by Plus 8 Digital, Burbank, on the show’s set at Paramount Studios.

“We weren’t going to use motion control or track on set — all 3D tracking had to be done in post,” he says. “The LEDs are [tiny] and stick easily to the greenscreen with Velcro and tape. We also used transparent Plexiglas poles, with the glowing markers attached, when tracking across the greenscreen for spaceship shots. We found that with the movement of the ships, just using tracking markers alone was not consistent enough. But the poles, transparent and placed at certain distances away from the ship, made it easy to track parallax in those shots.”

The production tests, combined with Hutzel’s tracking marker solution and Zoic’s expert use of Boujou (version 2.1) tracking software, have put all tracking concerns to rest. “We recently showed a full 3D crane move,” Hutzel says, “bringing Commander Adama [Edward James Olmos] off a platform onto the greenscreen, and we did it only having focal length and relative camera position information up front. It ended up being a perfect track. This is important, because it lets directors work as they normally would in film, to get the best out of the actors. They don’t have to avoid anything or compromise the scene in order to accommodate the effects work.”
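The reason markers at several distances matter is parallax: under a simple pinhole-camera model, a lateral camera move shifts the projected position of a near marker more than a far one, and that difference is precisely what a match-moving solver exploits to recover 3D camera motion from 2D tracks. A toy sketch (the focal length and distances here are invented numbers, not values from the show):

```python
def image_shift(focal_length, camera_translation, marker_depth):
    """Pinhole-camera approximation: a lateral camera move of
    `camera_translation` shifts a marker's projected position by
    f * t / z, so nearer markers shift more than distant ones."""
    return focal_length * camera_translation / marker_depth

# Hypothetical setup: marker on the greenscreen itself (far) vs.
# a marker on a transparent pole nearer the lens.
f, t = 35.0, 0.5                    # focal length (mm), camera move (m)
far_shift = image_shift(f, t, 6.0)  # marker on the screen
near_shift = image_shift(f, t, 2.0) # marker on a pole

# The differing shifts encode depth; a flat screen of markers alone
# gives every track the same motion, which is why depth-separated
# markers make the solve far more stable.
```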

Still, Hutzel adds, the current generation of HD technology used on the show “can’t handle bluescreen yet, so we are staying exclusively with greenscreen.”

“There are other issues that crop up even with greenscreen,” he says, “like the occasional blanking issue as you transition from person or object into the greenscreen. You can get a blanking effect unless the balance is perfect, which is hard to do shooting live-action on the set. But that is pretty rare, and we are aware of it, so we can identify the problem and rectify it when it crops up.”

Zoic uses an all-Lightwave (version 8) pipeline for modeling, texturing, animating, and rendering CG on the show. According to Hutzel, Boujou translates scene-tracking information directly into Lightwave, and 3D elements are rendered out that exactly match the motion from the original plates. These elements are then assembled in Combustion and are passed along to Zoic’s HD Flame system for final composite.

Battlestar DP Steve McNutt in his “crab shack” video village, featuring a Sony MSU 750 system for realtime image painting and two Sony HD monitors, encased in a cover McNutt designed himself.

Greenscreen Collaborators

Hutzel emphasizes that the nature of HD production and the approach to producing Battlestar Galactica have pushed his relationship with DP McNutt far beyond what he was used to in the film world. This is particularly true regarding the greenscreen issue, he says. When the first season began, he explains, he insisted that McNutt provide him with plates devoid of video noise, with a standard gamma setup to assure clean mattes. This was uncomfortable for McNutt in the sense that he was routinely pushing the look with the rest of the show. According to Hutzel, around the time episode four went into production, he and McNutt got together and began experimenting with ways to simultaneously satisfy both men’s requirements for the plates.

“Normally, with regular live-action, Steve might push for video noise to emulate grain — sometimes he might go to +18dB or more,” Hutzel explains. “He wanted to try pushing the greenscreen plates too, but I was hesitant. But as our relationship got comfortable, I told him to try it a little, and he went up to +6dB, and he also pushed gamma, crushing the blacks, clipping the highlights, and tweaking the color enough to let him try to match overall skin tones and lighting with the rest of the episode. The plates can sometimes be challenging, but what we found is that they sometimes work better for our comps. And when they don’t, I ask him to back off a bit.”

Producer Harvey Frand emphasizes that this expertise and collaboration is exactly what makes high-end HD television production work on shows like Battlestar Galactica. In other words, professionals like McNutt, Hutzel, and their colleagues are making the move to lower-cost production through HD acquisition reasonable, without sacrificing perceivable quality.

“Economics is obviously the big reason why we are shooting HD for this show,” says Frand. “It simply was not affordable for Sci-Fi Channel if we shot film. But that was not the only reason. We all were very familiar with Steve McNutt’s HD work, and we had tried to hire him in the past on other projects. The thinking was that we could make it work if we had someone with his level of expertise heading up the HD work. The first season, in my view, proved that it can work, if you have the right people. If the quality would have suffered, there would have been no point to doing the show.”

Typical Battlestar effects scenes, created mainly at Zoic Studios, Culver City, using a Lightwave pipeline. Visual effects supervisor Gary Hutzel says producers try to wait for final effects shots before onlining the show, whenever possible.

Enterprise Goes HD

Despite Rush’s endorsement of Enterprise’s transition to HD production for its final season, he and the show’s producers had previously rejected the format when the show was born four years ago. In fact, Rush began conducting HD tests on the set of the previous Trek show — Star Trek: Voyager — while Enterprise was still in the developmental stage. At the time, all that was available to Rush was the first version of the Panavised HDW-F900 system, and it wasn’t quite there in terms of meeting the show’s requirements, according to Rush.

“The problem was we do a fair amount of pyrotechnic work on this show — fire effects, propane mortars, high light energy, high bright elements generally, and tints,” he explains. “If you light a set and expose it for bright exposure, you normally have headroom for pyrotechnics, but the sets we shoot are normally fairly low lit, given the nature of our show. So we generally do pyro sequences on a low-lit set, and that can lead to overload. At the time, the early version of the Panavised camera could not handle the overloads without flipping. It didn’t hold onto detail in the highlights. It would overexpose and lose detail, and then not recover terribly quickly. So it didn’t render the pyro shots very well, and that just wasn’t acceptable, even though it did, as it does now, have tremendous latitude at the bottom of the spectrum — better than film. At the time, we therefore decided to stay with film.”

This year, however, Paramount insisted Enterprise costs be reduced. Therefore, more HD tests were conducted, this time with version three of the HDW-F900 camera, as well as with Sony’s HDW-F950, and Thomson’s Grass Valley Viper FilmStream technology. Rush rejected the Viper because he required a camera that could work untethered, given the show’s periodic handheld requirements. After testing both Sony cameras, he concluded that version three of the HDW-F900 could, in fact, hold up when it came to highlights, due to improvements in the camera chip.

“It passed with flying colors,” he says. “The screening at Paramount featured footage captured on film for the teaser to the episode — material shot last season. We hard cut that against this season’s HD footage, and on the big screen, there was no way to tell the difference. That proved to me this technology was ready for primetime.”

The HDW-F950 was also tempting to producers, according to supervising producer Peter Lauritson, because of its 4:4:4 recording capability — full-bandwidth color sampling that would be useful for greenscreen work. “But it wasn’t as mobile as the 900, and it is a more expensive system, and we had a mandate to cut costs,” Lauritson explains.

During the season, Rush settled on Zeiss DigiPrime lenses and two Fujinon 15×7.3 zooms for most Enterprise work, all supplied by Plus 8 Digital, Burbank. Unlike McNutt, Rush decided not to include a video engineer on his staff this season — mainly because he felt it wasn’t necessary, but also for philosophical reasons.

“I have some background in that stuff, so I just do the tech work myself,” Rush says. “The cameras are very stable. I’m very familiar with them, and we don’t need that much technical support. We have a tech guy come out [from Plus 8] every two to three weeks to check the cameras and make sure they have not drifted, and that has been more than sufficient. But the other issue I wanted to avoid was having a competing voice on set regarding the look and color of the show — another voice that would have the director’s ear. I believe strongly in maintaining the authority of the cinematographer, and having all that responsibility myself enhances that authority, which is how I like it.”

The Enterprise team also concluded that the HDCAM format was better suited to greenscreen than bluescreen work, mainly because the cast’s uniforms on the show are blue, but also because, according to Rush, the format compresses blue more than green.

“However,” he says, “in testing the [HDW-F950], a 4:4:4 system with a separate recording deck, we discovered we could get clean plates with bluescreen that way, since there was less compression. We had a plan to rent a 950 and a deck this season if a situation ever required bluescreen, but it never came up.”
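Rush’s point about compression comes down to chroma subsampling: tape formats record the color channels at a fraction of the brightness resolution, which can smear the hard color boundary a keyer depends on. A toy illustration of the effect, using simple block-averaging rather than any real codec’s math:

```python
def subsample_chroma(chroma_row, factor):
    """Average groups of `factor` chroma samples, then repeat each
    average: a crude stand-in for the horizontal chroma subsampling
    a tape format applies before recording."""
    out = []
    for i in range(0, len(chroma_row), factor):
        block = chroma_row[i:i + factor]
        avg = sum(block) / len(block)
        out.extend([avg] * len(block))
    return out

# A hard edge between screen color (1.0) and subject (0.0):
edge = [1.0, 1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0]
full_res = subsample_chroma(edge, 1)      # 4:4:4-style: edge intact
aligned = subsample_chroma(edge, 4)       # lucky alignment: still clean
smeared = subsample_chroma(edge[1:], 4)   # misaligned edge averages to 0.75
```

The smeared values at the boundary are exactly where a keyer produces ragged or chattering matte edges, which is why a full-bandwidth 4:4:4 recording path makes the cleaner plates the quote describes.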

The basic pipeline for the show’s production has been to transfer the original HDCAM tapes to D5 at Level 3, synch them with the separate DAT sound tracks, downconvert them to Beta SP, and offline each episode in standard-def on an Avid Media Composer (version 11.0.6) system (Meridian platform) at the show’s editorial offices at Paramount.

“Then, after the cut is approved, we online tape-to-tape — D5 to HDCAM — and do minor color correction at Level 3,” explains Lauritson. “We cut visual effects shots into the master there after Eden Effects and Technicolor are finished with them. Eventually, we deliver a D5 HD master and a D2 standard-def digital master to UPN.”

Top to bottom: A live-action, greenscreen plate, a CG plate created at Eden Effects, and the final comp from an Enterprise effects sequence. Effects producers on the show report that only minor modifications in how they shoot plates were required this year following the show’s switch to HD.

Trek Effects

A major, welcome change the HD switch brought to the show’s visual effects workflow was the elimination of the need to stabilize film effects plates via pin registration during the telecine process, according to Dan Curry, the show’s longtime visual effects producer.

“We don’t have to do pin-registered transfers any more; dirt cleanup has gone away — those are great changes,” says Curry. “That makes HD very useful to us because it eliminates steps. Our director of photography loves it, and generally, the changes in the technology since we first tested it a few years ago have been astounding. So we haven’t had any real difficulties in the visual effects process because of the change, mainly just some minor differences in how we approach the initial capture of greenscreen.”

Ron Moore, the show’s co-visual effects supervisor (along with Art Codron), points out that the show’s effects template has been gravitating toward an HD workflow since Enterprise debuted in 2001, long before this season’s switch to HD acquisition. That was, in fact, a big reason Curry, Moore, and Codron located the pipeline at major HD-capable facilities in close proximity to each other — Eden Effects, Hollywood, where all CG is created, and across the street at Technicolor Creative Services (formerly Complete Post), Hollywood, where all compositing is performed. The two facilities are linked by a fiber optic connection to a central server that allows them to transfer digital material back and forth seamlessly.

“Live action plates come over here as HDCAM dupes from the original camera tapes, and the CG comes to us from Eden over the fiber optic connection,” explains one of the show’s Inferno artists, Paul Hill. “When they finish rendering elements or shots at Eden, they just dump them off to disc, and we have instant access to them. We have eight terabytes of combined storage between our three Discreet-based compositing bays [two Inferno bays and a Flame station]. It’s a very efficient process.”

Moore adds that this HD effects pipeline still periodically renders out standard-def elements and then up-rezzes them if a render-intensive shot would threaten the schedule. But, he adds, the need to turn to SD elements became less frequent as the season wore on.

“We had to move to facilities that can deal with lengthy render times, and with some of our material, those times can be very long at HD resolution,” Moore explains. “We get matte shots coming over, and those things can take huge amounts of time to render. Our schedule doesn’t always allow for it, but on the other hand, if you don’t do it, you can end up with aliasing and various artifacts. Our tests showed us the difference between shots rendered out in HD and those rendered in SD and up-rezzed. So that is one reason we took the show to [Technicolor] — to be in a facility fully equipped to do HD 100 percent of the time. I always ask for HD stuff, and if we run into a render problem, we evaluate it at that time and do what we have to do to make it work in our timeframe.”

At both Eden and Technicolor, two effects teams — one headed by Moore and one by Codron — move in and out, with one usually supervising shooting of elements on stage and pre-visualizing and prepping shots for an upcoming episode (each averaging about 30 to 40 effects shots), while the other team builds effects for the previous episode. Eden does all CG work in Lightwave (versions 8.0 and 8.2), while Technicolor artists composite in two Inferno (version 5.5) bays. Eden uses Boujou (version 2.3.1) for tracking, while Inferno’s tracking module handles that chore during the compositing phase at Technicolor.

“We rely a lot more on tracking these days, which we would be doing whether we shot HD or film,” says Codron. “The advent of sophisticated HD image capture, however, has coincided with the arrival of extremely powerful tracking systems. We are freer now, as opposed to the old days of ‘here comes the optical shot and lock down the camera.’ We are building an entire CG creature right now, and we are building the whole sequence as if it were shot handheld. That would not have been easy to do in the old days.”

DP’s Authority Returns

Back on set, Rush emphasizes his satisfaction with this HD production model, particularly in terms of the heavy reliance on the DP to color correct each episode himself. Indeed, Rush considers this development “a great thing when it comes to re-establishing the cinematographer’s authority and control.”

He points out that television production schedules normally preclude the DP from routinely participating in telecine color timing sessions or offline color correction sessions for film-acquired shows. The arrival of HD, he suggests, has returned control over the final color template on television shows to the cinematographer, and that, Rush declares, is a good thing.

“Cinematographers lost that responsibility [for color correction] in television because of the schedules and the nature of the telecine,” he says. “When we shot film, the colorist in the [telecine] room got to see everything first — he was the one changing the look first. I might call the guy and talk to him, but the DP did not control that process. We lost that control for a long time. Now, with HD [acquisition], I have that control back.”

This, says Rush, is the legacy of Enterprise, Battlestar Galactica, and a handful of other shows now ramping up with similar production models.

“The move to HD acquisition represents the forerunner of an absolute, unstoppable trend in which digital origination will become the absolute standard [for television],” he says. “Our show, and a few others, are part of that trend, and I’m sure our approach will be looked at long after this show has finished.”