'New York Story:' Donna Lawrence Productions Develops a Panoramic Museum Presentation
New York City has hit the big screen... the really, really big screen. The Big Apple is the subject of some groundbreaking digital cinema treatment at the New-York Historical Society (NYHS), at 77th St. and Central Park West in Manhattan.
New York Story, a panoramic telling of the history of the town that never sleeps, is screening this summer and fall (and for the foreseeable future) at the only venue in the world capable of presenting this 18-minute digital video production. A trio of overlapping digital projectors handles the 4K immersive video using three separate panels and eventually a huge panoramic screen that expands from 25 feet to more than 70 feet wide during the show.
The project was produced by Donna Lawrence Productions of Louisville, Ky. “Images at the start of our presentation are displayed on a series of moving panels that drop into the visual space and go in and out,” says Donna Lawrence, director and executive producer. “Eventually the screens are removed and we reveal this massive 73-foot-wide panorama of New York City, and the remainder of the show remains in this uniquely designed large-format scenario. It’s quite breathtaking.”
Lawrence says that while the presentation, which is narrated by actor and native New Yorker Liev Schreiber, taps into archival footage and stock imagery of New York growing into a world-class city, a lot of original content was also captured for the NYHS feature. “We shot New York City in all four seasons, with lots of night and day shooting and some wonderful aerial shots,” Lawrence says.
While a RED digital cinema camera was the workhorse for nighttime acquisition, Buddy Squires, the film’s director of photography, surprised himself (and Lawrence) with the format he chose to shoot much of the daytime footage. “We knew we needed to have very, very high resolution because of the custom-made screen and theater,” Squires says. “The front row is probably 15 feet or less from the screen, and with such a huge screen in a small theater, we really needed the highest quality.”
Squires and the production team thought the only viable 4K solution at the time was a RED camera, “but we tested RED versus Super 35,” he says. “On a normal size screen in a screening room, the 35mm film came through the best for contrast and color rendition. Everything we know and love about film was shown to be true.”
Lawrence insisted on a larger-scale test.
“To my total surprise, the film footage felt a bit unstable,” Squires continues, “and the registration issues at that type of enlargement were great enough that the film didn’t feel as crisp and clean as the RED footage. The other limitation is that the aspect ratio is 4.7:1, while it’s 1.85:1 for a regular 35mm theater screen. So we’re really using less than half of a 35mm negative.”
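Squires’ “less than half” figure follows from simple geometry: widening a frame’s aspect ratio by cropping means keeping the full width but only a slice of the height. As a back-of-the-envelope check, the sketch below assumes the panorama is cropped from the full width of the 1.85:1 theatrical frame he cites (the exact extraction used on the production isn’t specified in the article):

```python
def frame_fraction_used(source_ar: float, target_ar: float) -> float:
    """Fraction of the source frame's area kept when cropping its height
    to reach a wider target aspect ratio (width is unchanged)."""
    if target_ar <= source_ar:
        return 1.0  # target is no wider than the source; no vertical crop needed
    # Keeping full width, the retained height shrinks by source_ar / target_ar.
    return source_ar / target_ar

# Cropping a 1.85:1 frame down to the show's 4.7:1 panorama:
fraction = frame_fraction_used(1.85, 4.7)
print(f"{fraction:.0%} of the frame is used")  # roughly 39% -- "less than half"
```

Starting instead from the taller Super 35 full aperture (closer to 1.33:1) would leave an even smaller fraction, so the conclusion holds either way.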
Squires still wasn’t crazy about the test results. He went back to AbelCine, where applications specialist and blogger Mitch Gross asked Squires why he hadn’t tried Vision Research’s Phantom 65 camera. Squires replied that he didn’t need a high-speed camera and Gross countered with resolution. Squires remembers Gross’ contention, that the Phantom 65 has “the world’s biggest sensor. It’s got the best imaging you can get for large-screen productions.”
“So we did tests with the Phantom 65 versus the RED and there was just something about the Phantom footage that was astonishing,” Squires says. “We were shooting from the roof of AbelCine in lower Manhattan, a couple blocks from the Hudson River, and with a wide lens we could make out individual sailboats out near the Statue of Liberty.” (Much of the original footage of Lady Liberty and the iconic New York skyline in the film was shot from Governors Island.)
Squires says that camera movements were tricky because of the large size of the screen, so the crew turned to Pictorvision Eclipse, a gyro-stabilized aerial camera platform. “No one had ever mounted a Phantom 65 into an Eclipse before,” Squires says. “For the 65 to work well, you have to black-balance it every few minutes, and to do that you have to physically cap the lens—which can be hard when, for example, it’s mounted on an Eclipse under the nose of a Twin Star helicopter a few thousand feet up. AbelCine then came up with a way to [black-balance] remotely.”
Jamie Pence, principal editor at Louisville-based Videobred, felt he’d get the best results if he could keep the postproduction process fluid. “I needed to find a way to do a rough cut in the actual resolution the presentation was eventually going to be seen in. We needed to cut it in its native resolution and we didn’t want to wait for renders. We often use [Apple] Final Cut Pro for projects, but we started looking at [Adobe] Premiere, which we hadn’t used before on something like this. We ran some tests and it looked like it could work,” Pence says. “Adobe Premiere was amazing.”
Pence used a 12-core Apple Mac Pro loaded with 64 GB of RAM and an NVIDIA Quadro 4000 graphics card for processing and connected it to a 16 TB SAN. “We rough cut everything in real time... the Phantom footage and the RED footage. I used an 11-foot projection screen in my edit suite and a Panasonic cinema projector and it was a great workflow,” Pence says. “On a project of this size and scale, you’ve got to be organized, you’ve got to understand technology and you’ve got to be able to troubleshoot.”