
OK, So… How Exactly Do You Make an “Instant Movie”?

This article originally appeared on TVTechnology.com

The Hollywood Professional Association (HPA) Tech Retreat got underway with a “first”—a complete motion picture “short” that was shot, assembled and “premiered” all in a single day.

Although some of the production’s scenes had been recorded earlier, a large portion of the action was captured during the conference’s February 18 opening-day program, which interspersed the filmmaking with presentations about the various technologies used to put the 11-minute comedy together.

Shooting took place at the Westin Mission Hills Golf Resort & Spa—where the four-day HPA Tech Retreat was taking place—as well as on the hotel grounds, with some additional action captured by drone in a nearby southern California desert. The interior scenes featured action both on stage and off, with conference attendees interacting with actors at one point in the script.

“The Lost Lederhosen” was the brainchild of Joachim “JZ” Zell, vice president of technology at EFILM, a Los Angeles-based company specializing in digital imaging. Zell developed the movie’s storyline and served as the project’s host and producer. He explained that this “instant movie” was all about demonstrating technologies now available to moviemakers—including the use of cloud storage and playout—that make it possible to greatly shorten production time for cinematic productions, allowing operations that formerly took days and weeks of post-production effort to be done in near-real time.

The HPA cinematic production featured a mix of professional and volunteer actors, including SMPTE’s executive director, Barbara Lange. It was directed by Hollywood veteran Steven Shaw, with another television and motion picture notable, Roy Wagner, serving as director of photography.

To achieve the extremely quick turnaround, the production used “virtual production stage” technology, in which prepared video backgrounds and scenery are inserted behind the live action in real time on large LED display screens, rather than composited in post-production via the more conventional, but time-consuming, greenscreen process.

As Sam Nicholson, chief executive officer of Stargate Studios and a conference presenter, explained, much of the realism in virtual production stage scenes comes from on-camera “trackers,” which let computers precisely follow camera movement and adjust the video being inserted into the scene accordingly, mimicking what would be seen in a real-world situation. Even the brightness and color temperature of the background display screens can automatically track changes to match a real-world environment.

“If you kill all of the lights on the set, everything will adjust day-to-night,” said Nicholson. “So you save a tremendous amount of time on set because all lights are slaved to a virtual environment. If you go into a tunnel the lights go out. As daylight changes, lights track these changes. Again, everything comes back to what we’ve learned in post-production. That expertise is now being applied to photography.”
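The lighting behavior Nicholson describes—physical set lights “slaved” to the virtual background—can be illustrated with a minimal sketch. This is not code from the production; it simply assumes a virtual background frame is available as RGB values and shows one plausible way to derive a set light’s output level from the scene’s average brightness, so a dark “tunnel” background drives the lights down and a bright “daylight” background drives them up. All function names and numbers here are illustrative assumptions.

```python
# Illustrative sketch only: slave a set light's intensity to the average
# brightness of the virtual background frame, as in Nicholson's description.

def average_luminance(frame):
    """Mean Rec. 709 luma of a frame given as rows of (r, g, b) values in 0-1."""
    total = 0.0
    count = 0
    for row in frame:
        for r, g, b in row:
            total += 0.2126 * r + 0.7152 * g + 0.0722 * b  # Rec. 709 weights
            count += 1
    return total / count

def slaved_light_level(frame, max_output=1000.0):
    """Scale a physical light's output (arbitrary units) to the virtual scene."""
    return average_luminance(frame) * max_output

# A bright "daylight" background versus a dark "tunnel" background.
daylight = [[(0.9, 0.9, 0.8)] * 4 for _ in range(4)]
tunnel = [[(0.02, 0.02, 0.03)] * 4 for _ in range(4)]

print(slaved_light_level(daylight))  # high output for the bright scene
print(slaved_light_level(tunnel))    # lights nearly out in the tunnel
```

A real system would also track color temperature and respond frame by frame, but the core idea is the same: the physical lighting becomes a function of the virtual environment rather than a separately operated rig.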

The short incorporated all the elements of a “real” Hollywood production, including an ACES workflow, color correction, and even delivery of “dailies” and an “outtake” reel. Editing was done on three Avid Media Composers located in a Hollywood facility, all cloud-connected to the previously ingested footage. Sound effects and audio “sweetening” were performed by Skywalker Sound at Skywalker Ranch in northern California.

The services of a number of other companies were employed, ranging from Amazon Web Services to Frame.io and Red Bee, to handle such production tasks as color grading and even minimizing the appearance of facial blemishes on some cast members.

Read more: Anatomy of a Cloud Workflow: How the HPA Made a Movie with Frame.io

In addition to demonstrating how cinematic production time can be drastically reduced by performing tasks on set as the shooting occurs—instead of in post-production—the HPA “Lederhosen” project illustrated how cloud technology can allow multiple entities operating from a number of widely spread geographic locations to collaborate on post-production.

Five loaned cameras were used in the shooting of “The Lost Lederhosen”: an ARRI Alexa, a Sony Venice, a RED Monstro, a Blackmagic Design URSA Mini, and a Panavision DXL2.

Read more: HPA Tech Retreat 2020’s All-New Supersession
