'Expedition Antarctica': Exploring the VR/360° Postproduction Process

"We chose Adobe Premiere Pro and Adobe After Effects to do the editing and effects work, combined with Mettle SkyBox VR plug-ins," says production company Neotopy.2/27/2017 2:00 PM Eastern

Parisian production company Neotopy includes storytellers, filmmakers and sound engineers who specialize in immersive VR/360° cinematic projects. Neotopy’s team recently managed the postproduction of Expedition Antarctica, an experience distributed through European broadcaster Arte’s 360°/VR platform Arte 360.

Expedition Antarctica marks a return to the frozen continent for French director Luc Jacquet, whose documentary La Marche de l’Empereur (March of the Penguins) received the Academy Award for Best Documentary Feature in 2006. Jacquet wanted to see what had changed in Antarctica over the years, so he and his filmmaking team, including co-director Jeanne Guillot, arranged an expedition to shoot a second documentary there. The film is a co-production among Arte France, Paprika Films, Wild Touch Productions, Andromede Oceanology, Kolor and Neotopy.

Neotopy VR supervisor and colorist Alexandre Regeffe explains, “We completed the entire postproduction process: conform, color grading, finishing, and creating and compositing the VFX. We also worked as advisors on the storytelling.

“The expedition team came back with a lot of shots,” Regeffe recalls. “We received folders with MP4 files from the individual cameras on the rigs. First, we decided with the directors which shots were usable. We chose Adobe Premiere Pro and Adobe After Effects to do the editing and effects work, combined with Mettle SkyBox VR plug-ins. With these tools, we were able to decide how to rotate the whole sphere in real time, previewing the result in a headset (Oculus Rift or HTC Vive).
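The "rotate the whole sphere" step is worth unpacking: re-orienting a 360° clip means resampling the equirectangular frame through a yaw/pitch/roll rotation on the unit sphere. The sketch below is not the Mettle plug-in's implementation, only a minimal Python/OpenCV illustration of that remapping; all function and parameter names are hypothetical.

```python
import numpy as np
import cv2


def rot_y(a):
    """Rotation about the vertical axis (yaw)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])


def rot_x(a):
    """Rotation about the horizontal axis (pitch)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])


def rot_z(a):
    """Rotation about the viewing axis (roll)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])


def rotate_equirect(frame, yaw_deg=0.0, pitch_deg=0.0, roll_deg=0.0):
    """Re-orient an equirectangular 360° frame by yaw/pitch/roll.

    Every output pixel is mapped to a direction on the unit sphere, the
    inverse rotation is applied, and the source pixel is sampled from the
    original frame. Illustrative only, not the plug-in's actual code.
    """
    h, w = frame.shape[:2]

    # Longitude/latitude grid for every output pixel.
    lon = (np.arange(w) + 0.5) / w * 2 * np.pi - np.pi      # [-pi, pi)
    lat = np.pi / 2 - (np.arange(h) + 0.5) / h * np.pi      # [pi/2, -pi/2)
    lon, lat = np.meshgrid(lon, lat)

    # Direction vectors on the unit sphere.
    dirs = np.stack([np.cos(lat) * np.sin(lon),
                     np.sin(lat),
                     np.cos(lat) * np.cos(lon)], axis=-1)

    # Inverse of the combined yaw -> pitch -> roll rotation.
    yaw, pitch, roll = np.radians([yaw_deg, pitch_deg, roll_deg])
    r_inv = (rot_y(yaw) @ rot_x(pitch) @ rot_z(roll)).T

    rotated = dirs @ r_inv.T

    # Back to longitude/latitude, then to source pixel coordinates.
    src_lon = np.arctan2(rotated[..., 0], rotated[..., 2])
    src_lat = np.arcsin(np.clip(rotated[..., 1], -1.0, 1.0))
    map_x = ((src_lon + np.pi) / (2 * np.pi) * w).astype(np.float32)
    map_y = ((np.pi / 2 - src_lat) / np.pi * h).astype(np.float32)

    return cv2.remap(frame, map_x, map_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_WRAP)
```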

“We did a pre-stitch in low res so we could do the editing process in Adobe Premiere with Mettle plug-ins,” he continues. “Once the video was edited, we did the final stitch and processed the stitched shots in 4K ProRes (4096 x 2048) at 60 fps, which is the shooting frame rate we wanted to preserve. We used Kolor Autopano Video to do the stitching.
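The article doesn't name the tools used for proxy generation or the ProRes conform, but the two steps Regeffe describes, a low-res pre-stitch for offline editing and a 4096 x 2048 ProRes master at 60 fps, could look roughly like the following ffmpeg-based sketch. The commands and settings are assumptions for illustration, not Neotopy's actual pipeline.

```python
import subprocess


def make_proxy(src, dst):
    """Low-resolution 2:1 H.264 proxy of the pre-stitched footage,
    small enough to edit in real time (hypothetical settings)."""
    subprocess.run([
        "ffmpeg", "-i", src,
        "-vf", "scale=1920:960",          # keep the 2:1 equirectangular aspect
        "-c:v", "libx264", "-crf", "23",
        "-c:a", "copy",
        dst,
    ], check=True)


def conform_prores(src, dst):
    """4K (4096 x 2048) ProRes 422 HQ master at 60 fps, matching the
    delivery spec described in the article (hypothetical settings)."""
    subprocess.run([
        "ffmpeg", "-i", src,
        "-vf", "scale=4096:2048",
        "-r", "60",
        "-c:v", "prores_ks", "-profile:v", "3",   # profile 3 = ProRes 422 HQ
        "-c:a", "pcm_s16le",
        dst,
    ], check=True)
```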

“Some shots were difficult to stitch. Seam lines were very visible, so we had to handle this with rotoscoping and tracking in Assimilate Scratch VR. Then we went into Scratch VR to do the color grading, shot by shot. The final step was to de-noise, add some sharpness, and create some visual effects to hide the rigs.
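As a rough illustration of that finishing pass, the sketch below pairs a non-local-means denoise with an unsharp mask in Python/OpenCV. The filters and settings are assumptions for illustration, not the grade Neotopy actually applied in Scratch VR.

```python
import cv2


def denoise_and_sharpen(frame, nl_strength=5, sharpen_amount=0.5):
    """Illustrative finishing pass (hypothetical settings):
    non-local-means denoise followed by an unsharp mask.
    Expects an 8-bit BGR frame."""
    # Remove noise while preserving edges: arguments are
    # (src, dst, h, hColor, templateWindowSize, searchWindowSize).
    denoised = cv2.fastNlMeansDenoisingColored(frame, None,
                                               nl_strength, nl_strength,
                                               7, 21)
    # Unsharp mask: add back a fraction of the high-frequency detail.
    blurred = cv2.GaussianBlur(denoised, (0, 0), 2.0)
    return cv2.addWeighted(denoised, 1.0 + sharpen_amount,
                           blurred, -sharpen_amount, 0)
```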

VR footage in Assimilate SCRATCH

“For the color grading and finishing, we relied on Scratch VR because it offers advanced VR tools that are streamlined into a single workflow. For example, the masks automatically repeat on the right or left side of the equirectangular image. Scratch VR also makes it easy to manage the highlights and lowlights within the material, drawing out the depth of shadows and softening the glares.”
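That left/right repetition exists because the two vertical edges of an equirectangular frame meet on the sphere, so any mask crossing the ±180° seam has to continue on the opposite side. Scratch VR handles this automatically; the short Python sketch below only illustrates the idea with a hypothetical wrap-aware soft mask, using names and parameters invented for the example.

```python
import numpy as np


def wrapped_soft_mask(width, height, cx, cy, rx, ry):
    """Soft elliptical mask whose horizontal distance wraps around the
    equirectangular seam, so a shape centered near the left edge also
    continues on the right edge (and vice versa)."""
    xs = np.arange(width, dtype=np.float64)
    ys = np.arange(height, dtype=np.float64)
    # Shortest horizontal distance on a cyclic axis of length `width`.
    dx = np.minimum(np.abs(xs - cx), width - np.abs(xs - cx))
    dy = np.abs(ys - cy)
    d = np.sqrt((dx[None, :] / rx) ** 2 + (dy[:, None] / ry) ** 2)
    return np.clip(1.0 - d, 0.0, 1.0)   # 1.0 inside, feathering out to 0.0


def blend_grade(frame, graded, mask):
    """Blend a graded version of the frame back through the mask."""
    m = mask[..., None]
    return (frame * (1.0 - m) + graded * m).astype(frame.dtype)


# Example: a mask centered 50 px from the left edge of a 4096 x 2048 frame
# also appears at the far right, because longitude -180° and +180° coincide.
mask = wrapped_soft_mask(4096, 2048, cx=50, cy=1024, rx=400, ry=300)
```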

Regeffe adds, “To manage VR material, you need huge amounts of GPU and CPU power. You’re working with high-resolution frames and high frame rates, and you want to work in real time. We built several workstations with multiple NVIDIA TITAN X Pascal GPUs to render the final stitching as fast as possible; some optical flow stitches from the Nokia OZO camera, for example, are very time-consuming to render.”