Crime, Cops & Codec Testing: Netflix Short Film “Meridian” Integrates HDR, HFR & 4K

The primary purpose of this short film is as test footage to evaluate streaming codecs and image acquisition technologies.

Meridian: a circle of constant longitude passing through a given place on the earth’s surface and the terrestrial poles.

Meridian: a set of pathways in the body along which vital energy is said to flow.

It’s difficult to know whether either of these definitions applies to what is going on in the Netflix short film “Meridian,” which offers up a quintessential “riddle wrapped in a mystery inside an enigma” scenario. Persons gone missing on the California coast in 1947 draw the attention of L.A. cops, but here, unlike in most short films, the crime—if there is one—goes unsolved. “Meridian” concludes with both investigating officers trapped with the others in a zone of darkness that is more “twilight” than conventional noir would allow.

However, the primary purpose of this short film is as test footage to evaluate streaming codecs and image acquisition technologies, so perhaps it’s not necessary to focus on the plot so much as the way “Meridian” looks.


“Meridian” opens with archival Kodachrome footage of Los Angeles shot in 1947.

The film’s director and co-writer, Curtis Clark, ASC, has been involved with new technologies extensively, both as a cinematographer and as chair of the American Society of Cinematographers’ Technology Committee. Before “Meridian,” Clark and his group worked with Digital Cinema Initiatives to produce standardized test material to study the effectiveness of digital cinema projection.

Intended by streaming giant Netflix to serve as a test bed for any and all looking to learn from the challenges that went with melding multiple high-end technologies, the 4K 60 fps HDR short film “Meridian” was deliberately designed to pose significant engineering and reverse-engineering challenges. (Under a Creative Commons license, the film’s source .mxf file is available for download and evaluation.)

“Vendors would be welcome to deconstruct compression codecs to see what impact they have on the film’s images,” explains Clark, “while also getting a look at how SMPTE’s new Interoperable Master Format [IMF] works to resolve issues that arise when files pass from studios to distribution venues.”
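
Quantifying that codec impact typically starts with per-frame objective metrics. The sketch below is a minimal pure-Python PSNR check of the kind a vendor might run between mezzanine and compressed frames; the function and the five-sample “frame” are illustrative assumptions, not part of any Netflix or SMPTE toolchain.

```python
import math

def psnr(reference, degraded, max_value=255):
    """Peak signal-to-noise ratio between two equal-length pixel sequences.

    Higher is better; identical frames give infinity. max_value is the
    peak code value (255 for 8-bit, 1023 for 10-bit HDR deliverables).
    """
    if len(reference) != len(degraded):
        raise ValueError("frames must have the same number of samples")
    mse = sum((r - d) ** 2 for r, d in zip(reference, degraded)) / len(reference)
    if mse == 0:
        return math.inf
    return 10 * math.log10(max_value ** 2 / mse)

# A toy 8-bit "frame" before and after a lossy round trip:
original   = [16, 64, 128, 180, 235]
compressed = [17, 63, 129, 178, 236]
print(round(psnr(original, compressed), 1))  # ~46.1 dB
```

In practice a vendor would run this (or SSIM/VMAF-style perceptual metrics) over full decoded frames rather than a handful of samples, but the comparison logic is the same.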


The driving scenes—during which a sudden change in weather darkens the skies and makes the clouds erupt with lightning—combine aerial cinematography with a studio greenscreen shoot.

Clark continues, “I’m not sure that all these technologies and methodologies have ever been combined in one project. I’m very familiar with each of these high-end aspects individually, and except for high frame rate [HFR], have used them all previously, but dealing with them as a converged entity was something new.”

Clark is excited by the way this combination of technologies is showcased in the film. “People get to see how these elements can be combined in a way that expands the creative canvas,” he elaborates. “And it doesn’t force one into surrendering to a video look and feel, so there’s a potential aesthetic gain arising out of all this. Initially I was looking at the HFR part of things with a raised eyebrow, but when combined with HDR, it does create an incredibly immersive effect. Getting all this to work together in a harmonious way, in an additive fashion, makes this more than the sum of its parts. For example, the sense of presence in the highlights takes on a dimensionality that is almost magical.”

High among Clark’s creative objectives was maintaining a filmic look and feel. “Part of that was that we didn’t compromise camera composition and lighting by shooting with multiple cameras,” he acknowledges, “but just as important was a Kodachrome emulation process. Kodachrome was the stock du jour for a long time with amateurs, and since we open with archival 1947 16mm reversal footage shot on that stock in L.A., that gave us a starting point. Color scientist Joseph Slomka and final grading colorist John Daro at FotoKem created a lookup table within [open source] ACES called a Look Modification Transform [LMT]. This LMT incorporated parameters that gave a Kodachrome feel but let it effectively live in this very different context of digital acquisition.”
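
The mechanics of a look transform can be illustrated with a toy example. The LMT that Slomka and Daro built is a far more sophisticated transform inside the ACES pipeline; the sketch below only shows the basic idea of mapping normalized channel values through an interpolated 1D curve, with a made-up “look” curve standing in for any real Kodachrome emulation.

```python
def apply_1d_lut(value, lut):
    """Map a normalized [0, 1] channel value through a 1D LUT,
    linearly interpolating between the sampled entries."""
    if not 0.0 <= value <= 1.0:
        raise ValueError("expected a normalized channel value")
    position = value * (len(lut) - 1)
    lower = int(position)
    upper = min(lower + 1, len(lut) - 1)
    frac = position - lower
    return lut[lower] * (1 - frac) + lut[upper] * frac

# A hypothetical five-entry curve that lifts midtones slightly --
# purely illustrative, NOT the actual FotoKem/ACES transform:
look_curve = [0.0, 0.28, 0.55, 0.78, 1.0]
print(apply_1d_lut(0.5, look_curve))  # midtone pushed from 0.5 to 0.55
```

Production LUTs are usually 3D (mapping R, G, B jointly, which is what lets a look shift hue and saturation, not just tone), but the interpolation principle is the same.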


“VFX had to match their sky plates to our on-set lighting cues,” notes Clark, “including the storm cloud transitions. I find VFX can achieve goals more effectively when that part of the operation is integrated and not an after-the-fact consideration.”

The team also incorporated a subtle, fine film grain, helping offset any “video” feel from digital motion picture camera HDR acquisition. “There’s always a question of just how much motion blur is desirable for the cinematic look. If you’re at 24 fps, it is an intrinsic component.” Clark took inspiration from Doug Trumbull’s 60 fps Showscan film system, recording at 59.94 fps.
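
The arithmetic behind that trade-off is simple: broadcast-family rates are 1000/1001 fractions of their nominal values (60000/1001 ≈ 59.94 fps), and per-frame exposure, which governs motion blur, scales with shutter angle over frame rate. The 180-degree shutter below is a conventional default assumed for illustration, not a value quoted by Clark.

```python
from fractions import Fraction

def exposure_time_ms(shutter_angle_deg, fps):
    """Exposure per frame: the fraction of the frame interval the
    shutter is open, expressed in milliseconds."""
    return (shutter_angle_deg / 360.0) * 1000.0 / fps

hfr = Fraction(60000, 1001)   # ~59.94 fps, the fractional 60 fps rate
cinema = Fraction(24, 1)      # traditional 24 fps

# With an assumed 180-degree shutter, moving from 24 fps to ~59.94 fps
# cuts per-frame exposure (and hence motion blur) by about 2.5x:
print(round(exposure_time_ms(180, float(cinema)), 2))  # 20.83 ms
print(round(exposure_time_ms(180, float(hfr)), 2))     # 8.34 ms
```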

Although the shoot lacked time and resources for a full previsualization, Clark availed himself of storyboards. “I work with my editor [David Sconyers] during preproduction so he understands my ideas about how shots will cut together,” he reveals. “Having that kind of clarity in communication is just a no-brainer to me, no different than the kind of close collaboration I need with my cinematographer [Markus Förderer, BVK] and production designer [Mari Lappalainen]. This was a bit different from a conventional TV production in that I had creative control, so no one was second-guessing what I shot. I honestly believe most single-camera cinematic styles can be adapted to work for TV. You shouldn’t have to routinely rely on multicamera setups.”

Except for scenes showing a drive to the coast, which were lensed on the 6K RED Weapon, “Meridian” relied on the 4K Sony F65 CineAlta camera. The first dialogue sequence, set in a squad room office, serves as a visual bridge from the archival 1947 Kodachrome; to facilitate the blend, older Cooke Speed Panchro lenses were used, while later scenes employed Leica Summilux-C glass. The wide range of shadows and highlights, along with the drifting cigar smoke, makes the squad room scene particularly challenging to encode.


Investigating a cave on the California coast.

“I prefer wider lenses for their greater depth of field,” Clark declares, “and believe the graphic qualities present in carefully composed imagery can better facilitate a compelling visual experience. I shot close-ups on a 35mm lens instead of the usual 50mm or 75mm, which I think has a more immersive impact on a viewer’s emotional experience with the scenes.”
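
The difference Clark describes is easy to put numbers on: horizontal angle of view follows from focal length and sensor width via FOV = 2·atan(w/2f). The Super 35 width used below is a nominal assumed figure, not a published specification for the production’s cameras.

```python
import math

def horizontal_fov_deg(focal_length_mm, sensor_width_mm=24.9):
    """Horizontal angle of view in degrees for a given focal length.
    The default width is a nominal Super 35 figure assumed here."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A 35mm lens takes in a much wider slice of the scene than the
# 50mm or 75mm close-up lenses Clark mentions:
for f in (35, 50, 75):
    print(f"{f} mm lens: {horizontal_fov_deg(f):.1f} degrees")
```

On this assumed sensor width, the 35mm lens covers roughly a 39-degree horizontal field against about 28 degrees for the 50mm, which is the compositional headroom (and greater depth of field at a given aperture) Clark is after.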

The driving scenes—during which a sudden change in weather darkens the skies and makes the clouds erupt with lightning—combine aerial cinematography with a studio greenscreen shoot. “On stage, we had a tracking shot taking us across the hood of a 1947 Ford and into the car as the sky dims outside the vehicle,” Clark remarks. “That was a single-axis movement, so while there were slight camera adjustments, nothing happens that would produce motion blur. I’d have shot this in the same way even if we were at 24 fps.” He used ARRI SkyPanels to effect the transition from sunlight to cloudy conditions. Digital Sputniks provided lightning-style flashes.

“VFX had to match their sky plates to our on-set lighting cues,” notes Clark, “including the storm cloud transitions. I find VFX can achieve goals more effectively when that part of the operation is integrated and not an after-the-fact consideration. I had my VFX supervisor [Bill Taylor] on set with me as we shot so we could talk things out and not have surprises that hurt us later. Initially we thought the storm clouds would be a matte painting solution, but then we found real storm cloud imagery that could be modified to work for us instead. Ultimately it was quicker and less expensive than the original solution, and it looked right, too.”


One of the police officers enters a black limbo environment within an otherwise natural-looking rock formation.

The film climaxes with the second officer entering a black limbo environment within an otherwise natural-looking rock formation at the shoreline. Within, he finds a strangely animated clock and a ventriloquist dummy whose eyes move of their own accord. “The clock was done in-camera with a zoom lens to make it change in size, and the rotation aspect was handled in post,” says Clark. “The dummy gave the scene a feel of The Twilight Zone, which is something also echoed when he sees the missing men, who are like mannequins, another image I associate with that series.”

As a mystery woman—whose earlier manifestation seemed to trigger his colleague’s disappearance—appears behind him, he sees a stained glass window and a projected image of himself from just before he descended to the beach. “That was nearly all in-camera,” reveals Clark. “When we did the initial shot with the window, we had a subtle, carefully designed tracking movement—which didn’t read like a track because we shot against black so everything remained in place as if it were a static shot, which caused the natural reflections on the floor to appear to liquefy a bit. VFX added the projection element, along with a matching liquid reflection beneath it.”

Postproduction on the short required the creation of multiple versions. “Netflix only streams HDR images to those with HDR-compliant monitors, but it was something they wanted addressed,” says Clark. “We started with the HDR grade, then downconverted to standard dynamic range in Dolby Vision, using the Content Mapping Unit to map HDR into an SDR context. You still see as much of the detail and highlights as you should, but without the dynamic brightness range of HDR.”
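
Dolby’s Content Mapping Unit is proprietary, but the shape of the problem can be sketched. HDR masters are typically PQ-encoded per SMPTE ST 2084 (a published formula, implemented below), and the downconversion must roll highlights off smoothly into the smaller SDR range. The Reinhard-style roll-off here is a deliberately naive stand-in for the real Dolby Vision mapping, included only to show the idea.

```python
# SMPTE ST 2084 (PQ) constants
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits):
    """Inverse EOTF of SMPTE ST 2084: absolute luminance in cd/m^2
    (0..10000) to a normalized PQ signal value in [0, 1]."""
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

def toy_tone_map(nits, sdr_peak=100.0, hdr_peak=1000.0):
    """A deliberately simple Reinhard-style roll-off, standing in for
    the far more sophisticated Dolby Vision mapping: HDR luminance
    compresses smoothly, reaching sdr_peak at hdr_peak."""
    x = nits / hdr_peak
    return sdr_peak * 2 * x / (1 + x)

print(round(pq_encode(100), 3))    # PQ code value of 100-nit white, ~0.508
print(toy_tone_map(1000.0))        # 1000-nit highlight lands at SDR peak
```

Real content mapping also handles color volume, not just luminance, and is steered per-shot by metadata from the HDR grade, which is why the SDR version can still “see as much of the detail and highlights as you should.”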

The ventriloquist dummy evokes The Twilight Zone.


Clark acknowledges that while “Meridian” seems close to a perfect storm of cutting-edge cinematic technologies, there’s no ceiling on technological advances. “We’re on the cusp of a higher level of spatial resolution now, with the 8K RED Weapon and the Panavision 8K DXL cameras.”

Such tools might figure into future installments of “Meridian.” “Beyond this short film, there was a larger story in the back of my mind,” Clark admits. “The very nature of this genre means you can leave some mysteries to percolate and suspensefully stir the imagination, leaving people wanting more. I’m hoping that combining new motion imaging technologies with this kind of immersive approach to storytelling could positively impact visually driven narrative television production. Being able to take advantage of larger display screens with higher resolutions, higher dynamic range and wider color gamut while maintaining a strong film/cinematic feel can change the game in terms of how a filmmaker will choose to creatively approach content.”


