The Mill, Epic Games Collaborate on Real-Time Augmented Reality Project

Short film “The Human Race” fuses real-time VFX and live-action imagery.

At one time it was the games industry that wanted to emulate film, but now the film industry is adopting the technology of video games.

In the words of Tim Sweeney, Epic Games founder and CEO, “We’re really trying to bridge the gap between the tools that game developers have used in the past and the full suite of features and capabilities of Hollywood today.”

In that spirit, content creation studio The Mill collaborated with Epic Games on “The Human Race,” a short film and augmented reality presentation that debuted on March 1 during the 2017 Game Developers Conference in San Francisco.

Combining an advanced implementation of Epic’s Unreal Engine with Mill Cyclops, The Mill’s proprietary virtual production toolkit, “The Human Race” merges real-time visual effects and live-action storytelling. The technologies were pushed beyond the scope of existing real-time rendering capabilities to produce a futuristic film that features the 2017 Chevrolet Camaro ZL1 in a heated race with the Chevrolet FNR autonomous concept car.

https://vimeo.com/206280957

Chevrolet ‘The Human Race’ Short Film from The Mill on Vimeo.

The only physical vehicle filmed for “The Human Race” was The Mill Blackbird, a fully adjustable rig that enables filmmakers to insert almost any car model into any filmed environment. Until now, CG cars were added by visual effects artists in postproduction, requiring days of rendering to produce high-quality imagery. During this shoot, however, live video feeds as well as positional data from the Arraiy tracking system were fed directly into Unreal Engine. The Camaro was then rendered and composited seamlessly into the footage in real-time augmented reality, allowing the directors to instantly see the final look and composition of each shot during filming.
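
Stripped to its essentials, the loop described above repeats a few steps every frame: grab the live plate from the film camera, read the tracked camera pose, render the CG car from that pose, and composite the render over the plate for the director’s monitor. Below is a minimal C++ sketch of that loop; every type and function name is a hypothetical stand-in for the camera, tracking and engine interfaces, which were not published.

```cpp
// Minimal sketch of a per-frame AR compositing loop. All names are
// hypothetical stand-ins for the camera feed, the tracking stream and
// the engine; none of this is The Mill's or Epic's actual code.
#include <cstdint>
#include <cstdio>
#include <vector>

struct CameraPose { float pos[3]; float quat[4]; float focalMm; };

struct Frame {
    int width = 0, height = 0;
    std::vector<uint8_t> rgba;   // width * height * 4 bytes
};

// Stub subsystems: in reality these would be a video capture card,
// the optical tracking system, and the real-time renderer.
Frame grabLivePlate()                  { return {1920, 1080, std::vector<uint8_t>(1920 * 1080 * 4, 0)}; }
CameraPose receiveTrackedPose()        { return {{0, 0, 0}, {0, 0, 0, 1}, 35.0f}; }
Frame renderCgCar(const CameraPose&)   { return {1920, 1080, std::vector<uint8_t>(1920 * 1080 * 4, 0)}; }
void presentToDirector(const Frame& f) { std::printf("showing %dx%d frame\n", f.width, f.height); }

// Straight "A over B": where the CG render has alpha, it covers the plate.
Frame compositeOver(const Frame& cg, const Frame& plate)
{
    Frame out = plate;
    for (size_t i = 0; i + 3 < cg.rgba.size(); i += 4) {
        float a = cg.rgba[i + 3] / 255.0f;
        for (int c = 0; c < 3; ++c)
            out.rgba[i + c] = static_cast<uint8_t>(cg.rgba[i + c] * a + plate.rgba[i + c] * (1.0f - a));
    }
    return out;
}

int main()
{
    // One iteration of the on-set loop, which must finish inside the
    // frame's time budget for the director to see the result live.
    Frame plate     = grabLivePlate();       // live video from the film camera
    CameraPose pose = receiveTrackedPose();  // where that camera is and how it's aimed
    Frame cg        = renderCgCar(pose);     // CG car rendered from the same viewpoint
    presentToDirector(compositeOver(cg, plate));
}
```

The hard part in practice is doing all of this, at matching latency, inside a single frame’s time budget.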

The same real-time technology was used to create, alter and produce the short film, blurring the lines between production and post. The ability to create “final pixels” in real time will ultimately change the way filmmakers create content and make critical decisions.

A live re-creation of the AR visualization was also shown during Epic’s GDC presentation, offering an up-close view of the real-time tech used to produce the project.

“‘The Human Race’ blends cinematic storytelling and real-time visual effects to define a new era of narrative possibilities,” said Angus Kneale, chief creative officer at The Mill in New York. “This is a pivotal moment for film VFX and the coming era of augmented reality production. Using Unreal’s cutting-edge game engine technology, filmmakers are able to see their photoreal digital assets on location in real time. It also means the audience can effect change in films in ways previously unimagined, giving interactive control over vehicles, characters and environments within a live-action cinema experience. With Mill Cyclops, The Mill’s proprietary virtual production toolkit, we are able to render and integrate digital assets into the real world to a level never seen before.”

“We’re always thrilled to showcase the levels of innovation our customers can achieve in Unreal Engine,” said Kim Libreri, chief technology officer at Epic Games. “Today’s presentation [at the GDC] demonstrates what’s possible when combining live action with the power of Unreal Engine’s real-time photorealistic rendering capabilities—essentially taking the ‘post’ out of postproduction.”

The film was conceived by The Mill and Chevrolet to kick off a multi-platform campaign marking the 50th anniversary of the Camaro.

In a series of press conferences last week, representatives from Epic Games, The Mill and Chevrolet detailed the film’s creation, as well as the development of technology necessary to achieve it.

The Mill Blackbird – Why would a visual effects company build a virtual car?

Visual effects content creation studio The Mill specializes in “photoreal CGI, visual effects design and interactive experiences,” noted Angus Kneale. “We’re constantly utilizing emerging technologies in order to tell stories that move people through moving image.”

Based on the company’s extensive automobile work, engineers at The Mill built a technology called Blackbird a few years ago to solve a problem: the cars they need to film for commercial projects are often not available on their production timetable.

Alistair Thompson, executive vice president, international, at The Mill, explained, “Being a company that does a lot of advertising and filmmaking, we do an awful lot of content that involves cars. That could be car advertising, but in fact there’s a lot of stuff that isn’t direct car advertising but has cars in it. And it might sound strange but it’s actually very difficult to get cars that people want for shoots due to the manufacturing process and design process. Quite often you can’t get the cars you need on set, or it’s just cost-prohibitive, or you’re looking at the designs of the future, concept vehicles. This just kept coming up on a daily basis with all the auto-oriented work we did.”

Sam Russell, general director of global marketing at Chevrolet, noted a second complication in automotive advertising: secrecy. “We work in a very competitive market, and letting people see what we’re going to launch ahead of time can really undermine an effective launch strategy.”

The Mill’s Thompson continued, “We started to push a methodology a few years ago of shooting dummy vehicles. We realized that if we did CG cars, they only really looked good if we referenced them on the movement of a real vehicle. That was great, but it meant that we then had to go and do lots of stuff manually—we had to capture data references for reflections, etc. So we were slightly maddened by that whole process, even though it was working. And we set ourselves the task of building—it sounds a bit crazy, now, thinking about it—but building an adjustable car that could become the shape of any car, and could also capture the environment around it so we can make the perfect reflections for a CG vehicle. We went off and we did all of that and we created this Batmobile-type device, the Blackbird, which can adjust by about 4 feet in length and 10 inches in width. The suspension can adjust to match any kind of car. It’s programmable to match acceleration, torque, all those sorts of things.”

Chevrolet’s Russell added, “Blackbird allows us to film wherever we want, whenever we want, and not worry about having cars or keeping the secret until the right time.”

The Mill Cyclops – Seeing visual effects before postproduction even begins.

The second tech contribution from The Mill on this project was Cyclops, The Mill’s proprietary virtual production engine. As a filmmaking tool, Cyclops blurs the lines of production and post. Directors and creatives are now able to work with finished-quality photoreal digital assets live on location. Until now, photoreal CG objects required days of rendering to produce high-quality imagery. Mill Cyclops allows realistic objects to be rendered and integrated into live-action footage instantly, in real time, drawing on the responsive nature of gaming to create interactive film.

Behind the Scenes: Chevrolet ‘The Human Race’ from The Mill on Vimeo.

The Mill’s Angus Kneale explained, “One of the biggest problems in visual effects is visualizing these effects on location or on set, when you’re shooting things. You can’t always find talking chameleons or get Diablo to turn up for a call time, so the visual effects business has always found inventive ways to help filmmakers and actors visualize what will eventually be there, using cardboard cutouts, clay sculptures, tennis balls for eye line; the list goes on. One of the most difficult things is you can’t actually tell what the final visual effect is going to look like when you’re actually shooting it. It takes days, weeks, even months until the final visual effect is actually done. After we’ve finished animating, lighting and rendering, it’s only then that you can truly visualize what’s happened.

“While all this is happening on our visual effects projects, our emerging technology team is using game engines for our augmented reality and virtual reality pieces. We started to think about how game engines could help us visualize visual effects in real time on set,” Kneale added. “So we started experimenting. All of this experimentation led to Cyclops, our custom virtual production toolkit. It utilizes real-time tracking and rendering to integrate photoreal CGI into live-action film.”

“The challenge is that when you look through the director’s viewfinder, you see the Blackbird,” said Epic Games’ Kim Libreri. “You don’t see a Camaro or whatever type of car you’re actually shooting. What they wanted to do is take the expertise they have in their real-time group, take our engine and enhance it to do professional-grade augmented reality. So instead of just seeing the Blackbird there, you see a fully lit, real-time vehicle overlaid on top of the Blackbird as they’re shooting. That was the initial goal on this project: to basically come up with techniques and visual fidelity good enough that the film director would be able to see the car that the Blackbird is meant to be transformed into as they’re shooting.”

On top of Blackbird is a camera array with a LIDAR laser scanner and four RED cameras that shoot high-resolution, high frame rate 360° video of the environment in which the car is moving. Footage from the camera array is processed by a computer on Blackbird, then transmitted in real time to Unreal Engine. Lighting, reflections, tracking and positioning data are fed into Epic’s Unreal Engine to produce realistic, nuanced and high-quality imagery of the CG car that is seamlessly and immediately integrated into Blackbird’s environment.
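
To make that concrete: once the 360° footage is stitched into a panorama, reflecting it off the CG car is, at its core, two small pieces of vector math: reflect the view direction about the surface normal, then look that direction up in the panorama. A simplified sketch follows, using textbook formulas and our own axis conventions, not The Mill’s code.

```cpp
// How a stitched panorama drives reflections, in two small steps:
// reflect the view direction about the surface normal, then convert
// that direction to (u, v) in an equirectangular (lat-long) panorama.
#include <cmath>
#include <cstdio>

const float PI = 3.14159265f;

struct Vec3 { float x, y, z; };

// r = v - 2 (v . n) n : the direction a mirror-like car panel "sees".
Vec3 reflect(Vec3 v, Vec3 n)
{
    float d = v.x * n.x + v.y * n.y + v.z * n.z;
    return { v.x - 2 * d * n.x, v.y - 2 * d * n.y, v.z - 2 * d * n.z };
}

// Unit direction -> (u, v) in [0,1]^2 of a lat-long panorama.
void directionToLatLongUV(Vec3 d, float& u, float& v)
{
    u = 0.5f + std::atan2(d.z, d.x) / (2.0f * PI);  // longitude
    v = 0.5f - std::asin(d.y) / PI;                 // latitude
}

int main()
{
    Vec3 view   {0.0f, -0.7071f, 0.7071f};  // camera looking down at 45 degrees
    Vec3 normal {0.0f, 1.0f, 0.0f};         // upward-facing panel on the car body
    float u, v;
    directionToLatLongUV(reflect(view, normal), u, v);
    // The engine would sample the live panorama at (u, v) to find the
    // real-world color this point on the CG car should reflect.
    std::printf("sample panorama at u=%.3f v=%.3f\n", u, v);
}
```

Doing this per pixel, with the panorama refreshed from the live feed, is what lets the CG car mirror the real street around the Blackbird.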

Boo Wong, group director of emerging technology at The Mill, described the shoot for “The Human Race.” “There’s a resident computer on the back of the Blackbird that is taking the video feeds, undistorting them—because they’re all fisheye lenses—and stitching them together into a panoramic environment with high dynamic range—we need high dynamic range for the lighting—and then beaming that wirelessly from the Blackbird into the camera truck. Unreal Engine is in the camera truck, receiving that panoramic environment and then turning it into a lighting environment in real time. It’s incredibly complicated just getting the wireless to work from the Blackbird to the camera trucks. There were unbelievable logistical challenges when we started this project, and we didn’t start that long ago. A few months ago, people were like, ‘Impossible. You’ll never do it. You’ll never track it. You’ll never get the wireless working. It’s never going to work.’ And it worked.”
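
The undistortion step Wong mentions amounts to mapping every fisheye pixel back to the 3-D ray the lens saw. A common first approximation is the equidistant (“f-theta”) model, where distance from the image center is proportional to the angle off the optical axis; a production pipeline would layer a calibrated lens model on top, so treat this sketch and its numbers as illustrative.

```cpp
// Equidistant ("f-theta") fisheye approximation: r = f * theta.
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

Vec3 fisheyePixelToRay(float px, float py,   // pixel coordinates
                       float cx, float cy,   // image center
                       float fPixels)        // focal length in pixels
{
    float dx = px - cx, dy = py - cy;
    float r     = std::sqrt(dx * dx + dy * dy);
    float theta = r / fPixels;               // angle off the optical axis
    float phi   = std::atan2(dy, dx);        // angle around the axis
    return { std::sin(theta) * std::cos(phi),
             std::sin(theta) * std::sin(phi),
             std::cos(theta) };              // +Z is the optical axis
}

int main()
{
    // A pixel halfway out toward the edge of a 180-degree fisheye frame
    // (f ~= 637 px puts theta = 90 degrees at a 1000 px image radius).
    Vec3 ray = fisheyePixelToRay(1500.0f, 1000.0f, 1000.0f, 1000.0f, 637.0f);
    std::printf("ray = (%.3f, %.3f, %.3f)\n", ray.x, ray.y, ray.z);
}
```

Stitching repeats this for every pixel of all four cameras, rotates each ray by its camera’s mounting orientation, and accumulates the results into one shared lat-long panorama, blending where the views overlap.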

“So all of the reflections, the interactivity with the light—all of that is actually happening for real,” said The Mill’s Alistair Thompson. “You’re taking the imagery from the cameras, which is being transmitted live, stitched into a 360° image and then put back over the Unreal Engine-generated CG model. And then we’re tracking that using a system called Arraiy. So it’s all allowing us to bring to life an object that isn’t actually there as if it really is in its environment. That will work for cars, in this instance, but it will work for characters and CG objects, or just about any other type of production. So the days of directors or clients or actors on set being very confused as to what was actually there with them and what they’re interacting with are gone. This toolkit will solve that problem for everybody.”

Epic Games’ Kim Libreri detailed the technology powering Cyclops’ real-time visual effects visualizations. “It’s all well and good to say that your game engine is physically-based and produces realistic pixels, but that all falls apart when you try to combine things with live action. Live action is photorealistic because it is real, and it is a photograph. Making sure that the engine’s physically-based shading models and lighting models were absolutely a match for reality took a little bit of work. You’ll notice straight away if something doesn’t fit into the environment.”

Libreri mentioned some of the technologies implemented in Unreal Engine to accommodate this project, including Bent Normal Ambient Occlusion with reflection occlusion, Fast Fourier Transform blooms, Dynamic Skylight Image-Based Lighting, multiple streams of uncompressed EXR images, multi-element compositing, compatibility with NVIDIA Quadro graphics cards, and support for Google Tango-enabled devices. These technologies will be rolled out in Unreal Engine later this year.
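
Of these, the Fast Fourier Transform bloom is the easiest to illustrate. A bloom is a convolution of the image with a wide glow kernel, and the convolution theorem lets the engine do it as a cheap pointwise multiplication in frequency space, so the cost no longer grows with kernel width. Here is a 1-D toy version, using a deliberately naive O(n²) DFT in place of a real FFT (it computes the same transform, just slower):

```cpp
// 1-D toy of the convolution theorem behind an FFT bloom.
#include <cmath>
#include <complex>
#include <cstdio>
#include <vector>

using cd = std::complex<double>;
const double PI = 3.141592653589793;

std::vector<cd> dft(const std::vector<cd>& in, int sign)  // sign: -1 fwd, +1 inv
{
    size_t n = in.size();
    std::vector<cd> out(n);
    for (size_t k = 0; k < n; ++k)
        for (size_t t = 0; t < n; ++t)
            out[k] += in[t] * std::polar(1.0, sign * 2.0 * PI * double(k * t) / double(n));
    return out;
}

int main()
{
    const int n = 16;
    std::vector<cd> image(n), kernel(n);
    image[8] = 100.0;                     // one bright pixel: a specular highlight
    for (int i = 0; i < n; ++i) {         // wide Gaussian-ish glow, wrapped for
        int d = (i < n - i) ? i : n - i;  // circular convolution
        kernel[i] = std::exp(-0.5 * d * d / 4.0);
    }

    // conv(image, kernel) = IDFT(DFT(image) * DFT(kernel))
    auto I = dft(image, -1), K = dft(kernel, -1);
    std::vector<cd> prod(n);
    for (int i = 0; i < n; ++i) prod[i] = I[i] * K[i];
    auto bloom = dft(prod, +1);

    for (int i = 0; i < n; ++i)
        std::printf("%5.1f ", bloom[i].real() / n);  // 1/n normalizes the inverse
    std::printf("\n");  // the single highlight is now a soft glow
}
```

In the engine this runs in 2-D on the rendered frame, but the principle is identical.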

“A lot of work went into matching everything, in a similar way to what we would do for movies, but the entire frame on this piece has to be generated in 41 milliseconds,” Libreri added. “It’s a 24-frame-a-second cinematic piece. We’re streaming EXRs off disk. Our data throughput just in terms of the EXRs is 1.3 gigabytes a second. Which, when I was growing up in the industry, sounds like something from Star Trek.”
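
Those numbers check out on the back of an envelope. At 24 fps, each frame gets 1000/24 ≈ 41.7 ms, and a single uncompressed 1080p half-float RGBA stream already costs roughly 0.4 GB/s. The resolution and channel layout below are illustrative assumptions; the shoot’s actual formats weren’t stated.

```cpp
// Back-of-envelope check of the quoted numbers.
#include <cstdio>

int main()
{
    double budgetMs = 1000.0 / 24.0;  // per-frame budget at 24 fps: ~41.7 ms

    double bytesPerFrame = 1920.0 * 1080.0 * 4 /*channels*/ * 2 /*bytes per half float*/;
    double oneStreamGBs  = bytesPerFrame * 24.0 / 1e9;  // ~0.40 GB/s per stream

    std::printf("frame budget: %.1f ms\n", budgetMs);
    std::printf("one uncompressed 1080p stream: %.2f GB/s\n", oneStreamGBs);
    // Three or so simultaneous streams (plates, lighting panoramas,
    // extra elements) lands in the ~1.3 GB/s range Libreri quotes.
}
```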

Cyclops uses real-world lighting and real-world reflections in order to integrate real-time CGI into the environment.

Chevrolet’s Sam Russell described the appeal of Cyclops technology on an automotive shoot. “When we’re in a beautiful city, we’re in a canyon, we’re in a desert, and we want to film the car, we want to be able to see through the camera when we’re capturing images what that new shiny sheet metal’s going to look like in that environment. Blackbird doesn’t allow us to do that—what we’ll see is that car [the Blackbird] and not the one we want. By adding the technology that Epic has with the Unreal Engine, we are now able to do a live render and, through the [camera] viewfinder, actually see what that new vehicle looks like in that environment. And that to me is the real game-changer. The future of what we can do with this technology is quite exciting.”

State of Unreal | GDC 2017 | Unreal Engine.
(“The Human Race” presentation begins at 16:37.)

The Mill’s Alistair Thompson led a demonstration of the live effects visualization technology at the Game Developers Conference on March 1. Blackbird, configured to the dimensions of the Chevrolet Camaro ZL1, spun on a turntable onstage, while monitors throughout the auditorium showed a real-time visualization of a Camaro ZL1 in its place.

“What you see through the camera here is a virtual representation of the car,” said Thompson. “It’s being tracked using laser tracking software through Arraiy. We’re using The Mill Cyclops technology to take the four cameras on top, live stitch those together into a 360° image, and then wrap that into the virtual CG here that you see. And you’re all enjoying seeing your reflections. That’s the beauty of this. It is completely live, completely interactive. A car is basically a mirror on wheels. It’s seeing the world around it.

“As Sam [Russell] mentioned, this changes things for filmmakers. If you’re a director, you’re a DP on set, you don’t have to worry about the Blackbird. What you see through the camera is the car that you need it to be.”

“The Human Race” showcases the incredible technical collaboration between Epic Games and The Mill.

“We asked ourselves, what if we tried to do an entire commercial in Unreal Engine?” said Epic’s Kim Libreri. “Instead of just using it as a visualization capability for while you’re shooting, what if we could take the data from the shoot—the background plates shot on ARRI camera, the 360° video that the Blackbird itself shoots for lighting and reflection information—take that into Unreal Engine and generate the cars shot by shot by shot just like you would do in a nonlinear editor and visual effects pipeline. What we ended up doing is extending the engine considerably to take broadcast-quality or movie-quality EXR frames—as background plates, as lighting environments—and be able to have them, as computer-generated objects in Unreal Engine, cast shadows onto the actual photograph plates, have the photograph plates cast reflections and lighting onto the computer graphics.”
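
The “cast shadows onto the actual photograph plates” part that Libreri describes is, at the pixel level, the classic shadow-catcher composite. The sketch below shows only the textbook formula, not The Mill’s implementation, which would be far richer (soft shadows, colored bounce, reflections).

```cpp
// Per-pixel "shadow catcher" composite: the textbook way a rendered
// object casts shadows onto a photographed plate.
#include <cstdio>

struct Rgba { float r, g, b, a; };

// shadow: 1 = light fully reaches the plate, 0 = fully blocked by CG.
Rgba compositeWithShadow(Rgba cg, Rgba plate, float shadow)
{
    float bg = shadow * (1.0f - cg.a);  // plate contribution, attenuated
    return { cg.r * cg.a + plate.r * bg,
             cg.g * cg.a + plate.g * bg,
             cg.b * cg.a + plate.b * bg,
             1.0f };
}

int main()
{
    Rgba plateRoad {0.50f, 0.48f, 0.45f, 1.0f};  // grey asphalt in the photograph
    Rgba noCg      {0.0f, 0.0f, 0.0f, 0.0f};     // no CG covers this pixel...
    Rgba px = compositeWithShadow(noCg, plateRoad, 0.4f);
    // ...yet the photographed road still darkens where the CG car's
    // shadow falls: the render is shadowing the real plate.
    std::printf("shadowed road: %.2f %.2f %.2f\n", px.r, px.g, px.b);
}
```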

What’s Next for Real-Time Visual Effects?
What’s the next step for this technology outside the world of automotive advertising? “Blackbird is a prototype. Our big thing at the moment is figuring out how to turn it into a much more available production tool,” The Mill’s Alistair Thompson said. “We want it to become as familiar a sight on shoots as the pursuit vehicles you see with the Technocranes on them—the Russian arms, as they’re usually known. So that’s one thing for automotive, but we can apply the same technology to on-location shooting. You use similar technology if you need to see an object or a character during shooting—you just don’t need a massive camera array stabilization system on a car to do it. You can work with a much smaller version of that to actually understand the environment that you’re in.

“I think that’s the big difference between this and other systems out there,” Thompson continued. “A lot of other systems have been based on shooting in studios and re-creating the entire environment in CG; our system is very much about blending the world of live action and traditional visual effects, and all of that with the flexibility of real-time-generated CG characters.”

“At The Mill, we believe in moving people through moving image. A big part of our business is photoreal CGI blended with live action,” said Angus Kneale. “We believe that the story is the most important thing—visual effects should never get in the way of narrative. We’re constantly innovating and looking for ways to engage the audience. We’re at the point now where we can create real-time CGI that looks really, really good, and it’s only going to get better. Using Unreal Engine, we’re able to create photoreal interactive experiences like never before. At The Mill, we believe not only that the future of visual effects is real time—we believe that people will want to customize their content and engage with their media so that it is relevant to them. We see this as a huge opportunity for studios, brands and storytellers alike moving forward.”
