Immersion Learning: Advances in Virtual Reality and 360° Video

There's a tendency these days to combine virtual reality and 360° video into a single category. Maybe people blur the distinction because the viewing headset is often the same, but that's where the similarities end. Virtual reality is an idea that's been around for 50 years or more, though it's most commonly associated with real-time interactive simulation—video games, training and simulation work—that first became technically feasible in the 1980s. Spherical video is a more recent concept, and it can involve some pretty upscale broadcast engineering if there's a desire to do it live.

The difference for the audience is the degree of interactivity. In VR, we can go anywhere and look at anything from any angle, interacting with things as we would in a video game. In 360° video, we can only be where the camera is; further, there's generally no interaction with the surrounding world, and it's often a shared experience much like any live event. The purpose of these two mediums is clearly not the same, and the techniques and equipment required to produce exceptional material for either are completely different. The industry has only just begun studying when and how to deploy 360° video for compelling storytelling experiences, and a thorough understanding of the topic is still a ways off. It's becoming clear that simply placing a viewpoint in a fixed position at the edge of a sports arena is far less interesting than watching a conventional broadcast of the same event.

"Virtual reality" meant the playback of recorded experiences in 1983's Brainstorm, with Natalie Wood and Christopher Walken.

When VR was first employed for simulations of the physical world in the late 1980s, with early applications in medicine, aviation and robotics, computing power was a hugely limiting factor. It was simply not possible for computers of the day to render lifelike imagery in real time—not at 30 frames per second, not even at 1 frame per hour. As a consequence, environments were often constructed of wireframes or line drawings.

Recent advances in computing and rendering are narrowing the authenticity gap. The quality of real-time cinematics for video games has started to approximate some of the more traditional visual effects of feature films. It's worth emphasizing what that means: visual effects rendered in real time, as a game is played, are approaching the scope of visual effects rendered offline by a farm of computer processors for a feature film.

The Human Race, an augmented reality project in which the photoreal cars were rendered in real time with Unreal Engine

Some of this feature parity is due to work from Epic Games, whose Unreal Engine is commonly used to create big-budget games for multiple platforms. Unreal Engine's influence reaches far and wide in the media and entertainment industry; the technology is now being adopted in other types of real-time, interactive applications—including enterprise functions such as architectural visualization and cinematic experiences such as VR and episodic television.

In addition to announcing Unreal Engine integration with broadcast partners including Ross Video, Vizrt and Zero Density, Epic Games general manager Marc Petit used the recent NAB Show to introduce the company's enterprise division, which focuses on use of the tech in markets other than games. Additionally, users took to the stage to discuss how Unreal Engine is helping to break new ground in episodic entertainment production and delivery.

The Frontier Virtual Engine from Ross uses Unreal Engine, bringing photorealistic rendering, particle systems and physics to the broadcast virtual set. Previewed at IBC 2016, Frontier was developed in partnership with The Future Group.

Non-gaming uses of Unreal Engine include producing real-time photoreal images of moving cars for a Chevrolet augmented reality experience, rendering Rogue One: A Star Wars Story's K-2SO droid in real time to aid the film's production process, and turning Mattel's Barbie into a prolific YouTube vlogger.

None of this has anything to do with headset-oriented VR, but the technological link is clear: if games now look good enough to cross over with (some types of) film production, the gap between the interactive and non-interactive worlds can be bridged.

Nokia OZO camera at NAB Show 2017
(Photo by Al Powers)

An Unreal application linked rather more directly to the world of virtual and augmented reality is Vizrt's Viz Virtual Studio. The system uses Epic's rendering engine in conjunction with camera tracking technology from Ncam to fill an LED video wall behind the talent with a perspective-correct view of a virtual space, as well as to place CG-rendered objects anywhere in the image.
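
Vizrt hasn't published its implementation details, but the core technique behind a perspective-correct LED wall is well established: treat the wall as a window into the virtual scene and rebuild an asymmetric (off-axis) view frustum from the tracked camera position every frame. Below is a minimal sketch of that projection math, using hypothetical wall and camera coordinates; this is the textbook "generalized perspective projection," not Vizrt's actual code.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def off_axis_frustum(pa, pb, pc, pe, near):
    """Asymmetric view-frustum bounds at the near plane for a flat screen
    defined by three corners (pa: lower-left, pb: lower-right, pc: upper-left),
    as seen from eye/camera position pe."""
    vr = normalize(pb - pa)            # screen-space right axis
    vu = normalize(pc - pa)            # screen-space up axis
    vn = normalize(np.cross(vr, vu))   # screen normal, pointing toward the eye
    va, vb, vc = pa - pe, pb - pe, pc - pe
    d = -np.dot(va, vn)                # perpendicular eye-to-screen distance
    left   = np.dot(vr, va) * near / d
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d
    return left, right, bottom, top

# Hypothetical 4 m x 2.25 m wall; camera tracked 3 m back and 0.5 m right.
pa, pb = np.array([-2.0, 0.0, 0.0]), np.array([2.0, 0.0, 0.0])
pc, pe = np.array([-2.0, 2.25, 0.0]), np.array([0.5, 1.2, 3.0])
print(off_axis_frustum(pa, pb, pc, pe, near=0.1))
```

Feeding these bounds into a standard perspective matrix each frame is what keeps the rendered backdrop's parallax correct as the tracked camera moves.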

Spherical/360° video remains a nascent technology probably because its audience is so small—a problem that's likely to continue until headsets reach a price-performance ratio more acceptable to mass consumers. Nokia's OZO camera remains dominant because it's an off-the-shelf device with good solutions for stitching (the process of taking the views of multiple cameras and combining them into a viewable 360° video image). Its price—$40,000, though early adopter discounts are available—puts it well out of the reach of casual users.
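
OZO's stitching toolchain is proprietary, but the basic pipeline is common to all multi-camera rigs: detect matching features in overlapping views, warp each camera's image onto a shared projection, and blend the seams. Here's a minimal sketch using OpenCV's high-level stitching API; the filenames are hypothetical, and a general-purpose stitcher produces flat panoramas, whereas production 360° work needs calibrated rigs and equirectangular output.

```python
import cv2

# Overlapping frames from adjacent cameras in the rig (hypothetical filenames)
frames = [cv2.imread(f) for f in ("cam0.jpg", "cam1.jpg", "cam2.jpg")]

# OpenCV's Stitcher handles feature matching, warping and seam blending.
# (In OpenCV 3.x the constructor is cv2.createStitcher() instead.)
stitcher = cv2.Stitcher_create()
status, panorama = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("stitched.jpg", panorama)
else:
    print("Stitching failed with status", status)
```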

Three Lucid VR LucidCams in 360 rig

OZO has competition in this rapidly expanding market. Insta360's Pro camera, at $3,500, is in a significantly different price bracket; it offers 8K pictures for less than 10 percent of the price of an OZO. Lucid VR's LucidCam is an even less expensive contender. A single LucidCam (expected to ship in July for $499) captures 180°; three LucidCams are mounted in a triangular pattern for 360°. Resolution of LucidCam footage is 4K per eye.
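
The coverage arithmetic for such a rig is simple: with cameras spaced evenly around a circle, each seam's overlap is the per-camera field of view minus the angular spacing, and that overlap is what the stitcher has to work with. A quick check using the figures above:

```python
def seam_overlap(fov_deg, num_cameras):
    """Overlap (degrees) between adjacent cameras spaced evenly on a circle."""
    spacing = 360.0 / num_cameras
    return fov_deg - spacing

# Three 180-degree cameras at 120-degree spacing:
print(seam_overlap(180, 3))  # 60 degrees of overlap per seam for stitching
```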

With its elevated picture requirements, 360° video is perhaps the application best suited to modern cameras' ability to capture very high-resolution images.

The resolution required for 360° video is something of a bugbear, a fact that Ericsson has been demonstrating at events like CES, presumably because they're keen on everyone having a 5G cellphone contract. 360° images intended for headset display are still recorded and transported as rectangular video frames, with the geometry of the image modified (typically via an equirectangular projection) to encompass the complete sphere. As a result, a relatively small area of that rectangular image can end up filling a large portion of the resulting headset view. If we compare the results from a 4K source with those from an 8K source, even with current, fairly low-res headsets achieving a resolution of under 2K per eye, the difference is clear. 360° video requires high resolution for good results.
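
The numbers are easy to verify. An equirectangular frame spreads its full width across 360 degrees, while a headset displays only a slice of that at any moment. A short sketch, where the 100-degree field of view is an assumption (actual values vary by headset):

```python
def pixels_in_view(frame_width, fov_deg, span_deg=360.0):
    """Horizontal source pixels that land inside the headset's field of view."""
    return frame_width * fov_deg / span_deg

for label, width in (("4K", 3840), ("8K", 7680)):
    visible = pixels_in_view(width, fov_deg=100)
    print(f"{label} ({width} px wide): ~{visible:.0f} px across a 100-degree view")

# 4K (3840 px wide): ~1067 px across a 100-degree view
# 8K (7680 px wide): ~2133 px across a 100-degree view
```

A 4K spherical source thus delivers only about 1,000 pixels of horizontal detail to the visible view, below even today's sub-2K-per-eye headsets, while an 8K source roughly matches them.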

All Mobile Video and VRLIVE collaborated to transmit a live 360° video link between the Super Bowl in Houston and a U.S. Army unit stationed in Poland, allowing three families to virtually watch the game together.

Orah is promoting its 4i 360° camera particularly for live work, something for which there's been a small but noticeable push over the last six months. One particularly upscale example of live production involved a collaboration between All Mobile Video and VRLIVE that created a live 360° video link between the Super Bowl in Houston and a U.S. Army unit stationed in Poland, allowing three families to virtually watch the game together. This event, plus behind-the-scenes material, was used to create a presentation sponsored by Hyundai and broadcast at the end of the game. Recalling the resolution issue, we find that AMV had to use four HD-resolution satellite links in parallel, each transmitting an HD-sized quarter of the 4K 360° material from OZO cameras.
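
The quadrant approach works because the arithmetic is exact: 3840x2160 divides into four 1920x1080 tiles, one per HD link. A minimal sketch of the slicing (the satellite transport and frame-accurate reassembly are, of course, the hard part):

```python
import numpy as np

frame = np.zeros((2160, 3840, 3), dtype=np.uint8)  # one 4K video frame

h, w = frame.shape[0] // 2, frame.shape[1] // 2    # 1080, 1920
quadrants = [
    frame[:h, :w],   # top-left      -> link 1
    frame[:h, w:],   # top-right     -> link 2
    frame[h:, :w],   # bottom-left   -> link 3
    frame[h:, w:],   # bottom-right  -> link 4
]
for i, q in enumerate(quadrants, 1):
    print(f"link {i}: {q.shape[1]}x{q.shape[0]}")  # each 1920x1080, i.e. HD
```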

A more purpose-built solution is Immersed Live's MCC-1 truck, which uses Blackmagic equipment to provide a complete 12-gigabit SDI workflow able to handle both high resolution and high frame rates. The truck is described as "hybrid VR" in that it unifies 4K "flat video" and 360° production in a single live operation. Inside the truck, dual production bays allow for separate direction of flat 4K video and 360° video, but the 360° operators are able to access all flat video assets and feeds. Integrating these traditionally parallel operations ensures that neither is disadvantaged in favor of the other. It also avoids the issue we mentioned earlier: watching a live event from a static, albeit 360°, viewpoint is often less fun than watching a conventional broadcast. MCC-1 allows Immersed to fully integrate graphics, conventional video and 360° video, delivering a greater level of immersion and entertainment for audiences.
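
The 12-gigabit figure is what makes single-link 4K practical: uncompressed 2160p60 video at 10-bit 4:2:2 needs roughly 10 Gbps of active payload, which fits a 12G-SDI link where older 3G-SDI cannot. A rough estimate, ignoring blanking and ancillary data:

```python
def payload_gbps(width, height, fps, bits_per_sample, samples_per_pixel):
    """Approximate active-picture bit rate, ignoring blanking/ancillary data."""
    return width * height * fps * bits_per_sample * samples_per_pixel / 1e9

# UHD 2160p60, 10-bit 4:2:2 (two samples per pixel: Y plus alternating Cb/Cr)
rate = payload_gbps(3840, 2160, 60, 10, 2)
print(f"~{rate:.1f} Gbps")  # ~10.0 Gbps: fits 12G-SDI, far exceeds 3G-SDI
```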

Immersed Live's "hybrid VR" MCC-1 truck has separate bays for flat and 360° video. It was deployed for its first event in April.

The key thing to realize in mid-2017 is that while we're enjoying worthwhile progress in both VR and 360° technology, there's so far been no reason to consider them as anything other than fundamentally quite separate. Whether that means that these related disciplines might have very different destinies remains to be seen. VR as an adjunct to video games works right now and seems well placed to succeed, assuming the price-performance ratio of headsets can improve significantly. Whether the home viewer has any tolerance for yet another new technology, this one for live viewing of 360° video, is less certain.  
