This past December, Vic Love was headed to Jamaica to shoot
footage for a documentary about the history of reggae music. With
him traveled a tech-averse producer and an assistant who was to
function as both the camera assistant and the primary audio
engineer. In Jamaica, the team would hire a driver.
For most run-and-gun documentary projects, this would be a
customary (if not always sufficient) crew. Perhaps a typical
project of this nature would also have another camera operator and
a second camera. No second operator for Love's production, but he
did bring a second camera — it was rigged to the first. Upon
completion, this documentary will be viewed in 3D.
Love, a stereographer based in Los Angeles, was to function as
the director of photography, the digital imaging technician (DIT),
and the director of 3D photography. In Jamaica, he'd be composing
shots, maneuvering the two-camera rig, and manipulating the
interocular distance between the two cameras — all more or
less on the fly. The latter task requires the stereographer to
adjust the distance between the two cameras using the electronics
of the rig, thereby emulating the separation of our eyes. Typically
about 2.5in., that space between our eyes is what gives depth to
our visual perception; a similar distance between stereo cameras
creates the illusion of depth when we don 3D glasses.
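The geometry behind that illusion is simple enough to sketch. In a parallel two-camera rig, the disparity between the left and right views shrinks with subject distance, and the spread between near and far disparities is the depth the audience perceives. The following Python sketch uses illustrative numbers only; the focal length and subject distances are assumptions, not specs from Love's rig:

```python
def sensor_disparity(interaxial_m, focal_m, distance_m):
    """On-sensor disparity for a parallel stereo pair.

    Classic pinhole geometry: nearer objects shift more between
    the two views, which the brain reads as depth.
    """
    return interaxial_m * focal_m / distance_m

# Illustrative numbers only (assumed, not from the actual rig):
b = 0.0635                             # 2.5 in. interaxial, in meters
f = 0.008                              # 8 mm lens
near = sensor_disparity(b, f, 2.0)     # subject 2 m away
far = sensor_disparity(b, f, 20.0)     # background 20 m away
print(near, far)                       # the near/far spread is the depth budget
```

Riding the interaxial on the fly, as Love describes, amounts to scaling that disparity up or down to keep the depth comfortable for a given shot.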
According to Love, manipulating interocular distance is akin to
pulling focus — except that you're pulling stereo. “You
know how to ride it,” he says. Even most smaller HD cameras
— the Sony PMW-EX3, for instance — are too bulky to
achieve that 2.5in. interocular distance between the centers of the lenses.
Love uses a rig with two Iconix Video HD-RF1 cameras side by
side. These 1920×1080 3CCD box cameras capture images to a
Sony SRW-1 HDCAM SR tape deck — which, with its dual-link
HD-SDI connection, can record the stereo 4:2:2 streams. Measuring
at 1.32"×1.50"×1.92" and weighing 2.5oz. each, the cameras
are tiny enough to sit side by side in a rig and achieve the
correct interocular distance, which obviates the need for a
beam-splitter rig. (In that configuration, one camera points down
at a mirror that reflects roughly the same image that is captured
by the forward-pointing camera.) Of course, Love's setup includes
more than just two cameras. There are also the Iconix camera-control
units, which he straps to his side for handheld shots, and the
portable SRW-1 deck, which is on his back.
In Jamaica, Love also used a prototype Stereo Image Processor from 3ality Digital, which alerts the stereographer to any color, iris, or zoom mismatches between the two cameras. “The name of the game is to get two cameras to behave like one,” Love says. He says it takes 15 minutes to align the images properly, which poses a challenge for an unscripted production such as a music documentary.
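How the Stereo Image Processor detects those mismatches isn't public, but the basic idea of comparing the two eyes can be sketched in a few lines. This hypothetical Python example flags an exposure or color mismatch by comparing per-channel means between the left and right frames; the tolerance and the frame format are assumptions for illustration only:

```python
def channel_means(frame):
    """Mean R, G, B over a frame given as rows of (r, g, b) pixels."""
    n = 0
    totals = [0.0, 0.0, 0.0]
    for row in frame:
        for px in row:
            for c in range(3):
                totals[c] += px[c]
            n += 1
    return [t / n for t in totals]

def mismatch(left, right, tol=2.0):
    """Flag a color/exposure mismatch when any channel's mean
    differs by more than `tol` code values between the two eyes."""
    lm, rm = channel_means(left), channel_means(right)
    return any(abs(a - b) > tol for a, b in zip(lm, rm))

# Two tiny synthetic "frames": the right eye is 5 code values darker,
# as if its iris were slightly closed relative to the left eye's.
left  = [[(120, 110, 100)] * 4] * 3
right = [[(115, 105,  95)] * 4] * 3
print(mismatch(left, right))   # the eyes don't behave like one camera
```

A real processor works on live HD-SDI streams and also checks geometry and zoom, but the principle is the same: measure both eyes, compare, and alert the stereographer to any drift.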
Still, Love maintains that shooting in stereo is not the black art it might resemble from a distance. He says that over the course of the production, he was able to show the ropes of his niche to the camera assistant/audio engineer, Clifford Cruz. “He took to it pretty well,” Love says. “It's not rocket science. If you have a background in video and film production, it's not counterintuitive.”
Love is undertaking this project in partnership with Stereoscope, a Burbank, Calif.-based postproduction company that's affiliated with the S3D Studio. The outlet for this Jamaica production is not exactly clear — the cineplexes that typically run independent documentaries don't have 3D-enabled theaters outfitted with RealD projector systems. (RealD currently holds 97 percent of the domestic digital 3D market, according to the company, with more than 800 screens in the United States.) The home market is nowhere near ready to adopt stereoscopic 3D (S3D) technology. Hannah Montana/Miley Cyrus: Best of Both Worlds Concert Tour, one of the big live-action 3D success stories of the past year, used '50s-style red-and-blue anaglyph technology in its DVD release. The 3D Super Bowl commercial for DreamWorks' Monsters vs. Aliens, and the accompanying 125 million giveaway glasses (using Intel InTru 3D and ColorCode 3-D technologies, not anaglyph), should make progress toward making the mainstream viewing public comfortable with the idea of watching 3D in the home. The 2009 CES show proved that 3D-ready HD sets are in the offing, but that's only one piece of the puzzle. The technology is here, but outside of a few notable examples, there exists neither a supply of content nor the demand for it.
So for Love, the fact that he's working on an independent stereoscopic documentary puts him way past the cutting edge — almost into a three-dimensional free-fall. Luckily, he enjoys it. “I love the pioneering aspect of it,” he says. “I felt like an Army Ranger dropped into the middle of a war zone.”
And it's not the first time he's been there, Iconix rig in tow. In May 2007 in Hawaii, he mounted a similar production that yielded about 7 hours of airborne footage. Love and his pilot attached a rig to the front of an ultralight plane to shoot Hawaiian beaches and other terrain. They also attached it to the wing to shoot the tiny cockpit, throwing the ultralight into heavy 3D relief against the ever-changing vista. (3ality has used this high-flying, immersive footage in its demo reel, which has played at events such as the Bowl Championship Series [BCS] football game's broadcast to theaters in January.)
But for now, most of the action — for Love as a freelance stereographer and for the industry in general — is at the high end. Starting next March with Monsters vs. Aliens, all of DreamWorks' animated features will be released in theaters in 3D. James Cameron and Fox have begun production on a $220 million project that's expected to serve as a watershed for this current stereoscopic renaissance: Avatar, a live-action/animation hybrid that relies on motion capture. (This production is using Burbank-based Pace's Fusion 3D camera, which was developed by Vince Pace and his company's team in collaboration with Cameron himself.)
“Certainly there's been great excitement around the 3D releases so far, and we've seen an uptick in the 3D business with every movie,” says Bruce Long, president of S3D Studio. “But there's such anticipation [with Avatar] because it seems every time Jim Cameron makes a movie, it moves the bar.”
Long, who was CEO of Iconix until December 2008, has participated in the current S3D resurgence as a manufacturer and as a content producer. Recently, S3D has shot several stereo productions that aren't aimed at the cineplex, including a commercial for eMotion studios for the CES floor and a proof-of-concept demo involving three stereo Iconix rigs and two grappling mixed martial artists. “We're committed to working in 3D, but we're not naive enough to think that 3D will carry the day over the next 18 months,” Long says. “We believe it's going to take a little longer to deploy than we thought a year ago.”
For that reason, he says, S3D Studio and its associated but independent postproduction company Stereoscope are diversifying their business models to incorporate 2D productions as well, as the companies develop ways to make stereo productions easier and cheaper to do. The ultimate goal, according to Long, is to get 3D budgets in line with those of traditional 2D productions. “Then and only then will the financial support for distribution be able to make it over the hump,” he says.
Whether or not Avatar starts a stereoscopic craze in the realm of dramatic films, there's likely to be another area that sees a surge in viewer interest and production: live sports. The NBA All-Star Game in February 2008 was the first live sporting event to be transmitted in S3D, though its reach was limited to a 500-capacity theater at the Mandalay Bay Resort and Casino in Las Vegas. To shoot the event, Pace employed five Fusion 3D HD cameras, based on stereoscopic pairs of Sony HDC-F950s.
Pace's main crosstown competitors also use Sony cameras. 3ality is perhaps best known for its production of U2 3D, a concert film that's considered a triumph of stereoscopic technology. That project required months of painstaking postproduction in order to correct minute deviations in camera sync and to adjust the depth of frames across cuts.
Shooting live, of course, there are no such second chances for adjustments. And recently, 3ality has dived headlong into live stereoscopic sports production. Luckily, the company, founded in 2000, has been perfecting on-the-fly camera calibration all along. 3ality founder and CEO Steve Schklair says that on the production end, the most expensive aspect of shooting live stereo 3D is the years of R&D that have gone into the software that enables live cuts from camera to camera. “It's really easy when you're shooting 3D to change depths,” Schklair says. “You've got a director in an [outside broadcast] truck cutting from camera to camera, and you're not sure which camera they're going to go to. If the depth is different on each of those cameras, it'll tear people's eyes out making those changes.”
That's the depth problem, and its solution required eight stereographers when 3ality shot the Dec. 4, 2008, NFL game between the San Diego Chargers and the Oakland Raiders for a live broadcast to movie theaters in Boston, New York, and Los Angeles. 3ality used eight dual-Sony HDC-1500 beam-splitter rigs in a split-block configuration, which means the optical block is extracted from the body and wired back to it in order to make the rigs smaller. Each of those eight stereographers was assigned a single stereo camera rig, and the directive was to pull convergence, matching that camera's depth to the program camera's, so as to minimize viewer headaches.
Love, who worked this game as a stereographer, sat in the truck with seven other convergence operators and manipulated sliders to control the convergence angle of the camera to which he was assigned. Such a labor cost is probably not sustainable for a broadcast industry that's ruled by quarterly profits. But 3ality has made quick progress in its foray into live sports. Even at the December NFL game, 3ality was plotting behind the scenes how to cut down on the unwieldy manpower costs. As a proof of concept, a rack-mounted SIP2900 unit in the truck was doing the job of the eight convergence pullers. “It's now handling automated depth balancing so that we're using image processing to read the depth of every shot, and all the cameras are slave to the program camera depth,” Schklair says. “So no matter what the director cuts to, there won't be a jump in depth.”
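The internals of the SIP2900's automated depth balancing are proprietary, but the cut-matching idea Schklair describes can be sketched: read the program camera's convergence depth and drive every other rig toward it, so no cut produces a jump in depth. A minimal Python sketch, with all rig parameters hypothetical:

```python
import math

def toe_in(interaxial_m, convergence_m):
    """Half-angle each camera rotates inward so the optical axes
    cross at the convergence distance (the zero-parallax plane)."""
    return math.atan((interaxial_m / 2.0) / convergence_m)

def slave_rigs(rigs, program_rig):
    """Match every rig's convergence distance to the program camera's,
    so a live cut between cameras doesn't tear the viewer's eyes."""
    target = program_rig["convergence_m"]
    for rig in rigs:
        rig["convergence_m"] = target
        rig["toe_in_rad"] = toe_in(rig["interaxial_m"], target)
    return rigs

# Hypothetical sideline rigs, each converged somewhere different...
rigs = [{"interaxial_m": 0.06, "convergence_m": 3.0},
        {"interaxial_m": 0.065, "convergence_m": 12.0}]
# ...until they're slaved to the program camera's depth.
slave_rigs(rigs, {"convergence_m": 5.0})
```

In practice the processor reads depth from the images themselves rather than from a stored setting, and it would run this loop continuously rather than once, but the principle is the same as the eight humans pulling sliders in the truck.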
The next live football event in 3D was the BCS game between the Oklahoma Sooners and the Florida Gators on Jan. 8 in Miami. 3ality was ready to roll with only one convergence puller and an assistant. The SIP2900 was ready for prime time. For live 3D sports to have a viable future, these budget-cutting measures will have to continue. “The more we do it, the less expensive it becomes,” says Jerry Steinberg of Fox Sports, which broadcast the BCS game. “And the less expensive it becomes, the more we can do.”
On the evening of Jan. 8, I sat in a mostly full theater in Brooklyn with about 100 college football fans and other technologically curious viewers wearing RealD's bulky, polarizing glasses. (The polarization effectively blocks the right eye's view from the left eye and vice versa — if you close your right eye and look at yourself in a mirror, the left lens will appear transparent and the right will be almost opaque. In the theater, the projector alternates left and right eye frames at a very high rate; the liquid-crystal screen in front of the single projector's lens polarizes the image circularly: clockwise for the right eye, and counterclockwise for the left.) Like most of the folks in Sooners caps and Gators jerseys, I'd paid my $24 and was eager for a head trip to enhance some superior football.
For the most part, 3ality succeeded on this count, though the audience did audibly protest at several brain-bending moments. But watching the game in 3D opened up much more than just a visual dimension — it helped a non-football fan understand the nuances of the game a lot better. Viewing players in 3D on a theater screen, I came to better appreciate the various body types of football players and their roles on the field. (Seeing the massive yet agile left tackles in three dimensions, for instance, brought home the reason these rare specimens make almost as much money as quarterbacks.) The lower-to-the-ground coverage by the Sony rigs enhanced the stereo effects as running backs leapt out of the screen. Hits were more brutal.
As with HD, many expect live sports to be a killer app for the home adoption of S3D. If the unparalleled popularity of pro football doesn't trigger a surge in adoption of stereo 3D in the home, there are still videogames and the DVD versions of the upcoming theatrical S3D releases. At this point, the technology is still developing, but it is quite viable today. The content is piling up slowly but surely. The demand is slowly emerging — in January, My Bloody Valentine 3D did more business in 3D than in 2D. Outside of the 800-or-so 3D-equipped theaters, however, the distribution channels are unclear. But that's not stopping independent content producers such as Love from suiting up with heavy yet sensitive gear and dropping into foreign locales, pulling stereo, and troubleshooting as they try to frame their shots.
If the demand for stereoscopic 3D content is to continue its growth, skeptics are going to have to be won over. It's not HD — naysayers can't be convinced of a new format's superiority the first time they walk by a football game playing on a new television at Best Buy. For the most part, they're going to need glasses.
How are theater chains going to convince their customers to pony up the premium to view a 3D title for the first time? A company called Alioscopy thinks it has the solution.
Glasses-free autostereoscopic 3D is the holy grail of stereoscopic display, and as such, it's an expensive, bleeding-edge proposition in 2009. Still, for certain public display environments, no expense is too great — witness the success of Panasonic's 103in. plasma. Alioscopy is targeting movie-theater lobbies as one environment that's worth the price of glasses-free display. And the company's current offering is a technological leap forward. It's not stereo, technically — by assigning a tiny lens to each sub-pixel (thereby sacrificing a little resolution, but not much), Alioscopy displays can present eight camera views. A viewer standing in one of the screen's many sweet spots can pan left and right slightly and see around an object to a degree that's impossible for traditional stereoscopic display systems.
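The exact sub-pixel-to-lens mapping in Alioscopy's displays is proprietary, but the general technique of interleaving several camera views across the panel's sub-pixels can be illustrated. This toy Python sketch cycles eight views across the R, G, and B sub-pixels of each row; real lenticular layouts are typically slanted, which this deliberately ignores:

```python
def interleave(views, width, height):
    """Build one output frame by assigning each sub-pixel (the R, G, or B
    of each pixel) to one of len(views) camera views, cycling across
    the row -- a toy stand-in for a lenticular sub-pixel mapping.

    `views` is a list of frames; each frame is rows of (r, g, b) tuples.
    """
    n = len(views)
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            px = []
            for c in range(3):                 # sub-pixel column index
                v = (x * 3 + c) % n            # which camera view feeds it
                px.append(views[v][y][x][c])
            row.append(tuple(px))
        out.append(row)
    return out
```

Each tiny lens then steers its sub-pixel toward a different viewing zone, which is why a viewer who shifts position slightly sees a slightly different camera view.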
It's not just theater lobbies where Alioscopy wants to play, of course. There are also medical and aerospace firms that would benefit from a hyper-3D screen in order to view complex models. “We're in the process of moving into two research and engineering agreements with two medical firms,” says Pia Maffei, director of operations for Alioscopy.
But on the content side, who's producing eight camera views at this point? Alioscopy is playing in that space as well. The flagship product from the company is a modified NEC 40in. LCD display with a lenticular cover, but there's also a 24in. display that Alioscopy offers production companies creating autostereoscopic content. To facilitate this new production paradigm, the company also offers content creators the necessary scripts and plug-ins to achieve eight-camera-view rendering in Softimage|XSI and Autodesk 3ds Max and Maya, as well as e-training.
There's competition for Alioscopy in the autostereoscopic display space; for several years, Philips has offered glasses-free 3D flatscreens covered by lenticular sheets, but these actually display a single-camera view with Z-depth information transmitted via grayscale percentages. Of course, Philips' new 52in. LCD, the QFHD 3D display, blows its previous 42in. and 52in. 3D models' specs out of the water. A new rendering technique, combined with the quad-HD resolution of 3840×2160, enables up to 46 camera views. For its part, LG Electronics is pushing glasses-free True3D displays to the home market.
Will the home market be ready for 3D any time soon? The stereoscopic content producers I've spoken with sure hope so. A lot of pots are coming to a boil at once: The dropping prices of suitable HD cameras make lower-budget independent S3D productions possible; the preponderance of 3D-friendly post software means postproduction in stereo isn't such a black art anymore; CES 2009 saw another crop of consumer flatscreens that are 3D-ready; starting this year, every Pixar and DreamWorks animated release will get a 3D theatrical run — and those studios will undoubtedly want to repurpose the stereoscopic versions as home releases.
Early adopters might soon take the plunge — the first blockbuster videogame that's viewable in 3D is going to drive the purchases of 3D-ready television sets. (Avatar's companion videogame will be in 3D, according to director James Cameron.) The technologically adventurous might even get used to the wireless liquid crystal shutter (LCS) glasses that go with the 3D sets from Samsung and Mitsubishi.
But the big question now is how autostereoscopy will come to the home. Few observers believe that the glasses approach is viable for mainstream viewers. (Among other concerns, who's going to have a dozen sets of LCS glasses lying around for the Super Bowl party?) For now, the price premium for a lenticular overlay is simply too much for a technology with extremely little associated content. And 3D effects simply aren't as eye-popping in autostereo as they are when viewed with glasses. That's a technological problem without a clear solution.
The other big problem for home viewing in stereo is obvious. Where's the content? And where's the demand for 3D content if there are as yet no viewers? It's the classic chicken-and-egg impasse that's familiar to observers of HD's adoption process. Stereographer Vic Love says he thinks that simply having an outlet for home display of stereo 3D material will help solve the problem of nonexistent content. If cable providers build a venue, stereoscopic productions will come. “This year's going to be about content producers trying to monetize their content away from the traditional theater-distribution model,” he says. “As soon as you can monetize the content, there will be a lot more produced.” Your move, networks and cable providers.
To comment on this article, email the Digital Content Producer editorial staff at firstname.lastname@example.org.