Is Realtime Real? Part 2

Web-Expanded Version

Creative fans of such games as Grand Theft Auto III from Rockstar Games are using game engines to previz and even make animated movies.

Early Realtime Previz: Virtus WalkThrough

A movie and a game based on the same title are like conjoined twins. They may share some vital organs, but they're still two distinct personalities. Copyrights and demographics drive the connection between these media, but on the development and production side the work is done separately. Although all this keeps lots of people employed, it looks very inefficient to a CEO.

For more years than I can remember, production software and hardware companies have been touting the workflow concept of games and movies being derived from the same assets — literally the same models and scene files. This is engineer speak, and there are so many reasons why this will not happen quickly that it's not worth the space to go into them.

As suggested in Part 1 of this article (February 2005), there is a very real convergence between movies and games. But it's not the assets that are coming together; it's the rendering technology. Anyone who has seen a preview of S.T.A.L.K.E.R.: Shadow of Chernobyl knows that game images are closing in on broadcast-level visual effects.

Previz in Realtime

We asked Nvidia, a company in a unique position to evaluate the quality of realtime graphics, to compare realtime and full-on renders of a selected scene. Nvidia is the only graphics hardware developer to be simultaneously developing a realtime shader product (Cg) and a production rendering system (Gelato). The images on p. 54 give a detailed look at just what a realtime render solution is capable of doing. No one is suggesting that realtime is going to replace ILM's render farm any time soon, but the wall dividing these two dimensions is becoming unstable (did I actually say that?).

Two images rendered on the Nvidia Quadro FX GPU: the first using the Nvidia Gelato rendering system (left), the second a realtime render using the Cg graphics programming language (right). Note that Gelato is capable of higher-resolution rendering and far more detailed texture maps than realtime rendering.

Several years ago, Spielberg used the video game Unreal Tournament for on-set previews of A.I.'s Rouge City. The scene was shot in front of a 160'×60' bluescreen, making it difficult for the director to visualize camera placement. Because the Unreal game engine allows users to create mods (customized versions of the game), ILM artists were able to construct a game version of Rouge City with realtime performance. This gave Spielberg instant previews of camera angles and camera moves.


More recently, Peter Jackson investigated using flight simulation software to choreograph the demise of King Kong on the Empire State Building for his upcoming remake. This is next-generation previz because it's not just the director moving models in realtime. Jackson is going one better and having each plane piloted by a gamer who will then attack virtual Kong. In a sense, this is previz inhabited by the art department. Has a faster way to generate action choreography ever been devised?

Previz has to be the most obvious use of realtime game engines, but the technology flows both ways: Gamers who have used their joysticks to control virtual shootouts, car chases, and tank battles have realized that they're the head of production at the cheapest studio imaginable. In the late '90s, a few enterprising hackers began experimenting with using game engines to make movies. That experimentation has since become a full-fledged movement called Machinima, which made it to Sundance this year (in the form of a panel discussion).

Game-inspired imagination sparked the Machinima movement in the late 1990s, when a few enterprising hackers started experimenting with using game engines to make movies. Film pictured: Thin Ice by Mike Berry.

Although game engines make some of the moviemaking process near-instantaneous, the reality of virtual moviemaking turns out to be more complex, because game engines lack accessible interfaces like those in 3ds Max or Maya. In fact, the sets and character models in most games are created in traditional animation software, with all the prodigious labor and skill that task entails. Making virtual worlds requires considerable upfront design and programming skill, a fact easily overshadowed by the improvisational possibilities that spring to mind when your vintage Mustang Fastback careens down the streets of virtual Los Angeles in Grand Theft Auto III. In the early days of Machinima, the pioneers had to trick game engines into doubling as production tools. Lately, game engine developers have added features to help Machinimists create and output movies.

Epic Games, the North Carolina gaming company that created the Unreal Tournament franchise (among others), has built a collection of camera direction tools into its Unreal engine expressly for the Machinima community. The toolset is called Matinee and is just one part of Epic's mission to open its software to allow players to customize their games and record their own stories. This parallel movement of player/artists is yet another subcurrent in the confluence of movie and game entertainment.

Rendezvous by Nanoflix

Although Machinima is gaining momentum, its future is unclear. Yes, inheriting game assets is a big boost toward getting your digital movie made. However, the truth is that achieving specific results in a game world can be nearly as time-consuming as creating them with traditional animation. Realtime performance is also available in 3ds Max, Maya, and LightWave; in fact, any animation application that supports hardware rendering has some realtime capability. And once the game footage is captured, you still have to take the raw material through some kind of postproduction process for editing, music, and sound. At this point, Machinima is anything but realtime.

Then there is the issue of copyrights. A Machinima movie made using assets from a commercial game cannot be sold without the permission of the game developer. Although there is no way to predict what might happen if a Machinima feature based on a game became an underground hit, game publishers seem to be more enlightened about the benefits of viral marketing than the movie studios. Given that cinema-friendly tools are being built into game engines, it's fair to say that the entire Machinima movement has the approval and encouragement of game developers.

The Strangerhood by Rooster Teeth Productions

Machinima has produced a few technically ambitious indie projects, but the first real attempt by professional filmmakers to use the technology in actual production has been by Scotland-based Strange Company. Shrewdly sidestepping photoreal rendering, its first production, Rogue Farm, is a pilot for an episodic animated series based on a story by sci-fi author Charles Stross.

The art direction of Rogue Farm resembles sumi brush painting, with a limited palette rendered using a toon shader, and thus avoids having to achieve realistic rendering. This makes sense because the current generation of cinema tools in game engines is last year's technology, without the latest high-level shading. In fact, all Machinima and previz to date has used older gaming technology. This raises the question of how Machinima is defined. Must a game engine be used, or is the absence of traditional keyframed animation the sole criterion? Or is Machinima only about realtime rendering?

Hardly Workin' by the Ill Clan

At NAB 2004, Kelseus showed Antics, its realtime, nonlinear animation system: essentially a game-style engine with behaviorally driven character performance. Antics was designed as a previz and prototyping tool and as a content-creation environment. Character motion is based on imported mocap data or keyframed actions, and Antics ships with a fairly extensive library of motions. The program is particularly adept at modifying existing walk paths and simple actions; dramatic and comedic performance, however, is harder to create and control. This points up one of the limitations of Machinima.

Machinima is the spawn of opportunism. Play a game like Digital Illusions' Battlefield 1942, and you begin to imagine your own Saving Private Ryan. Games are all action, so minimal performance is required. This is the first impediment Machinimists run into when they try to adapt game action to a narrative that isn't, well, game action. The result is stilted performances. Although this limitation is used to comic effect in the Machinima series Red vs. Blue, the gag wears thin quickly. Game-style, scripted, realtime character motion has not achieved the level of sophistication of even a low-budget animated series on television — a problem for narrative filmmakers looking to hook into Machinima.

Like any grassroots creative movement outside the mainstream, the essential appeal of the work is its potential subversiveness and creative freedom. There is no financing and, hence, no rules, which is good news for anyone interested in breaking new ground. Hopefully that means Machinima will be a lot more than just a faster way to create game cinematics.

A mass movement of gaming creators is probably several years away, but there is already a trade organization: the Academy of Machinima Arts & Sciences. The organization's website is a good first stop online for anyone interested in trying realtime moviemaking. Science is driving Machinima, and doing so very quickly. We'll have to wait to see where this evolves as the latest game engines bleed into the hands of filmmakers. It may take some time, but a lot of people believe that realtime cinema is a lot more than a game.

One of the first (if not the first) 3D interactive video games, The Colony, was written by David Smith in 1986 and was available on the Mac the following year. With a POV-style interface that operated in realtime without special hardware, The Colony caught the eye of Mike Backes (an all-around digital guru whose business card read "futurist"). He brought the game to Jim Cameron, who was at work planning The Abyss. The opportunity, Backes pointed out, was that The Colony and The Abyss shared similar environments: long corridors connecting small rooms. Cameron hired Smith to create a virtual set and virtual camera system to help previz the film, although that system later turned out to be more of a science experiment than a practical tool. This was more than a decade before Spielberg's use of similar technology for A.I.

Smith went on to develop Virtus WalkThrough, an architectural previz tool that debuted in 1990. I became an early adopter of the program and taught previz workshops at the AFI using Virtus and other Mac-based graphics tools the following year.

Mindscape was the publisher of The Colony. You can download a free copy of the game at Underdogs (however, the site fails to credit David Smith as the developer).

(Recommendations by Hugh Hancock, artistic director, Strange Company.)

Anna by Fountainhead Entertainment

A great story about a flower's life.

Ozymandias by Strange Company

Very popular adaptation of the Romantic poem.

Rendezvous by Nanoflix

Two space probes in love. Wonderful dialogue.

Hardly Workin' by the Ill Clan

Possibly the most successful Machinima film ever.

Red vs. Blue by Red vs Blue

OK, it's game satire, but it's very, very funny.

Also check out the DVDs and the Machinima Film Festival.