The Visual Effects Industry Needs a New Production Model

The use of digital technology to produce visual effects for feature films is at an all-time high. Today, virtually every Hollywood movie employs digital effects in one form or another, while certain blockbuster titles feature hundreds of incredibly complex digital scenes. Whereas Jurassic Park involved digital effects encompassing several gigabytes of data (an unprecedented number for 1993), Independence Day, Starship Troopers, and Titanic featured data sets measured in terabytes, a staggering increase in just a few years.

This surge in demand has spurred significant growth in the visual effects industry and has created high-paying jobs for hundreds of digital artists and other technical specialists. Today, a dozen or more suppliers take on the effects load while, just a few years ago, single companies usually provided most or all of the visual effects for a film.

As fast as Hollywood’s need for visual effects services has risen, the cost of the hardware used to produce effects has dropped. Visual effects houses that paid $80,000 for a computer workstation three or four years ago can get more speed and power for far less money. A visual effect that might have required a Cray a decade ago can now be executed on a desktop workstation.

Software prices reflect the same trend. Companies that 10 years ago employed teams of programmers to write effects software from scratch can today license off-the-shelf software with more features and better reliability for just a few thousand dollars.

While these trends are well known, what does surprise some people is the diminishing return for effects houses. With mushrooming demand and plummeting costs, one might reasonably expect to find visual effects houses carting money to the bank in bushel baskets. But they are not. If you could get a peek at their books, you would find that many visual effects suppliers are making scant profits or are operating in the red. For many it is a struggle to survive.

Not too long ago, the failure of two high-profile companies, Boss Films and Warner Digital, rocked the industry. One popular hypothesis for their demise cited the companies’ huge staffs and high salaries for digital artists. These enormous overhead costs became unbearable during down periods between big projects. Another theory was that low-ball bidding by competitors was at the root of the financial problems. There is some validity to these claims, but they tell only part of the story. High overheads and low bids do exist, but they are not the cause of the industry’s woes. They are symptoms of a deeper and more fundamental problem that lies at the heart of the way production houses go about producing visual effects.

The true source of the problem starts to become clear when you examine the per-frame cost of producing visual effects. Because computers have been getting cheaper and faster, costs per element have been going down. However, complexity is rapidly increasing. In the time that machine speed has doubled, data sets have increased tenfold. Producers may be running faster, but they are falling further behind. This is a scalability problem. Although the cost of individual workstations has fallen, the total amount companies spend on hardware, software, and salaries continues to rise. Because per-frame complexity is not dropping, producers have to buy more workstations, license more software, and hire more artists just to finish existing projects. Faced with this spiraling overhead, production companies find themselves in a position where they cannot afford to be without work. This inevitably leads to underbidding, a problem the studios exacerbate by pressuring suppliers to cut costs. It is a vicious cycle, and it will continue to get worse so long as the gap between production demand and production efficiency continues to widen.
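
The arithmetic behind this gap is easy to sketch. Taking the growth rates above at face value (machine speed doubling per production cycle while data sets grow tenfold), and starting from an arbitrary baseline, a few lines of C show how quickly the relative time to finish a frame balloons. The starting values and the number of cycles are illustrative, not industry figures.

    #include <stdio.h>

    int main(void)
    {
        double speed = 1.0;   /* relative workstation throughput */
        double data  = 1.0;   /* relative per-frame data volume  */

        for (int cycle = 0; cycle <= 4; cycle++) {
            /* time to finish a frame scales roughly with data / speed */
            printf("cycle %d: speed x%.0f, data x%.0f, relative frame time x%.1f\n",
                   cycle, speed, data, data / speed);
            speed *= 2.0;     /* machine speed doubles per cycle (from the text)  */
            data  *= 10.0;    /* data sets grow tenfold per cycle (from the text) */
        }
        return 0;
    }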

This problem affects large companies more than small companies because large effects producers do not enjoy significant economies of scale. In other industries, large companies produce goods more efficiently than their smaller competitors. General Motors can build a car more cheaply than a single artisan working in his garage can. That is why GM can support the cost of huge factories. But in the visual effects industry, large operations are not necessarily more efficient than small operations; they are only bigger. Why? Because large and small effects houses follow the same production model, one that is based on the individual workstation.

When effects houses grow by adding more workstations, those workstations continue to operate, more or less, like self-contained production units. It’s as if General Motors hired 10,000 workers to each build cars by hand. As a result, big visual effects companies acquire large-company drawbacks in the form of high overhead without gaining much advantage in efficiency. This explains why Boss Films and Warner Digital, two relatively large companies, were among the first to go.

Why, after 10 years, are these disadvantages of scale now becoming an insurmountable problem? Because production companies have come to depend on computer prices falling at least $20,000 with every production cycle, a phenomenon that is clearly not sustainable. Machine speed increases do not equal cost reductions. The declining cost of computers may reflect the same percentage decreases as before, but a $2,000 savings does not impact the bottom line the way a $20,000 to $40,000 cost reduction used to. Even if machine speed could rise ad infinitum, future increases in individual workstation speed will not produce the same benefit.

If data sets continue to grow at the present rate, and there is no reason to doubt they will, then the yawning gap between scalability and production demand will grow wider. Focusing on the speed of individual workstations also misses the point. The inefficiency inherent in today’s production model is not primarily due to the slowness of individual computers; it is the result of slowness between applications. The real issue is not machine speed or application speed, it is machine-to-machine latency and application-to-application latency.

1989
* Approximately 600 animators in the industry make a full-time living in digital film effects.
* The industry spends less than $100 million for the development of effects.
* The average number of shots is less than 30.
* Eighty percent of content creation software is proprietary.
* Ninety percent of computing is done on SGI graphic workstations.

In the current production model, most components exist as small computer islands, networked to one another in a uniform fashion. In this model, the only differentiating factor is the number of islands. The system as a whole is no more efficient than the sum of all the parts. No driving force exists to create a large-scale facility. In order to create a cost-effective solution that scales in efficiency, it is necessary to re-evaluate the system as a whole. The core of the problem lies not in the number and speed of processors, but in how the processors interact with one another without a common memory space. A cost-effective solution demands a hierarchical approach and much finer granularity in the delineation of data across the system as a whole.

Production also bogs down because houses cannot arrange work into a chain of individual tasks: modeling, followed by animation, then texturing, then compositing, and so on. Production schedules are too short for effects companies to work this way; all of these tasks must be ongoing more or less simultaneously. Problems then arise when someone involved in one of these tasks makes a change, for that change affects everyone else in the chain. Distributing updates to 100 separate workstations takes an enormous amount of time and leads to further bottlenecks.

Hardware and software manufacturers have given little attention to the problem of system-wide latency. They are concerned with making incremental improvements to individual pieces of hardware and individual programs. Chip manufacturers focus on making faster chips, drive manufacturers on making faster drives. Software developers want to make their software run faster and provide more features. Everyone is busy making their piece of the system work better, but no one, it seems, focuses on making the production model work better as a whole.

We need a new production model that will enable large visual effects operations to take advantage of their size, work more efficiently, and dramatically reduce the per-frame cost of producing visual effects. Kodak’s Cinesite offers one of the only recent attempts to make a fundamental change in the visual effects production model. Cinesite did not try to improve efficiency by lowering equipment costs; it used more expensive computers to significantly reduce the per-frame cost of producing visual effects. The concept of reducing the overall cost of producing visual effects by using more expensive equipment was sound and necessary, although it came with too much baggage.

1991
* Approximately 2,000 animators in the industry make a full-time living in digital film effects.
* The industry spends less than $200 million for the development of effects.
* The average number of shots is less than 75.
* Fifty percent of content creation software is proprietary.
* Ninety percent of computing is done on SGI graphic workstations.

The new production model must enable large visual effects companies to enjoy economies commensurate with their size. A dramatic improvement in production efficiency would enable them to operate at a profit. Ideally, the new model would organize workstations not as a network of individual boutiques, but as components of a unified production machine. In order to be cost effective, the extensible units must be relatively inexpensive (perhaps NT). Because of cost constraints, the extensible pieces would probably attach to the whole system through relatively slow links. Components with higher data demand can integrate through faster (more costly) links with a core computer cluster built from more expensive, more efficient components.
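
As a loose illustration of this tiered layout, the sketch below reduces the placement decision to matching each station’s data demand to the cheapest link that can carry it. The station names, bandwidth figures, and thresholds are invented for the example, not a hardware recommendation.

    #include <stdio.h>

    enum tier { CORE_CLUSTER, FAST_LINK, SLOW_LINK };

    struct node {
        const char *name;
        double demand_mb_per_s;            /* sustained data the station must move */
    };

    /* Pick the cheapest link tier that can still carry the station's demand. */
    static enum tier place_node(const struct node *n)
    {
        if (n->demand_mb_per_s > 80.0) return CORE_CLUSTER;  /* shared memory / local bus */
        if (n->demand_mb_per_s > 10.0) return FAST_LINK;     /* costlier switched link    */
        return SLOW_LINK;                                    /* inexpensive desktop link  */
    }

    int main(void)
    {
        static const char *tier_name[] = { "core cluster", "fast link", "slow link" };
        struct node nodes[] = {
            { "compositing station", 120.0 },
            { "animation station",    25.0 },
            { "texture paint desk",    4.0 },
        };

        for (int i = 0; i < 3; i++)
            printf("%-20s -> %s\n", nodes[i].name, tier_name[place_node(&nodes[i])]);
        return 0;
    }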

Most production companies believe they are already doing this now. However, for the outermost components to be useful in this modified star topology, the system as a whole must be aware of the latencies inherent in all the levels of memory into which each component can see. In addition, the peripheral devices must retrieve only subsets of data. To use the slower links effectively, data should migrate from the central location to the individual stations. In this model, the longer operators work, the closer the data resides and the faster they can work, without adversely affecting the rest of the system.
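
A toy simulation of that migration behaviour might look like the following sketch, in which repeated access to the same pages turns slow-link fetches into local hits. The cache size, page identifiers, and round-robin eviction policy are purely illustrative.

    #include <stdbool.h>
    #include <stdio.h>

    #define LOCAL_SLOTS 4

    static int local_cache[LOCAL_SLOTS];  /* page ids currently held at the station */
    static int cached;                    /* how many slots are filled              */
    static int next_slot;                 /* round-robin eviction cursor            */

    static bool held_locally(int page)
    {
        for (int i = 0; i < cached; i++)
            if (local_cache[i] == page) return true;
        return false;
    }

    /* Touch a page: a miss "migrates" it over the slow link and caches it here;
     * a hit is served at local memory speed. */
    static void touch(int page)
    {
        if (held_locally(page)) {
            printf("page %d: local hit\n", page);
            return;
        }
        printf("page %d: fetched over slow link, now cached locally\n", page);
        local_cache[next_slot] = page;
        next_slot = (next_slot + 1) % LOCAL_SLOTS;
        if (cached < LOCAL_SLOTS) cached++;
    }

    int main(void)
    {
        int work[] = { 1, 2, 1, 3, 1, 2 };  /* an artist revisiting the same shots */
        for (int i = 0; i < 6; i++)
            touch(work[i]);
        return 0;
    }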

1993
* Approximately 4,500 animators in the industry make a full-time living in digital film effects.
* The industry spends less than $600 million for the development of effects.
* The average number of shots is less than 150.
* Thirty percent of content creation software is proprietary.
* Ninety percent of computing is done on SGI graphic workstations.

This proposed workflow demands a finer breakdown of available memory. The jump between local and remote memory is too great and requires the movement of entire contiguous data pieces before any work can begin. Some intelligent, non-standard changes in data sharing at the memory and file system levels would produce large strides in increasing the overall efficiency of visual effects production.

Such changes in data sharing would require multiple delineations between local memory and remote memory, with all gradients capable of sharing at the page level (as opposed to the whole-file level). The outermost machines need to become page-granular memory caches for the inner layers of the system. Of course, each distinct machine would maintain some amount of private memory for local non-shared processes and the operating system kernel. Existing directory-based cache coherency is capable of addressing the shared-information problems inherent in such a distributed memory model.
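
To make the directory idea concrete, the sketch below shows the kind of per-page bookkeeping such a scheme keeps: which machines hold copies, which one owns the writable copy, and the page’s sharing state. The state names, the 64-machine limit implied by the bitmask, and the helper functions are illustrative assumptions; in a real system this bookkeeping would live inside the operating system, not in user code.

    #include <stdint.h>
    #include <stdio.h>

    enum page_state { UNCACHED, SHARED, EXCLUSIVE };

    struct dir_entry {
        enum page_state state;
        uint64_t sharers;      /* bit i set => machine i holds a copy of the page */
        int owner;             /* machine holding the only writable copy, or -1   */
    };

    /* Record that a machine has fetched a read-only copy of the page. */
    static void note_read(struct dir_entry *e, int machine)
    {
        e->sharers |= (uint64_t)1 << machine;
        if (e->state == UNCACHED)
            e->state = SHARED;
    }

    /* Grant a machine the only writable copy; in a real system the other
     * sharers would be sent invalidations at this point. */
    static void note_write(struct dir_entry *e, int machine)
    {
        e->sharers = (uint64_t)1 << machine;
        e->owner   = machine;
        e->state   = EXCLUSIVE;
    }

    int main(void)
    {
        struct dir_entry page = { UNCACHED, 0, -1 };

        note_read(&page, 3);   /* workstation 3 reads a scene page   */
        note_read(&page, 7);   /* workstation 7 reads the same page  */
        note_write(&page, 7);  /* workstation 7 edits it             */

        printf("state=%d owner=%d sharers=0x%llx\n",
               page.state, page.owner, (unsigned long long)page.sharers);
        return 0;
    }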

We must address similar issues at the file system level. Fortunately, modern technology allows file system sharing to occur independent of a server. Distributed file systems allow data sharing in much the same way as distributed memory. Furthermore, a page-granular file system cache allows discrete data to map directly to local memory and removes the need to copy the data within memory. This, coupled with some form of shared memory between workstations, would allow for the creation of a true virtual machine that we could apply to the production process as one cohesive system.
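
On a single machine, the standard POSIX mmap() call already demonstrates the direct-mapping half of this idea. The sketch below (the scene file name is hypothetical) maps a file into memory so that pages are faulted in on demand rather than copied through an intermediate read buffer; extending that page-granular view across workstations is the harder, system-level half.

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    int main(void)
    {
        const char *path = "scene.dat";     /* hypothetical scene file */
        int fd = open(path, O_RDONLY);
        if (fd < 0) { perror("open"); return 1; }

        struct stat st;
        if (fstat(fd, &st) < 0) { perror("fstat"); return 1; }

        /* Map the whole file; pages are faulted in on demand, and nothing is
         * copied through an intermediate read() buffer. */
        const unsigned char *data =
            mmap(NULL, (size_t)st.st_size, PROT_READ, MAP_SHARED, fd, 0);
        if (data == MAP_FAILED) { perror("mmap"); return 1; }

        printf("mapped %lld bytes, first byte = 0x%02x\n",
               (long long)st.st_size, data[0]);

        munmap((void *)data, (size_t)st.st_size);
        close(fd);
        return 0;
    }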

This would solve the latency problem, as each workstation would have direct and immediate access to the same data. No more downloading scenes from a server. As artists work, the data migrates throughout the facility. There would no longer be a need to provide updates to everyone in the production chain, as changes would be immediately available on a system-wide basis. Multiple workstations could also work on the same scene simultaneously. A much finer breakdown in the representation of the data in memory, removal of the file system server (and all the overhead associated with sharing at that level), and direct mapping of file system cache data to shared workstation memory could accomplish this memory migration.

Some will argue that visual effects operations cannot be scaled because creating visual effects is an “art,” but that is simply not true. As things stand today, visual effects companies have little time for artistic experimentation; they are too busy cranking out effects. Only when their workload is made manageable again will visual effects houses enjoy the luxury of time needed to experiment for art’s sake. Only if the effects business is made economically sustainable will artists get the chance to create the kind of breakthroughs that inspired them in the first place.

Star Wars and Terminator II changed the face of movie making and fueled a decade or more of intense invention, excitement, profits, and growth. Artistic and technological investment at the production company level created the innovations of these films. Today, neither production companies nor software developers can develop a new production model on their own. The problems that must be solved involve how networked computers share memory and access data, and these issues are embedded in the operating system. Only a manufacturer has the ability and resources to make fundamental changes to the operating system architecture.

When the digital entertainment industry needed to complete tens of shots, it required the utility of a single integrated compute environment: a workstation. Silicon Graphics stepped up to the plate to provide the solution. As shots grew in both complexity and number, the industry could always count on SGI to deliver the necessary breakthrough technologies.

Most often, these breakthroughs grew outside the world of standards, only to become standards themselves. Hardware advances such as processor caches, symmetric multiprocessing, and non-sequential instruction execution, coupled with operating system advances such as a full suite of TCP/IP services, Sun RPC services, XFS, 64-bit address space, OpenGL, and runtime linking, have increased the efficiency of the individual machine.

1995
* Approximately 7,500 animators in the industry make a full-time living in digital film effects.
* The industry spends less than $1 billion for the development of effects.
* The average number of shots is less than 225.
* Ten percent of content creation software is proprietary.
* Eighty percent of computing is done on SGI graphic workstations.

Unfortunately, though these advances have made interoperability possible, little has been done to address multi-workstation, system-wide efficiency. Today all the interoperability software is buried under multiple levels of standardization; it is impossible to extract any appreciable amount of efficiency. Non-standard innovation is again probably the only way to solve the problem at hand. The next logical step is for someone to step back and address the industry’s need for a single integrated compute environment, no longer at the individual machine level, but at the systems level. It is not at all clear that there is a manufacturer ready and able to take on this challenge, since the effort to develop a new production model could cost tens of millions of dollars. Silicon Graphics, the most likely candidate, currently lacks the resources to undertake such an ambitious project and, caught up in trying to broaden its market, seems also to lack the will. Microsoft has no interest in developing a niche market in the film industry. IBM, Sun, and Apple, too, appear unlikely to step forward with a solution.

If no manufacturer accepts responsibility for addressing this problem, the consequences for the film industry could be severe. Large effects operations will continue to find it impossible to operate profitably, and that could lead to a shakeout similar to that of the 1980s, when several leading CGI companies (Robert Abel & Associates, Digital Productions, Cranston/Csuri, and Omnibus) went bankrupt, leaving only one principal supplier, PDI, standing. If the flaws in today’s production model are not solved soon, many more companies could go bankrupt. As production companies close, hardware and software vendors will be forced to move to greener pastures, leaving a much smaller visual effects industry to face a slow and painful recovery.

In the field of visual effects, some of the most difficult times have come on the heels of significant films that have taken the technical processes of their day to a new level. Usually the result of one individual’s vision against incredible odds, such films take their place as crowning achievements in the art of visual effects. Those knowledgeable in the art marvel that films such as 2001 and Blade Runner could get made at any price. But all too often, these achievements are also box office failures, and it is only years later that they clearly stand alone as a measure of a particular era in visual effects. It is a tribute to Jim Cameron that Titanic is both a commercial success and a watershed visual effects statement. But, like 2001, it too marks the end of an era in visual effects, not the launch of a new model.

1997
* Approximately 10,000 animators in the industry make a full-time living in digital film effects.
* The industry spends less than $2 billion for the development of effects.
* The average number of shots is less than 300.
* Ten percent of content creation software is proprietary.
* Seventy-five percent of computing is done on SGI graphic workstations.

There is no surpassing such motion pictures without significant changes and advances to the art and technology of visual effects. The manufacturers who could provide the next-generation solutions have turned their backs on production, and it seems unlikely that the movie industry can generate the needed innovations alone. It was a long nine years between 2001 and Star Wars, and once again the barbarians are at the gate. The Renaissance is over and a new Dark Age has begun.

Ray Feeney, the founder of Silicon Grail and RFX, developed many visual effects technologies that are now industry standards. Formerly of Robert Abel & Associates, Feeney has earned Academy Awards for his work on motion-control camera systems, the Solitaire film recorder, film scanning technology, and software used in blue screen matte extraction.

Kevin Mullican is a co-founder of Look!, a provider of visual effects and digital production services for motion pictures, television, and commercials. Previously the head of research and development at RFX, Mullican has provided consulting services on digital technologies to effects houses including DreamQuest Images, Rhythm & Hues, and VIFX.

Chuck Spaulding, Silicon Grail’s director of marketing and business development, worked as a documentary filmmaker and a television editor before becoming involved in the visual effects industry. In 1992, he co-founded the computer animation company Procrastination Animation. Two years later, he joined RFX and has been working with its sister company, Silicon Grail, since its founding in 1995.
