AI, Your Eyes and 'The Angry River'
Using eye-tracking tech and machine learning, 'The Angry River' narrative changes based on where a viewer looks

What if viewers, not editors, determined the trajectory of a story? Filmmaker Armen Perian intends to find out with the release of his new short film, “The Angry River.”

Perian, along with tech partner Crossbeat New York, has created a film with the interactivity of a video game but without a game controller.

The idea came to Perian during a long editing session. “Someone said, ‘Man, I wish we could just edit this thing with our minds,’” he says. “It was totally off the cuff, but the idea stuck with me.”

His new short film, starring Jim Beaver ("Deadwood") and Brooke Smith ("Bates Motel"), employs eye-tracking technology to determine what a viewer is watching, where the viewer’s eye lingers, and then edits itself into one of five possible storylines, each matched to the viewer’s interest.

While enamored with the idea of game theory, Perian "wanted to preserve a cinematic experience, even though you’re watching it on your computer."

As with any other film, the viewer simply watches the screen, but what the viewer pays attention to drives the action, and the movie changes based on what is being viewed.

"You watch it like any other film. But watching something is still making decisions, and what the viewer decides to pay attention to ultimately drives the action. It changes the movie they see."

From Armen Perian's short film 'The Angry River'

Crossbeat New York developed the eye-gaze detection and machine learning that power the story’s algorithm. Perian directed the film, which was shot in Oregon over four days. Alex Hall edited the film into five distinct narrative tracks. A custom-built algorithm turns the five different perspectives into a story that looks and plays like a traditional movie.
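To make the idea concrete, here is a minimal, purely illustrative sketch of how gaze data could steer a branching cut. It is not Crossbeat New York's actual system; the region layout, the `gaze_samples` input, and the storyline names are all hypothetical assumptions for the example.

```python
# Illustrative only: pick the next narrative track based on where the
# viewer's gaze lingered during a scene. Assumes gaze_samples is an
# iterable of (x, y) screen coordinates from a hypothetical eye tracker.

from collections import Counter

# Hypothetical regions of interest for one scene, keyed by storyline.
# Each value is a bounding box: (x_min, y_min, x_max, y_max) in pixels.
REGIONS = {
    "storyline_a": (0, 0, 640, 720),      # e.g., the character on the left
    "storyline_b": (640, 0, 1280, 720),   # e.g., the character on the right
}

def storyline_for_scene(gaze_samples, regions=REGIONS, default="storyline_a"):
    """Tally which region the gaze dwelled on and return the matching track."""
    dwell = Counter()
    for x, y in gaze_samples:
        for name, (x0, y0, x1, y1) in regions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                dwell[name] += 1
    if not dwell:
        return default  # viewer never looked at a tracked region
    return dwell.most_common(1)[0][0]

# Example: a viewer who mostly watched the right side of the frame
print(storyline_for_scene([(700, 300), (800, 350), (100, 200)]))  # storyline_b
```

In a real system, the dwell-time tally would presumably be only one input to a learned model, but the basic idea of mapping gaze regions to narrative tracks is the same.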

"I resist the idea that this project puts me in the tech world. I'm still a filmmaker, a story person. I just wouldn't be able to tell this story without the tech."