At the 2018 Sundance Film Festival, innovation, creativity and new tech tools helped filmmakers use AI to create deeper, more meaningful human connections.
Computer-generated characters that look impossibly human, interactive exhibits that evolve from audience interaction and art that literally feeds off people's emotions are just some of the innovations that captivated visitors at the 2018 Sundance Film Festival.
Using artificial intelligence (AI) to create films was once limited to the likes of Steven Spielberg and directors of big-budget science fiction movies. But more powerful and increasingly affordable technologies are helping filmmakers use machine learning techniques to both create and inform art.
This was evident throughout Sundance, which has been breaking barriers in technology innovation for over a decade thanks in large part to Sundance Institute’s New Frontier, which supports independent artists in exploring new and emerging mediums, such as virtual reality (VR), augmented reality (AR) and AI.
“This year, New Frontier artists are opening new avenues to a deeper understanding of AI through direct narrative engagement with the technology,” said Shari Frilot, Sundance senior programmer and chief curator for New Frontier.
“AI is a ubiquitous technology that is having a significant impact on the development of the media landscape,” said Frilot.
Until recently, creating computer-generated characters required filmmakers to have big studio backing and costly post-production muscle to create anything capable of suspending an audience's disbelief. That's changing as new technology tools become more accessible.
Computer as Animator
At Sundance, visitors to the Intel Tech Lodge on Park City’s Main Street could see demos by Vancouver-based Ziva Dynamics, whose software uses machine learning algorithms to create computer-generated simulations that can transform the way special effects studios create characters.
“Our software equips people with the tools to create the most realistic representations of creatures and characters as possible,” said VFX pioneer James Jacobs, Ziva’s co-CEO who won a SciTech Academy Award in 2013 for the engineering that powered characters in Avatar, The Hobbit and Planet of the Apes, among others.
Ziva's software uses offline finite element simulations combined with geometric warping and machine learning to create real-time characters that can progressively learn, according to Jacobs. Instead of learning from time-consuming and expensive scans, the software can synthesize data from the simulation itself in minutes.
Ziva’s software is capable of mapping the natural body features of a real person — or any living, breathing animal — by creating computational models of anatomy and soft tissue like skin elasticity, fibrous muscle and gelatinous layers of fat.
All of those details are fed into machine learning algorithms that learn the body’s movements.
“Imagine a person doing every conceivable thing, and then when you have all this data, you can compress it down into something you can interact with,” said Jacobs. “It looks just as real as a thing that took hours to produce.”
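The workflow Jacobs describes (run many expensive offline simulations, then compress the results into something fast enough to interact with) can be sketched as a simple regression problem. Everything below is illustrative: the "simulator" is a toy function standing in for a finite element solve, the pose and vertex dimensions are invented, and Ziva's actual methods are proprietary.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: a 12-number pose vector drives a 200-vertex mesh.
N_POSES, POSE_DIM, N_VERTS = 500, 12, 200
W_true = rng.normal(size=(POSE_DIM, N_VERTS * 3))

def offline_simulation(pose):
    # Stand-in for an expensive offline soft-tissue solve that would normally
    # take minutes; here it's a cheap function so the example runs instantly.
    return np.tanh(pose) @ W_true

# 1) Run the simulator over many poses to build a training set.
poses = rng.normal(size=(N_POSES, POSE_DIM))
deforms = np.array([offline_simulation(p) for p in poses])

# 2) "Compress" the data: fit a fast ridge-regression model that
#    approximates the simulator's output from the pose alone.
X = np.tanh(poses)  # same feature map as the toy simulator, so the fit is tight
lam = 1e-6
W_fast = np.linalg.solve(X.T @ X + lam * np.eye(POSE_DIM), X.T @ deforms)

# 3) At runtime, a new pose maps to per-vertex deformations almost instantly.
new_pose = rng.normal(size=POSE_DIM)
pred = np.tanh(new_pose) @ W_fast
truth = offline_simulation(new_pose)
print(np.abs(pred - truth).max())  # approximation error stays small
```

The design point is the split: all the expensive physics happens offline to generate data, and only the cheap learned approximation runs while an artist interacts with the character.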
Also at the Tech Lodge, visitors could see ProtoMax, a template character that any filmmaker or special-effects artist can customize to quickly create and replicate animated, engaging characters.
“Max is a representative human,” said Jacobs. “A lot of studios are using him right now as a base upon which to create different characters.”
While Ziva’s technology has been used in major productions such as Ghostbusters, Suicide Squad, Fantastic Beasts and Pacific Rim, Jacobs said the technology is now easier and more cost-effective, so more creators can use it.
He said it just requires software and hardware robust enough to keep up.
“We use Intel on all levels,” said Jacobs. “With our software written against a number of Intel frameworks, people using Ziva generate their results by running a number of computationally expensive offline simulations. With our Intel Xeon Scalable server running 8164 processors, we are able to run multiple simulations in parallel. When we used it to build this demo for Sundance, it gave us an over 2,000 percent speed and capacity increase.”
Ziva’s website offers tutorials and a community forum to help creators collaborate and keep pushing the capabilities of Ziva’s tools.
“Ziva is leveling the playing field for independent filmmakers or even aspirational filmmakers to create content in a way that would never have been imaginable before without millions of dollars in pre- and post-production,” said Rick Hack, who manages media and entertainment partnerships at Intel.
Intel will continue to innovate the tools and tech as fast and precisely as possible, said Alyson Griffin, vice president of global marketing at Intel.
“Our job is to help get the amazing experience imagined by the creator out for the world to enjoy,” said Griffin.
Audience as Creator
Movie experiences are evolving from passive to active endeavors in which audiences become part of the story, even changing its outcome, according to Sundance's Frilot.
“The New Frontier art works, Frankenstein AI: A Monster Made by Many and TendAR, aim to challenge popular assumptions, fears and hopes around AI, through artfully engaging audience participation with it,” said Frilot.
The Frankenstein AI experience celebrates the 200th anniversary of Mary Shelley’s seminal work by examining the cultural and ethical ramifications of bringing something artificial to life — literally creating a “monster.”
The project comes from the Columbia University School of the Arts Digital Storytelling Lab, led by founding director Lance Weiler, a digital pioneer who Businessweek once called “one of the 18 who changed Hollywood” (sharing company with George Lucas and Steve Jobs).
“Sundance is a really amazing opportunity to be able to show the new grammar that’s emerging for storytelling, and what does it look like where machine intelligence and human creativity intersect,” said Weiler. “This project is right at the cusp.”
The Frankenstein AI experience lasts around 45 minutes. Eight audience members are taken into a dark room and paired up with a stranger. The duo swaps stories based on scripted questions — such as a time they felt really seen, or a time they felt isolated.
The questions were painstakingly designed to create instant intimacy and connection, according to Rachel Ginsberg, creative strategist and experience designer on the project.
“This is about human connection first, and that we’re exploring artificial intelligence through a human lens, as opposed to leaning into the really dystopian narrative that has dominated that universe for such a long time,” said Ginsberg.
The project’s AI begins to learn about human emotion based on the participant’s emotional responses to provocative questions like, “Why do people kill other people?” and “What is something you’ve overcome?”
“We’re trying to shift away from those formerly known as the audience,” said Weiler. “When you come into this piece it is a monster made by many, so it’s literally at the edge of what’s happening in terms of the culture.”
In another New Frontier installation at Sundance, TendAR is an AI-driven virtual pet that evolves by feeding off people’s emotional responses.
The installation tethers two people together via a shared earpiece. Holding a single phone between them, the pair interacts through a mobile app, answering several questions; based on the answers, they are assigned a distinct animated fish “friend,” which learns more about the pair and their surroundings by asking questions.
“By going through and engaging with this AI fish, you are training his algorithm and training his software to understand emotion,” explained Samantha Gorman, co-founder of the new media design studio Tender Claws, which designed the app.
As the guppy friend gulps up responses via facial and object-recognition software, it learns how to better read people’s emotions. One of the implications, said Gorman, is that brands could learn about their audience and tailor their products based on the biometric feedback.
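At a high level, learning to "read" emotion from recognized facial features can be framed as classification. The sketch below is purely illustrative: the feature names, training values and labels are invented, and a production system like TendAR's would derive features from real camera-based facial recognition rather than hand-typed numbers.

```python
import numpy as np

# Toy "facial feature" vectors: [mouth_corner_lift, brow_raise, eye_openness].
# All values and labels here are invented for illustration only.
TRAIN = {
    "happy":     np.array([[0.8, 0.2, 0.6], [0.9, 0.3, 0.7]]),
    "surprised": np.array([[0.4, 0.9, 0.9], [0.3, 0.8, 1.0]]),
    "neutral":   np.array([[0.1, 0.1, 0.5], [0.0, 0.2, 0.4]]),
}

# Nearest-centroid classifier: each emotion is summarized by the mean
# of its training vectors.
CENTROIDS = {label: vecs.mean(axis=0) for label, vecs in TRAIN.items()}

def read_emotion(features):
    """Return the emotion whose centroid is closest to the feature vector."""
    return min(CENTROIDS, key=lambda lbl: np.linalg.norm(features - CENTROIDS[lbl]))

print(read_emotion(np.array([0.85, 0.25, 0.65])))  # -> happy
```

Each new labeled response a participant provides could be folded back into `TRAIN`, which is the sense in which interacting with the fish "trains" it, though the real system's learning machinery is not public.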
“This idea that storytelling can have a physicality to it, can be customized or personalized in interesting ways is like a whole new frontier in terms of the work that can be done,” said Weiler.
“Cinema has been a certain way for over 100 years, but now we have these new palettes and incredible new tools to play with.”