While it may look like it in movies, the multiverse doesn’t create itself. It takes a team of artists, writers, visual effects pros, and in the case of Everything Everywhere All at Once, AWS customer Runway. Based in New York, Runway builds generative AI tools that run in the browser to generate, edit, remix, and reimagine media. After the Oscars hubbub waned a bit, we talked about the future of movies and design with Runway co-founder and CEO Cristóbal Valenzuela.

Tell us about the origins of Runway and how your background in researching and teaching design led to its founding. How is Runway an extension of that thinking and exploration?

My two co-founders and I met in art school at New York University. We were really interested in how computational creativity and neural techniques, an approach in machine learning that mimics in part how we learn, could serve filmmakers, artists, and creatives. So we started building tools and conducting research that let storytellers interact with this emerging technology. Today, our research has continued to advance, and we offer over 30 AI Magic Tools across video, image, 3D, and text, all with the goal of democratizing content creation and making creative tools more accessible.

What does Runway do and offer?

Runway is a full-stack applied AI research company. We invent and build AI models for content creation, and we develop tools that let our users create and edit content across every aspect of the creative process, from preproduction to postproduction. We're allowing anyone, regardless of skill level, to create professional-grade content.

For those of us without a background in machine learning, can you give us some of the fundamentals of generative AI and how it gets applied in your world?

At a general level, generative AI and AI systems are complex mathematical algorithms used to understand data or generate new data points. But for our users, they are simple tools that make videomaking workflows effortless. That's a huge part of why tools like ours have taken off: they fit right into an existing workflow with a familiar look and feel. Of course, underneath those interfaces, there are incredibly complex algorithms working together to produce the results you see.

More broadly, generative AI tools work by learning patterns in existing data and using those patterns to synthesize new data that didn't exist before. To generate new things, like images, video, or text, you can use different input mechanisms. For example, natural language has become one of the easiest ways to control image-generation algorithms, but other input systems, like video, are also possible. This is how the first iteration of our video generation model, Gen-1, works.
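
To make that "prompt in, image out" pattern concrete, here is a minimal sketch of text-conditioned image generation using the open-source Hugging Face diffusers library with the Stable Diffusion v1.5 weights that Runway co-released. The checkpoint's availability and the GPU are assumptions for illustration; this is not Runway's Gen-1 video model, which runs inside Runway's own tools.

```python
# A minimal sketch of text-conditioned image generation with the
# open-source Hugging Face `diffusers` library. Checkpoint availability
# and a CUDA GPU are assumptions; this illustrates the general
# "natural language in, new image out" pattern described above.
import torch
from diffusers import StableDiffusionPipeline

# Load a pretrained text-to-image diffusion model.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # assumes a CUDA-capable GPU

# The model synthesizes a new image from patterns it learned in
# existing data, steered by the text prompt.
prompt = "rocks resting on a windswept desert plain, cinematic lighting"
image = pipe(prompt, num_inference_steps=30).images[0]
image.save("generated_frame.png")
```

Swap the conditioning signal from text to an existing video and you have the rough shape of a video-to-video model like Gen-1.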

How was AI used in making 'Everything Everywhere All at Once'? And can you please also explain what the “multiverse” is?

Everything Everywhere All at Once had an unusually small team of visual effects artists working against tight deadlines. Evan Halleck was on this team, and he used Runway's AI tools to save time and automate tedious aspects of editing. In the film's rock scene specifically, he used Runway's rotoscoping tool to get a quick, clean cut of the rocks as sand and dust moved around the shot. This turned days of work into a matter of minutes.
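
Rotoscoping here means isolating a subject from its background frame by frame so it can be composited elsewhere. Runway's tool does this with learned models; as a rough, classical stand-in for the same idea, here is a per-frame foreground cut using OpenCV's GrabCut. The file names and the bounding box are hypothetical.

```python
# A rough sketch of per-frame foreground extraction, the core task of
# rotoscoping. OpenCV's classical GrabCut stands in for the learned
# segmentation models a tool like Runway's uses. File names and the
# seed bounding box are hypothetical.
import cv2
import numpy as np

frame = cv2.imread("frame_0001.png")  # one frame of footage
mask = np.zeros(frame.shape[:2], np.uint8)
bgd_model = np.zeros((1, 65), np.float64)
fgd_model = np.zeros((1, 65), np.float64)

# A rough box around the subject seeds the segmentation.
h, w = frame.shape[:2]
rect = (w // 8, h // 8, 3 * w // 4, 3 * h // 4)

cv2.grabCut(frame, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)

# Keep pixels labeled as definite or probable foreground.
fg = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 255, 0).astype(np.uint8)
cutout = cv2.bitwise_and(frame, frame, mask=fg)
cv2.imwrite("frame_0001_cutout.png", cutout)
```

Doing this by hand for every frame of a shot is what traditionally took days; an AI rotoscoping tool propagates the mask across the whole shot automatically.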

A clip of rocks moving across sand from the film Everything Everywhere All at Once, edited with generative AI tools from Runway.
Graphic provided by Runway.

Where do AWS and the cloud fit in the creation process?

AWS is critical for what we do. Creating, training, and deploying enterprise-scale AI models requires immense computing power, and our work with AWS has helped us deliver products that millions of people can use.

Movies are phenomenally expensive to make, and visual effects especially so. Do the tools Runway offers, coupled with the AWS Cloud, change that calculus?

Yes, absolutely. The power of tools like Runway, paired with the cloud, makes it possible for filmmakers and other creatives to save significant time and money as they bring their ideas to life. We are building toward a future where it will be possible to craft entire feature-length films and stories, from the characters, scores, B-roll, backgrounds, and everything in between, through words alone and in a fraction of the time.

Are AI-driven tools the future of design? And if so, are they part of a continuum of design tools or a completely new way of thinking about design?

AI-driven tools are absolutely the future of design, allowing artists to express themselves in new and previously unimaginable ways. Tools and technological advancements throughout history have shaped design. A great example from the mid-1800s is the invention of the paint tube, which gave paint a longer shelf life and could be opened and closed repeatedly, making it practical to paint outdoors. That ultimately helped spark the Impressionist movement in modern painting, and it's no different from the massive creative shift we're experiencing now in content creation.

How will generative AI drive the film world going forward? What does that look like for creators and for movie fans?

As adoption of generative AI tools continues to increase, we're moving toward a world where everyone will be able to make the films that only a handful of people could create before. The rise of generative AI is saving hours to days of editing labor, freeing up time for users to focus on the more creative aspects of the job. Movie fans can expect to see entirely generated films in the future.

You just sponsored an AI film festival. What were the rules, and what was the outcome?

Runway hosted the first-ever AI Film Festival with an event in New York City, where we showcased ten short films. To be considered, films had to be created using AI-powered editing techniques, feature AI-generated content, or both. The festival also featured an expert panel, including award-winning filmmaker Darren Aronofsky, to discuss the future of filmmaking more broadly.

Can we see the winners?

Absolutely. Here are the ten films that took home prizes.

Final question, for all the artists—and the artist inside all of us—how should we think about the relationship between technologies like generative AI and art?

We're on the cusp of a massive shift in content creation, one that is empowering for creatives around the world. New technologies like generative AI are allowing more diverse stories to be told and opening the door to possibilities that artists could only dream of in the past. These tools are ushering in the next generation of artistic expression, which will be highly democratized and accessible.

Visit Runway's new entertainment and production division for a taste of how it's partnering with the next generation of storytellers.