What I learned making an AI movie trailer


In 2014 I wrote a movie script for a character called "Frank Fenix." It never got the greenlight from Hollywood, so it sat collecting digital dust in my Google Drive for over a decade.

But with the advent of new generative AI video tools like Veo 3, I thought this would be a great chance to bring the script to life with an AI trailer.

The Constraints:

  • Everything seen and heard needs to be AI-generated: the visuals, music, and dialogue.
  • Use only tools from Google: Gemini, Flow, Veo 3, Google Vids, and MusicFX.
  • AI needs to create the prompts. (Or more precisely, I need to prompt AI to make the prompts.)
  • The whole video needs to be finished in a single Sunday morning. (Otherwise, I would tinker with it for months.)

Here's the result:

Frank Fenix started as a project in grad school

In 2014, I took a transmedia storytelling class taught by Rob Salkowitz as part of the Master's program at the University of Washington. The class was a blast. It was about the rise of comic books and how media properties used transmedia storytelling tactics to engage audiences and build fandom across multiple media types. It was a prescient course, as it came before the Marvel Cinematic Universe became one of the largest media franchises of all time.

As the final project, students were asked to pitch a transmedia storytelling strategy as part of a marketing campaign to promote a product.

I came up with Frank Fenix, a detective so smart that he realized he was a fictional character in a reboot of his own story.

Here's a Google Slides link to the full presentation from 2014.

Frank Fenix Inspiration: Dick Tracy and Warren Beatty

Frank Fenix is mostly inspired by Dick Tracy, the detective comic strip that began in 1931. Warren Beatty made it into a bonkers movie in 1990, with incredibly stylized art direction and prosthetics. It's flawed but way ahead of its time.

In 2008, nearly 20 years after the movie, Warren Beatty did something wild: he made a TV appearance with Leonard Maltin in character as Dick Tracy. It aired only once, but it's preserved in infamy on YouTube.

Beatty would go on to record an even more bizarre interview over Zoom called Dick Tracy Zooms In in 2023.

Evidently, these schlocky appearances are part of Beatty's ploy to keep the rights to Dick Tracy.

The 2008 appearance got me thinking, "What if a fictional detective like Dick Tracy caught on to the fact that his IP was being manipulated?"

I coupled this idea with the trend of gritty reboots of Hollywood franchises, which was at its peak in 2014.

I continued to work on the Frank Fenix project after the class concluded. I turned my presentation into a movie script and a "transmedia bible." I even met a development executive at Comic Con 2014, and then visited a Hollywood studio to pitch the script.

Unfortunately, my pitch wasn't very good (and to be honest, neither was the script). Frank Fenix never got the greenlight and I shelved the script for a decade.

Rebooting 'Frank Fenix' with AI

I'm always tinkering with AI tools and as a former filmmaker I'm especially interested to see how AI video generation will transform filmmaking. Naturally, the new Veo models from Google caught my attention. They seemed like the perfect tool to reboot Frank Fenix, a project that would otherwise never be made.

(Google is also my employer, so it's a bonus that this would help me familiarize myself with their products.)

Here are some thoughts:

It's insane what you can accomplish in a short amount of time.
What previously would have required $100K+ and a year of effort could be accomplished in a Sunday morning. It was one of those moments where technology felt like magic. But...

AI isn't automatically perfect
You work with the AI; you can't just set it and forget it. A single shot required multiple takes and fine-tuning. Sometimes the first two seconds of a clip would look perfect, then it would do something totally nonsensical and random. I was able to salvage these moments in the trailer with quick cuts, but it was puzzling and a constraint to work around.

Identical prompts get drastically different results
If I were on a film set shooting a second take, there would be incremental changes. I would have the same set, actors, and props, and I could direct the actors to be faster or slower while other factors stayed mostly consistent. But this isn't true at all with AI, at least not in its current state. The same prompt, or a near-identical prompt with minor tweaks, would reset everything: new framing, camera motion, lighting, set, actors, and other random factors.

Spoken AI dialog gives me goosebumps
There's a shot in the trailer at 35 seconds (time stamped link). A woman says, "Do you think he had a psychotic breakdown?" And a man replies, "Maybe it's some deep form of method acting." It felt surreal to watch two entirely AI-generated characters interact with each other. (It's very difficult to get two speakers in one shot, which is why you see so many AI clips where only one person is speaking.)

There are different levels to AI generated video
You can simply enter a prompt and leave it as an 8-second clip. You can string a few of these shots together in a montage or theme.

You can go a step further, like I did, and edit in a tool like Premiere Pro, adding music, voice over, and pacing choices. And there are steps beyond what I did, like what Kalshi has accomplished with their AI ads.

We need more precise language than "AI Generated"
Our current language for describing AI videos doesn't do a good job of distinguishing between these levels. Someone writing a low-effort one-shot prompt to make a random 8-second clip and the team behind Kalshi's AI-generated ads are both making "AI videos," but they're doing so on totally different levels.

Same goes for written content and static images. My bet is that as these tools become more widely adopted we'll refine the language accordingly.

Making an AI trailer exposed flaws in my script
The trailer helped me visualize scenes written in the script, and I noticed patterns that revealed my shortcomings as a screenwriter. There's a chase scene. Then there's a scene where the characters stop to talk. The exposition is handled through dialogue rather than action, which makes for a boring movie. The trailer required a lot of voice over to tell you what was happening.

I'm not saying this just to be self-deprecating. As a screenwriter, this information would be an incredible gift!

If I were to write another script, I would make an AI movie trailer much sooner in the process. This would allow me to test ideas, visualize how they work, see whether the trailer hooks people into the story, and make adjustments accordingly.

AI didn't take away creativity... It helped me make something that would otherwise have never been created

I am hopeful that there will be more projects like this trailer. I'm sure there are millions of aspiring screenwriters and filmmakers out there who have been working on a project but don't have the time, connections, or budget to bring it to life. Hopefully, they can use AI to create a version of their project and share it with the world. As these tools get better, cheaper, faster, and more widely available, I'm bullish that they will enable more people to tell their stories.