
Ford Mustang Commercial

Release Year
2026
Role
All Aspects
Category
Motion Media / Commercial
3D Animation
AI Workflow
Toolkit
Autodesk Maya
Substance Painter
Nano Banana 2
Seedance 2.0
Lovart
After Effects
Premiere Pro
ACE Studio
Overview
Ford Mustang Commercial is my first project exploring AI-assisted workflows. In 2026, there are already many popular AI-assisted and generative tools available across different stages of the creative process—from earlier image-generation platforms such as Midjourney, to text-based tools like ChatGPT and Gemini, and now to video- and audio-generation platforms such as Seedance, VEO, Runway, and ACE Studio. When developing this project, my main idea was to explore whether these increasingly popular AI tools could be integrated into a more traditional 3D animation and motion media workflow. That question became the starting point for creating Ford Mustang Commercial.
To connect the project with a more traditional 3D motion media pipeline, I first modeled a simple vehicle in Autodesk Maya as a visual mock-up, then textured it in Substance Painter and rendered still images to serve as input references. At the same time, I gathered a large number of online reference images showing the red Ford Mustang from different angles, in different environments, and with both exterior and interior views. I then used my rendered stills in Nano Banana 2 to generate additional style frames from new angles and in different visual styles.
While reviewing these style frames, I simultaneously developed the storyboard and shot list—thinking through camera angles, how the car should move, what kinds of animation should happen, and how the lighting and reflections should evolve from shot to shot. As I continued generating more images in Nano Banana 2, I also kept refining and adjusting the shot list. Once I was satisfied with the overall shot design, I brought together the final shot list and all reference images—including my own rendered stills, downloaded real-car photography, and AI-generated images from Nano Banana 2—and used them as inputs for Seedance to generate the video footage.
After several rounds of experimentation, I selected the strongest moments from different generated clips and edited them together in Premiere Pro. During post-production, I also used After Effects to create the logo animation, incorporating techniques such as camera tracking and masking. After that, I used ACE Studio to generate and compose sound effects, and completed the final composite in Premiere Pro.
The entire project was completed within 24 hours. The final piece still has imperfections in several areas, particularly product consistency across some generated shots and limitations in image quality, which I did my best to improve in post-production. Even so, I am very satisfied with the overall result. As my first exploration of AI-assisted workflows in motion media, this project was a meaningful experiment. In future explorations, I hope to incorporate more tools, further improve video quality, and continue finding better ways to address consistency issues across generated footage.
Style Frames

Shot List
