Precision. Performance. Power.
Date
August 2025
This project was an AI-assisted creative exploration designed to evaluate practical, production-ready use cases for Adobe Firefly within an existing creative workflow. The goal was not experimentation for its own sake, but to determine where generative AI could meaningfully support ideation, motion development, and efficiency without compromising creative control or quality.
I developed a hybrid workflow that combined Firefly’s generative capabilities with traditional design and post-production. A custom vehicle image was generated and manually composited with graphic design elements, then refined into left and right perspectives to establish continuity and directional movement. These assets were then brought into motion with Firefly’s image-to-video feature, with prompt iteration guiding framing, pacing, and cinematic movement.
Sound design was produced using Adobe’s AI sound effect generator, informed by live vocal recordings to shape engine behavior and synchronize audio dynamics with motion. All AI-generated video and audio were then brought into After Effects for manual compositing, timing refinement, and final polish.
The final result demonstrates how AI can accelerate early-stage ideation and asset generation while traditional post-production ensures creative intent, quality control, and brand-ready execution. This project positions AI as a directed creative tool within a larger system, rather than a replacement for design judgment or craft.