Can AI truly replicate the nuanced process of artistic creation? I set out to answer that question by creating 'Lights Out,' an AI pop-rock band, and their debut single, 'TONIGHT,' from the ground up, aiming to generate not just a song, but a complete artistic identity, from backstory to music video.
It's worth mentioning here that I have produced real-life music videos in the past, for both television and streaming platform release. However, with the production hiatus that inevitably comes from having a young family, I haven't had the time to commit to a drawn-out shooting and editing schedule, so I thought I might draw on those previous experiences and apply them to this experiment.
See the results of this AI experiment! Watch the 'TONIGHT' music video below, then read on to learn about the process behind its creation.
The Project Overview of Lights Out and "TONIGHT"
"Lights Out" a completely AI generated construct, is a fictional pop-rock band hailing from Australia, more specifically the Western Sydney suburb of Campbelltown. Featuring front man Kai "Riot" Riley: lead vocals/guitar, Finn Carter: guitar, Jed Carter :bass, (yes Finn and Jed are brothers) and Taj "Crash" Miller on drums. The band's debut single, "TONIGHT", is designed as a high-energy party anthem. Think: a mashup with Sum 41 and 5 Seconds of Summer type feels and you'd be halfway there without hearing the actual track.
The project began with establishing a comprehensive backstory for the band, grounding their identity and musical style. This foundation, although perhaps not critical, was still useful in guiding the AI tools in the right creative direction.
The Narrative Construction. Lyrics & Storyboarding with Collaborative AI
Lyrics and a storyboard were essential for creating a cohesive artistic vision for the debut track and the music video. I employed a collaborative AI approach:
- Osmi AI: For initial brainstorming and unconventional lyric generation.
- Gemini: For refining and structuring the narrative, ensuring coherence.
- ChatGPT: For polishing and adding emotional depth to the lyrics and storyboard.
Osmi AI excelled at generating novel ideas, Gemini at structuring them, and ChatGPT at refining them. Leveraging these complementary strengths produced, in my opinion, a well-rounded result.
The Sonic Blueprint. Music Generation with Suno
I initiated this step by crafting the musical foundation using Suno. This wasn't a simple "generate a song" scenario. Besides the pre-crafted lyrics being used as input here, I also explicitly defined the parameters for better results:
- Genre Specificity: I explicitly stated "pop-rock," drawing on the platform's understanding of musical styles.
- Emotional Tone: I described the desired emotional landscape, e.g. energetic, rebellious, and anthemic.
- Instrumentation: I implied the classic pop-rock instrumentation, such as guitars, bass, and drums, through genre selection, e.g. the prompt "Up-tempo pop-rock with a strong back beat and rhythmic fills, a driving bassline featuring distorted guitar rhythms, and melodic guitar solos."
The insight here was that AI, even in its nascent stage, can interpret and translate complex artistic directives. However, the output's quality was directly proportional to the precision of the input. Vague prompts yielded generic results, while detailed instructions produced more nuanced and genre-authentic tracks closer to what I had in mind.
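To make that precision concrete, I found it useful to treat the prompt as a handful of explicit fields rather than one free-form sentence. The sketch below is just an illustration of that habit, written in Python; Suno itself is driven through its web UI, and the class and field names here are my own invention, not any Suno API.

```python
# A minimal sketch of treating a music-generation prompt as structured fields.
# No Suno API is used here; the assembled string is simply pasted into the
# web UI's style/description box. Field names are illustrative only.
from dataclasses import dataclass

@dataclass
class TrackBrief:
    genre: str            # e.g. "pop-rock"
    tone: str             # desired emotional landscape
    instrumentation: str  # implied or explicit instrumentation cues

    def to_style_prompt(self) -> str:
        # Combine the fields into one descriptive prompt string.
        return f"{self.genre}; {self.tone}; {self.instrumentation}"

brief = TrackBrief(
    genre="Up-tempo pop-rock",
    tone="energetic, rebellious, anthemic",
    instrumentation=("strong back beat and rhythmic fills, driving bassline, "
                     "distorted rhythm guitars, melodic guitar solos"),
)

print(brief.to_style_prompt())
```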
Visual Identity. Character Generation with Leonardo AI and ImageFX
Visual consistency, i.e. "consistent character generation", proved to be the most significant challenge I encountered. I wanted to create believable characters, and to reproduce those same believable characters on demand, which is still rather difficult to do, although platforms are improving these features quickly. I used two platforms:
- Leonardo AI: For generating stylized and artistically driven images. For creating and training reference models.
- ImageFX (Google Labs): For creating photorealistic and adaptable images.
The challenge was maintaining consistency across multiple generations. Subtle variations in prompts and random seed values led to significant discrepancies in character appearance. Even locking seed values did not deliver the consistency required, as minor variations still crept in from the other random elements of the generation process. I discovered that:
- Creating detailed reference images from multiple angles was crucial.
- Using consistent prompts and locked seed values helped, but wasn't foolproof.
- Manual editing and compositing of reference inputs were sometimes necessary to achieve the desired level of character consistency in the output.
This step revealed the limitations of current AI image generation in maintaining precise visual continuity. You can spot examples of this throughout the video if you are looking for them; however, given the limitations we are working with, the video overall still maintains a good level of character consistency for a human observer.
![]()
Creating quality reference models of the band, trying for consistent character generation.
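Leonardo AI and ImageFX are closed web tools, so I can't show what they do internally, but the locked-seed behaviour described above is easy to demonstrate with the open-source diffusers library as a stand-in. In the sketch below, the model ID and prompts are placeholders I chose for illustration: the same prompt with the same seed reproduces the same image, while even a small prompt change shifts the character's appearance.

```python
# Sketch of locked-seed generation, using the open-source diffusers library as a
# stand-in for the web platforms. Model ID and prompts are illustrative only.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",  # substitute any Stable Diffusion checkpoint
    torch_dtype=torch.float16,
).to("cuda")

prompt = "photorealistic portrait of a young pop-rock frontman, short dark hair, stage lighting"
seed = 1234  # locking the seed fixes the noise the image is generated from

def generate(p: str, s: int):
    generator = torch.Generator("cuda").manual_seed(s)
    return pipe(p, generator=generator, num_inference_steps=30).images[0]

generate(prompt, seed).save("reference.png")                 # same prompt + same seed: reproducible
generate(prompt + ", on stage", seed).save("variation.png")  # small prompt change: noticeably different face
```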
Animation. Bringing the Characters to Life with Kling AI
Animation posed its own set of challenges. Kling AI was selected for its lip-syncing capabilities relative to its competitors, but I encountered limitations:
- The platform's ability to maintain character consistency during motion was limited, depending on the circumstances.
- Complex camera movements and dynamic scenes often resulted in glitches and artefacts.
- The length of continuous animation clips was restricted.
The lessons I learned here:
- Short, controlled animation clips were more successful than long, complex sequences; these were then assembled in the edit (see the sketch after this list).
- Careful planning and editing were essential for mitigating glitches and artefacts.
- Using consistent reference images was extremely important for character continuity.
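As a rough illustration of that first lesson, the sketch below stitches a handful of short clips back into one sequence using the moviepy library. The filenames are hypothetical, and this is only a sketch of the assembly step, not the actual edit used for the 'TONIGHT' video.

```python
# Sketch: stitching short, exported animation clips into one sequence with moviepy.
# Filenames are hypothetical; glitchy frames are assumed to be trimmed per clip first.
from moviepy.editor import VideoFileClip, concatenate_videoclips

clip_files = ["shot_01.mp4", "shot_02.mp4", "shot_03.mp4"]  # short, controlled clips
clips = [VideoFileClip(f) for f in clip_files]

sequence = concatenate_videoclips(clips, method="compose")
sequence.write_videofile("rough_cut.mp4", codec="libx264", audio_codec="aac")
```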
Epiphany. The Humans are Still Indispensable!
Throughout the project, I encountered a recurring theme: yes, AI is a powerful tool, but it still requires human guidance and oversight. The creative process is not a linear, algorithmic operation; it's a dynamic, iterative dialogue between human and machine.
While AI may excel at generating content, humans are still essential for curating and refining it. And ultimately if there were no human to witness it, what would the point really be anyway?
The creative process with AI is not about replacing human creativity, but augmenting it. I would argue that the most successful AI projects are those that leverage the strengths of both human and machine.
While my experiment started out primarily as a test of whether I could create an AI band, it became much more than that: I came to better understand the evolving relationship between creativity and technology. And as AI continues to evolve, we should continue to explore its capabilities and limitations to ensure it remains a tool for human expression rather than a tool to replace it.
A Summary of Key Lessons Learned
- Prompt Engineering: Detailed and specific prompts are essential for achieving superior outputs from your current set of AI tools.
- Character Consistency: Maintaining visual consistency across multiple AI platforms requires planning and iterative refinement. Creating at least basic reference models is crucial.
- Platform Integration: Effective use of AI in creative projects often involves combining the strengths of multiple platforms.
- Iterative Workflow: Be prepared for an iterative process, involving frequent adjustments and refinements; keeping a simple log of each generation helps here (see the sketch after this list).
- Planning is Paramount: I learned this the hard way when I created the album art prior to finalizing the character models. This created a visual discrepancy. Plan all aspects of the project before creating assets.
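For completeness, here is a minimal sketch of the kind of generation log I mean: each attempt records the platform, prompt, seed (where the platform exposes one) and output file, so earlier results can be reproduced or compared. The field names and paths are my own illustration, not tied to any particular tool.

```python
# Minimal sketch of a generation log to support iterative refinement.
# Field names and file paths are illustrative, not tied to any platform.
import json
import datetime
from pathlib import Path

LOG_PATH = Path("generation_log.jsonl")

def log_generation(platform, prompt, seed, output_file):
    entry = {
        "timestamp": datetime.datetime.now().isoformat(timespec="seconds"),
        "platform": platform,
        "prompt": prompt,
        "seed": seed,          # None when the platform doesn't expose a seed
        "output": output_file,
    }
    with LOG_PATH.open("a") as f:
        f.write(json.dumps(entry) + "\n")

log_generation("ImageFX", "photorealistic band shot, four members, stage lighting", None, "band_v3.png")
```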