Description:
Creator Dan Kieft walks through a complete workflow for producing AI-generated lyric music videos, starting from a Suno AI song and finishing with a fully synced video using Open Art’s music video builder. The tutorial targets creators who want professional-looking results without expensive editing software or post-production experience.
The process begins in Suno AI, where Kieft uses a custom music prompter to generate both lyrics and an audio track. The exported song is then imported into Open Art’s lyric video feature, where Cream 4.0 is selected as the image model and Kling 2.6 or 2.5 handles video generation. A critical step Kieft emphasizes is previewing the auto-generated storyboard before committing to the full render: catching lyric mismatches and regenerating individual scenes takes about a minute and 460 credits, far cheaper than an entire re-render.
For visual consistency across scenes, Kieft demonstrates an intermediate step using ChatGPT: uploading a reference frame and prompting it to analyze the visual style and generate scene-specific image prompts line by line, which are then fed back into Open Art to keep the aesthetic coherent throughout the video. The final step uses Open Art’s built-in timeline editor to adjust scene duration and sequencing before export. The end result is a fully AI-generated music video — music, visuals, and lyric sync — achievable in roughly 30 minutes using tools available through a standard Open Art subscription.
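The prompt-generation step above is done conversationally in ChatGPT in the tutorial, but the underlying idea — one image prompt per lyric line, each carrying the same shared style description — can be sketched programmatically. The style string and function name below are illustrative assumptions, not part of Kieft’s workflow:

```python
# Hypothetical sketch of the consistency trick described above: build one
# image prompt per lyric line, appending a single shared style description
# so every generated scene stays visually coherent.

STYLE = "moody neon cityscape, cinematic lighting, 35mm film grain"  # assumed example style


def scene_prompts(lyrics: str, style: str = STYLE) -> list[str]:
    """Return one image prompt per non-empty lyric line, each ending
    with the shared style description."""
    prompts = []
    for line in lyrics.splitlines():
        line = line.strip()
        if not line:
            continue  # skip blank lines between verses
        prompts.append(f'Scene illustrating the lyric "{line}" -- {style}')
    return prompts


if __name__ == "__main__":
    demo = "City lights are calling\n\nWe run until the morning"
    for prompt in scene_prompts(demo):
        print(prompt)
```

In the actual tutorial, ChatGPT also analyzes a reference frame to derive the style description in the first place; this sketch assumes that description already exists as plain text.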
📺 Source: Dan Kieft · Published January 18, 2026
🏷️ Format: Tutorial Demo
