Description:
Jack Roberts walks through integrating Remotion — a framework that treats every video frame as React code — into Google's Antigravity development environment, using Claude Code as the generation engine. Because Remotion represents video programmatically, AI models can use loops, conditionals, and functions to automate editing, captioning, animation, and audio processing at scale, rather than interacting with a timeline-based editor.
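The "every frame is code" idea can be sketched in plain TypeScript: each visual property is a pure function of the frame number, so batch edits become ordinary loops. The `interpolate` helper below is an illustrative re-implementation in the spirit of Remotion's utility of the same name, not the library itself; all names and values are assumptions.

```typescript
// Illustrative sketch (not the Remotion API itself): video as a pure
// function of the frame number, which is what lets an AI model script
// edits with ordinary loops and conditionals.

// Linearly map `frame` from an input range to an output range,
// clamped at both ends (similar in spirit to Remotion's `interpolate`).
function interpolate(
  frame: number,
  [inStart, inEnd]: [number, number],
  [outStart, outEnd]: [number, number],
): number {
  const t = Math.min(1, Math.max(0, (frame - inStart) / (inEnd - inStart)));
  return outStart + t * (outEnd - outStart);
}

// A hypothetical title card whose opacity fades in over the first
// 30 frames (one second at 30 fps).
function titleOpacity(frame: number): number {
  return interpolate(frame, [0, 30], [0, 1]);
}

// Because it is just code, batch logic is a plain loop:
const opacities = [0, 15, 30, 60].map(titleOpacity); // -> [0, 0.5, 1, 1]
```

In real Remotion, a React component would read the frame via `useCurrentFrame()` and apply such a value as a CSS style; the point here is only that frame-indexed functions replace a timeline.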
The video covers three levels of complexity. The first involves automated video editing and audio enhancement: Roberts shows a workflow where a raw video file is processed through the Whisper API for transcription, silence-trimmed using ffmpeg, and run through an audio enhancement pass — all from a single prompt. The second level adds brand-matched caption generation, where a tool called Glido extracts colors, fonts, and logo assets from any website URL, and Claude generates Remotion caption components that match those brand guidelines exactly. The third demonstrates fully dynamic dashboard creation: a data visualization built in one prompt that updates in real time when underlying values change, again matched to specific brand assets.
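The level-one pipeline above (transcribe, trim silence, enhance) can be sketched as a small Node script. File names and decibel thresholds are illustrative assumptions; the sketch assumes `ffmpeg` is on the PATH and uses its real `silenceremove` audio filter. The transcription and enhancement passes are left as comments rather than invented API calls.

```typescript
// Sketch of the single-prompt editing pipeline: silence-trimming via
// ffmpeg's silenceremove filter. Thresholds and file names are
// illustrative; transcription (Whisper) and audio enhancement would be
// separate passes before/after this step.
import { execFileSync } from "node:child_process";

// Drop leading silence, plus any mid-clip silence longer than 0.5 s
// that stays below -40 dB.
const silenceFilter = [
  "silenceremove=start_periods=1",
  "start_threshold=-40dB",
  "stop_periods=-1",
  "stop_duration=0.5",
  "stop_threshold=-40dB",
].join(":");

// Build the ffmpeg argv; returning it lets the command be inspected
// (or generated by a model) before anything is executed.
function trimSilenceArgs(input: string, output: string): string[] {
  return ["-i", input, "-af", silenceFilter, output];
}

const args = trimSilenceArgs("raw.mp4", "trimmed.mp4");
// execFileSync("ffmpeg", args); // uncomment to actually run ffmpeg
```

Keeping the command as data rather than a shell string is what makes this workflow friendly to code-generating models: Claude can compose, vary, and validate the argv programmatically.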
Roberts references Submagic ($1M ARR within three months) and Crayo ($6M ARR) as evidence that AI-native video editing is a fast-growing market, framing the Remotion skill as a way for creators and agencies to cut editing time by 80–90%. Node.js is required to install Remotion; all setup steps run through Antigravity, with Claude Code handling code generation.
📺 Source: Jack Roberts · Published January 28, 2026
🏷️ Format: Hands-On Build