Description:
Bart Slodyczka gives Claude Opus 4.6 full root access to a Raspberry Pi 5 equipped with an AI Hat 2 (Hailo 10H AI accelerator) and then uses Claude Code’s agent teams feature to autonomously build a local RAG application with voice input—entirely from the terminal.
The video begins with hardware discovery: Claude Code, SSHed into the Pi, scans the system, identifies the Raspberry Pi 5B with 16GB of RAM, confirms the Hailo accelerator is active after a driver reload on boot, and locates the locally running Ollama model (a 3.2-billion-parameter LLM). From there, Slodyczka pastes a high-level prompt into Claude Code and watches four sub-agents spin up in parallel: a research agent to map the available hardware, a backend agent to build the vector store and RAG pipeline, a frontend agent to create the UI, and a QA/deployment agent to validate and ship the result.
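The discovery step can be pictured as a handful of terminal probes. This is a hedged sketch of the kind of commands an agent with SSH access might run; the exact device paths, the `/dev/hailo*` node name, and the Ollama port are assumptions, not details confirmed in the video.

```shell
#!/bin/sh
# Sketch of hardware discovery over SSH (assumed paths and endpoints).

# Board identification: the Pi reports its model string in /proc/cpuinfo.
grep -m1 "Model" /proc/cpuinfo 2>/dev/null || echo "model string unavailable"

# Total RAM, to confirm the 16GB configuration.
free -h 2>/dev/null | awk 'NR==2 {print "RAM total:", $2}'

# Hailo accelerator: presence of a device node suggests the driver loaded.
ls /dev/hailo* 2>/dev/null || echo "no Hailo device node found"

# Ollama: its HTTP API defaults to port 11434; listing tags shows loaded models.
curl -s http://localhost:11434/api/tags 2>/dev/null || echo "Ollama API not reachable"
```

Each probe falls back to a diagnostic message rather than failing, so an agent reading the combined output gets a usable picture even on a machine missing some component.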
The approach highlights how agent teams coordinate in real time: if the backend agent hits an issue, it can query the research agent for hardware-specific context without human intervention. The video is a practical look at agentic AI development on edge hardware, combining Claude Code, Ollama, local embeddings, and a USB microphone into a self-hosted, voice-enabled document assistant built almost entirely by the model itself.
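The core of the RAG pipeline described above is retrieval: embed the (voice-transcribed) query, compare it against stored document vectors, and hand the best matches to the local LLM as context. The sketch below shows only that retrieval step with toy three-dimensional vectors standing in for real embeddings; in the actual build, the vectors would come from a local embedding model, and the helper names here are illustrative, not from the video.

```python
import math


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)


def retrieve(query_vec: list[float], store: list[dict], k: int = 2) -> list[str]:
    """Return the text of the k chunks most similar to the query vector."""
    ranked = sorted(store, key=lambda item: cosine(query_vec, item["vec"]), reverse=True)
    return [item["text"] for item in ranked[:k]]


# Toy vectors stand in for embeddings a local model would produce.
store = [
    {"text": "Hailo driver notes", "vec": [0.9, 0.1, 0.0]},
    {"text": "Ollama setup guide", "vec": [0.1, 0.9, 0.0]},
    {"text": "USB mic wiring",     "vec": [0.0, 0.1, 0.9]},
]

print(retrieve([0.8, 0.2, 0.1], store, k=1))  # → ['Hailo driver notes']
```

The retrieved chunks would then be prepended to the user's question in a prompt sent to the local Ollama model, which is what lets a 3B-parameter LLM answer grounded questions about documents it was never trained on.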
📺 Source: Bart Slodyczka · Published February 11, 2026
🏷️ Format: Hands-On Build
