Description:
Kevin Madura, a technical consultant at AlixPartners, presents a thorough workshop on DSPy—the declarative Python framework for building modular LLM-powered programs developed by Omar Khattab and collaborators. The session is structured around a public GitHub repository attendees can follow along with, covering everything from DSPy’s core abstractions to a live optimizer demo.
Madura argues that DSPy's value lies in treating LLMs as first-class computational citizens inside a proper program rather than as strings to be nudged. He walks through the key primitives: Signatures (typed input/output declarations), Modules (composable LLM-backed functions analogous to PyTorch layers), and optimizers such as GEPA, which recent research reports can match or exceed GRPO-style fine-tuning on prompt optimization tasks. Metrics, including LLM-as-a-judge approaches for subjective criteria, close the optimization loop by giving the framework a quantifiable signal to improve against.
Practical use cases demonstrated include sentiment classification, PDF boundary detection, arbitrary-length text summarization, a lightweight multimodal pipeline, and a simple web research agent, all wired to the Arize Phoenix observability platform for tracing. The session is particularly relevant to developers who want to move beyond ad-hoc prompt tweaking toward maintainable, optimizable AI software, and to those curious how DSPy compares to LangChain or raw structured-output APIs.
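For the optimization loop described above, a DSPy metric is just a function over a gold example and a prediction that returns a score. A minimal sketch, using stand-in objects rather than real DSPy `Example`/`Prediction` instances (the `sentiment` field name is an assumption tied to the classification demo):

```python
from types import SimpleNamespace

def sentiment_match(example, pred, trace=None):
    # DSPy metrics take (gold example, prediction, optional trace)
    # and return a score the optimizer can maximize.
    return float(example.sentiment.lower() == pred.sentiment.lower())

# Quick check with stand-in objects:
gold = SimpleNamespace(sentiment="positive")
guess = SimpleNamespace(sentiment="Positive")
print(sentiment_match(gold, guess))  # → 1.0

# In recent DSPy versions an optimizer would consume this, hypothetically e.g.:
# optimized = dspy.GEPA(metric=sentiment_match).compile(classify, trainset=examples)
```

For subjective criteria like summary quality, the same function shape holds; the body would instead call an LLM-as-a-judge module and return its score.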
📺 Source: AI Engineer · Published January 08, 2026
🏷️ Format: Hands On Build
