Description:
Machine Learning Street Talk hosts Dr. Jeff Beck in a wide-ranging technical conversation spanning agency theory, geometric deep learning, Joint Embedding Predictive Architectures (JEPA), and the relationship between the Free Energy Principle and energy-based models. Beck opens by arguing that agency is not a categorical distinction but a matter of policy sophistication, measurable via transfer entropy and context-dependent behavior; even a rock technically executes a policy, making agency a continuum rather than a binary.
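To make the transfer-entropy criterion concrete: it quantifies how much the past of a context signal X improves prediction of a behavior signal Y beyond what Y's own past already provides, so a context-blind "policy" (the rock) scores near zero. Below is a minimal plug-in estimator for discretized time series; the binning scheme, lag of one step, and variable names are illustrative assumptions, not details taken from the episode.

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y, bins=4):
    """Estimate transfer entropy T(X -> Y) in bits from two 1-D time series.

    T(X -> Y) = sum over (y_{t+1}, y_t, x_t) of
        p(y_{t+1}, y_t, x_t) * log2[ p(y_{t+1} | y_t, x_t) / p(y_{t+1} | y_t) ]
    using equal-width binning and empirical (plug-in) probabilities.
    """
    # Discretize both series into `bins` symbols.
    xd = np.digitize(x, np.histogram_bin_edges(x, bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, bins)[1:-1])

    triples = list(zip(yd[1:], yd[:-1], xd[:-1]))  # (y_{t+1}, y_t, x_t)
    n = len(triples)
    p_yyx = Counter(triples)
    p_yx = Counter((yt, xt) for _, yt, xt in triples)
    p_yy = Counter((ynext, yt) for ynext, yt, _ in triples)
    p_y = Counter(yt for _, yt, _ in triples)

    te = 0.0
    for (ynext, yt, xt), count in p_yyx.items():
        p_joint = count / n
        cond_full = count / p_yx[(yt, xt)]           # p(y_{t+1} | y_t, x_t)
        cond_self = p_yy[(ynext, yt)] / p_y[yt]      # p(y_{t+1} | y_t)
        te += p_joint * np.log2(cond_full / cond_self)
    return te

if __name__ == "__main__":
    # Toy check: y is driven by x with a one-step lag, so T(X -> Y) should
    # come out clearly larger than T(Y -> X).
    rng = np.random.default_rng(0)
    x = rng.standard_normal(2000)
    y = np.roll(x, 1) + 0.5 * rng.standard_normal(2000)
    print(transfer_entropy(x, y), transfer_entropy(y, x))
```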
The conversation’s most substantial thread covers JEPA, the architecture championed by Yann LeCun. Beck explains that the key insight is learning to predict in latent embedding space rather than pixel space — producing gestalt, high-level representations instead of forcing the model to reconstruct every low-level detail. He contrasts this with purely generative approaches and discusses the tradeoffs of contrastive versus non-contrastive learning objectives.
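As a rough illustration of that latent-space prediction idea (not a faithful reproduction of any specific JEPA implementation discussed in the episode), the sketch below encodes a context view and a target view separately and penalizes prediction error between embeddings rather than reconstructing inputs. The layer sizes, the stop-gradient on the target branch, and the plain MSE objective are simplifying assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyJEPA(nn.Module):
    """Minimal joint-embedding predictive setup: the loss lives in latent space."""

    def __init__(self, in_dim=784, emb_dim=64):
        super().__init__()
        self.context_encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, emb_dim))
        self.target_encoder = nn.Sequential(
            nn.Linear(in_dim, 256), nn.ReLU(), nn.Linear(256, emb_dim))
        self.predictor = nn.Sequential(
            nn.Linear(emb_dim, emb_dim), nn.ReLU(), nn.Linear(emb_dim, emb_dim))

    def forward(self, context_view, target_view):
        z_ctx = self.context_encoder(context_view)
        with torch.no_grad():                 # target branch receives no gradient
            z_tgt = self.target_encoder(target_view)
        z_pred = self.predictor(z_ctx)
        # Non-contrastive objective: match predicted and actual embeddings,
        # never pixel-level reconstructions.
        return F.mse_loss(z_pred, z_tgt)
```

In practice, non-contrastive systems typically keep the target encoder as an exponential moving average of the context encoder and add variance or covariance regularization to prevent representational collapse; contrastive objectives avoid collapse with negative pairs instead, which is the tradeoff the conversation touches on.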
Beck also draws a precise distinction between energy minimization and free energy minimization: the free energy in the Free Energy Principle adds an entropy penalty term absent from standard maximum-likelihood energy minimization, and he explains how Laplace approximations make a fully Bayesian treatment tractable in practice. Throughout, geometric deep learning is framed as the natural framework for physical world modeling, building translation and rotation invariances into the architecture rather than learning them from data. For researchers and practitioners working at the intersection of theoretical ML and applied architecture design, this episode functions as a rigorous conceptual map across several interconnected fields.
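Written out, the distinction is compact: standard energy-based training minimizes an expected energy (equivalently, maximizes likelihood), while variational free energy adds an entropy term over an approximate posterior q(z). The notation below is standard variational-inference shorthand rather than a formulation quoted from the episode.

```latex
% Energy minimization (maximum-likelihood view): only an expected energy is optimized.
\min_{\theta} \; \mathbb{E}_{x \sim \mathrm{data}} \left[ E_{\theta}(x) \right]

% Variational free energy: expected energy minus the entropy of the approximate posterior.
F[q] \;=\; \mathbb{E}_{q(z)}\!\left[ E(x, z) \right] \;-\; H\!\left[ q(z) \right],
\qquad E(x, z) = -\log p(x, z),
\qquad H[q] = -\,\mathbb{E}_{q(z)}\!\left[ \log q(z) \right]

% A Laplace approximation takes q(z) to be Gaussian around a mode \hat{z} with
% covariance \Sigma, so the entropy term collapses to \tfrac{1}{2}\log\det(2\pi e\,\Sigma)
% and the objective becomes tractable in practice.
```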
📺 Source: Machine Learning Street Talk · Published January 25, 2026
🏷️ Format: Interview
