Description:
Andy Masley, director of Effective Altruism Washington DC and independent researcher, joins Nathan Labenz on the Cognitive Revolution to systematically evaluate the most common fears around AI’s energy and water consumption. The central finding: AI’s resource footprint, while real, is far smaller than popular discourse suggests.
Key heuristics developed in the conversation: a single ChatGPT query uses roughly the same energy as running a microwave for one second; a cross-town car trip, a hamburger, or a hot shower each equal approximately 10,000 ChatGPT queries in energy use. At the infrastructure level, a 1-gigawatt data center powers roughly 1 million homes, and the full projected $7 trillion AI buildout, estimated at 80 gigawatts, would represent a 1–2% increase in global energy usage, less than ordinary economic growth over the same period. Water usage follows similar proportions, with added complexity depending on the cooling method a given data center uses.
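The heuristics above are back-of-envelope arithmetic, so they are easy to sanity-check. The sketch below redoes the math with assumed inputs (a ~0.3 Wh-per-query estimate, a ~1,100 W microwave, a ~3 kWh everyday activity, and a ~1.2 kW average US household draw); these figures are illustrative assumptions, not numbers quoted from the episode.

```python
# Back-of-envelope check of the episode's energy heuristics.
# All constants below are assumptions for illustration only.

WH_PER_QUERY = 0.3        # assumed energy per ChatGPT query, in Wh
MICROWAVE_W = 1100        # assumed microwave power draw, in watts

# Heuristic 1: one query vs. one second of microwave use.
microwave_second_wh = MICROWAVE_W / 3600          # ≈ 0.31 Wh
ratio = WH_PER_QUERY / microwave_second_wh        # ≈ 1, as claimed

# Heuristic 2: a ~3 kWh activity (hot shower, short car trip) in queries.
ACTIVITY_KWH = 3.0
queries_per_activity = ACTIVITY_KWH * 1000 / WH_PER_QUERY   # = 10,000

# Heuristic 3: a 1 GW data center vs. average US homes (~1.2 kW each).
HOME_AVG_KW = 1.2
homes_per_gw = 1_000_000 / HOME_AVG_KW            # ≈ 830,000, "roughly a million"

print(f"query / microwave-second: {ratio:.2f}")
print(f"queries per {ACTIVITY_KWH} kWh activity: {queries_per_activity:,.0f}")
print(f"homes per gigawatt: {homes_per_gw:,.0f}")
```

The point of the exercise is that each headline comparison falls out of two or three multiplications, so listeners can swap in their own per-query or per-home estimates and see how sensitive the conclusions are.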
Masley and Labenz acknowledge that local community impacts from large data centers can be genuinely significant even when global-scale impacts are modest, and recommend that local leaders negotiate carefully with data center operators. Both speakers note that rising AI usage and extended reasoning chains push resource intensity upward while efficiency gains push it down, making current estimates uncertain in absolute terms but directionally well-grounded.
📺 Source: Cognitive Revolution “How AI Changes Everything” · Published December 14, 2025
🏷️ Format: Podcast
