Elon Musk – “In 36 months, the cheapest place to put AI will be space”

Description:

In a wide-ranging interview with Dwarkesh Patel published in February 2026, Elon Musk makes the striking claim that space will become the cheapest location to run AI infrastructure within 30 to 36 months. His argument rests on the stagnation of terrestrial electricity supply outside China — flat or barely growing — contrasted with exponentially increasing chip output. In space, solar panels deliver roughly five times the power output they achieve on Earth, with no batteries needed for overnight operation, no atmospheric losses, and no permitting bottlenecks. He expects the economics to eventually become incomparably better in orbit.
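The "roughly five times" figure can be sanity-checked with a back-of-envelope comparison of time-averaged power per unit panel area. The capacity factor and orbit assumptions below are illustrative choices for this sketch, not figures from the interview:

```python
# Back-of-envelope check of the ~5x orbital solar claim.
# All parameter values are illustrative assumptions.

SOLAR_CONSTANT = 1361.0    # W/m^2 above the atmosphere (AM0)
PEAK_GROUND = 1000.0       # W/m^2 typical terrestrial peak irradiance
CAPACITY_FACTOR = 0.22     # assumed average for a good terrestrial solar site
SUNLIGHT_FRACTION = 0.99   # assumed for a dawn-dusk sun-synchronous orbit

# Time-averaged incident power per square meter of panel.
avg_ground = PEAK_GROUND * CAPACITY_FACTOR
avg_orbit = SOLAR_CONSTANT * SUNLIGHT_FRACTION

ratio = avg_orbit / avg_ground
print(f"orbital / terrestrial average power per panel area: {ratio:.1f}x")
```

With these assumptions the ratio lands in the 5–7x range, consistent with the claim once overnight storage and atmospheric losses are folded in. Varying the terrestrial capacity factor between roughly 0.15 and 0.25 moves the answer but keeps it well above 1.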

Musk addresses the GPU reliability concern directly, arguing that infant mortality can be screened out on the ground before launch, and that modern accelerators are sufficiently reliable past initial burn-in to operate unserviced. He also briefly positions Tesla AI6 chips alongside Nvidia, Google TPUs, and Amazon Trainium as plausible orbital compute options.

On the competitive landscape, Musk cites OpenAI at roughly $20 billion in annual revenue and Anthropic at around $10 billion, framing xAI’s current ~$1 billion figure as an early-stage baseline. He argues that once a true “digital human” emulator exists, the total addressable market scales to trillions — because every major high-value company today (Nvidia, Apple, Microsoft, Meta, Google) is fundamentally a digital-output business. The interview also touches on Tesla’s Full Self-Driving progress and why Musk believes the “Tesla path” of data plus algorithms is the correct approach to autonomous driving.


📺 Source: Dwarkesh Patel · Published February 05, 2026
🏷️ Format: Interview