Elon Musk – "In 36 months, the cheapest place to put AI will be space"
Elon Musk tells Dwarkesh Patel that space will be the cheapest place for AI compute within 36 months and outlines plans for orbital data centers, a TeraFab chip foundry, and 10,000 Starship launches per year.
- Musk predicts space beats Earth on AI-compute cost in ~30–36 months: orbital solar delivers ~5x the output of ground solar and needs no batteries, yielding a ~10x effective cost advantage.
- Within 5 years, SpaceX plans to launch more AI compute annually than the cumulative total ever deployed on Earth, implying ~10,000 Starship flights per year.
- Gas turbines are sold out through 2030, and only three casting companies worldwide make turbine blades and vanes — so electricity is the 1-year AI scaling bottleneck and chips the 3–4-year bottleneck.
- TeraFab targets millions of wafers per month by 2030 covering logic, memory, and packaging; Musk says memory is a harder scaling problem than logic.
- At cryogenic temperatures stainless steel matches carbon fiber's strength-to-weight ratio, but costs ~1/50th as much and welds easily — Musk says Starship should have used steel from day one.
- Powering 330,000 GB300s requires ~1 GW at the generation level once networking, cooling peaks, and service margin are included — roughly 3x the naive chip-wattage estimate.
- DOGE finding: making appropriation codes mandatory (previously optional) on Treasury's payment system, which disburses roughly $5 trillion a year, is projected to save $100–200B annually.
- The biggest remaining Starship problem is a reusable orbital heat shield, which no one has ever achieved; current ships lose too many tiles on reentry to be truly reusable.
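The ~1 GW figure for 330,000 GB300s can be sanity-checked with back-of-envelope arithmetic. The per-chip wattage below is an illustrative assumption (roughly 1 kW per GB300-class accelerator), not a number from the interview; the ~3x overhead multiplier for networking, cooling peaks, and service margin is the one Musk cites.

```python
# Back-of-envelope check of the ~1 GW claim for 330,000 GB300-class GPUs.
CHIPS = 330_000
CHIP_WATTS = 1_000   # assumed ~1 kW per accelerator (illustrative, not from the interview)
OVERHEAD = 3.0       # networking + cooling peaks + service margin, per the ~3x figure

naive_mw = CHIPS * CHIP_WATTS / 1e6   # chip wattage alone, in MW
total_gw = naive_mw * OVERHEAD / 1e3  # at the generation level, in GW

print(f"naive: {naive_mw:.0f} MW, with overhead: {total_gw:.2f} GW")
# → naive: 330 MW, with overhead: 0.99 GW
```

Under these assumptions the naive chip-only estimate is ~330 MW, and the 3x overhead factor brings it to just under 1 GW, consistent with the claim above.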
2026-02-05 · Watch on YouTube