Jonathan Ross: DeepSeek Special - How Should OpenAI and the US Government Respond | E1253
Watch on YouTube ↗

Summary based on the YouTube transcript and episode description.
Groq CEO Jonathan Ross argues DeepSeek’s $6M training figure is marketing spin, recommends OpenAI open-source its models, and calls Nvidia a screaming buy post-panic.
- DeepSeek’s $6M training figure is marketing — they spent far more distilling OpenAI’s model outputs to generate higher-quality training data.
- Ross recommends OpenAI open-source its models immediately; brand is their real moat right now, not proprietary weights.
- Groq hosts DeepSeek but runs zero persistent storage — RAM only — so user queries cannot be retained or forwarded to the CCP.
- $500B Stargate is not ridiculous; inference will grow to ~95% of AI compute spend as training becomes a niche high-margin business.
- Jevons Paradox: compute costs drop 1000x per decade and demand grows 100,000x — Ross called Nvidia a screaming buy after its 16% DeepSeek-driven drop.
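The Jevons Paradox figures above imply a simple back-of-envelope result worth making explicit: if per-unit compute cost falls 1000x per decade while demand grows 100,000x, total spend still rises about 100x. A minimal sketch of that arithmetic, using only the figures Ross cites:

```python
# Back-of-envelope Jevons Paradox arithmetic using Ross's cited figures.
cost_drop_per_decade = 1_000        # cost per unit of compute falls ~1000x
demand_growth_per_decade = 100_000  # demand for compute grows ~100,000x

# Total spend = demand * cost-per-unit, so the spend multiplier is the ratio.
spend_multiplier = demand_growth_per_decade / cost_drop_per_decade
print(spend_multiplier)  # 100.0 -> total compute spend still grows ~100x
```

Cheaper compute does not shrink the market; it expands it faster than prices fall, which is the core of the bull case for Nvidia after the drop.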
- DeepSeek’s MoE architecture is genuinely innovative — 671B sparse parameters with low active compute — not mere copying of Western models.
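The point about sparse Mixture-of-Experts can be made concrete with a toy calculation. This is an illustrative sketch, not DeepSeek's actual configuration: the expert count, top-k routing value, and per-expert parameter size below are assumptions chosen only to show why total parameters and active compute diverge.

```python
# Toy illustration (assumed numbers, NOT DeepSeek's real config) of why a
# sparse MoE model can have huge total parameters but low active compute:
# a router picks only a few experts per token, so most weights sit idle.
total_experts = 256        # assumed number of routed experts
active_experts = 8         # assumed top-k experts activated per token
params_per_expert_b = 2.5  # assumed billions of parameters per expert

total_params_b = total_experts * params_per_expert_b    # 640.0 (billions)
active_params_b = active_experts * params_per_expert_b  # 20.0 (billions)
active_fraction = active_params_b / total_params_b
print(f"active fraction per token: {active_fraction:.1%}")  # 3.1%
```

With top-8 routing over 256 experts, only about 3% of expert parameters do work on any given token, which is how a ~671B-parameter model can serve inference at a small fraction of the dense-model cost.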
- Biggest fear: AI-automated zero-day exploit scanning by nation states is already happening, deniable, and outpaces human defenders.
- DeepSeek restricted new signups to Chinese phone numbers because they ran out of inference compute, not for geopolitical reasons.
2025-01-29