Will Open Source AI Overtake Closed Models? Ft. Ollama, Fireworks and OpenRouter

· ai · Source ↗

Watch on YouTube ↗

Summary based on the YouTube transcript and episode description.

Jeff Morgan (Ollama), Dmytro Dzhulgakov (Fireworks), and Alex Atallah (OpenRouter) debate whether open-source inference will reach parity with closed models in five years.

  • Open-source models currently account for roughly 20–30% of inference tokens, versus 70–80% for closed-source, per Alex Atallah’s estimate.
  • DeepSeek’s adoption was accelerated because Fireworks and others had to self-host it — at launch, DeepSeek’s own servers crashed and it blocked new payments.
  • DeepSeek was the first strong open-source reasoning model with visible chain-of-thought, which closed models like o1 did not expose.
  • Fine-tuning may weaken as a differentiator as powerful RL-trained foundation models reduce the marginal benefit of customization.
  • Enterprise demand for full model ownership — including fine-tuned weights — is a structural driver of open-source adoption.
  • All three panelists predicted roughly 50/50 open vs. closed inference share in five years, with open-source fragmenting across model families rather than one dominant model.
  • Atallah argued decentralized inference providers are the wildcard: without them, closed source likely stays above 50%. One provider he cited earns $360k/day in incentives, though its sustainability is unproven.

2025-05-12