Reflection AI: The Race to Unlock Superintelligence
https://sequoiacap.com/article/reflection-ai-spotlight/
Ex-DeepMind duo founded Reflection AI to build superintelligent coding agents.
- Ioannis: AlphaGo, AlphaZero, MuZero, led Gemini RLHF.
- Misha: Berkeley postdoc, co-authored Decision Transformer paper.
Move 37 (AlphaGo 2016) was the founding insight: RL beats human intuition.
- 1-in-10,000 probability move; convinced Misha to abandon quantum physics.
ChatGPT (Nov 2022) confirmed RLHF is the unlock for capable LLMs.
- Google’s “code red” merged Brain + DeepMind; both founders led Gemini 1.0/1.5.
LLM scaling laws asymptote; RL post-training curve is just beginning.
- Self-generated RL data sidesteps the pretraining data ceiling entirely.
Coding is the beachhead: machine-friendly, verifiable, infrastructure-ready.
- “Superintelligence” defined pragmatically: creates value doing work on computers.
- Waymo-style: bounded domain, strong reliability guarantees, then expand.
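What makes coding "verifiable" in the RL sense is that a candidate solution can simply be executed against tests, yielding a clean, machine-checkable reward with no human judge in the loop. A minimal sketch of such a verifier (the `solve` convention and toy in-process `exec` sandbox are illustrative assumptions, not Reflection AI's actual setup):

```python
def verify(candidate_src: str, tests: list[tuple]) -> float:
    """Score a candidate program against unit tests.

    Returns 1.0 only if every test passes -- a binary, automatically
    checkable reward, which is what makes code a friendly RL domain.
    Assumes the candidate defines a function named `solve` (hypothetical
    convention). Real systems would run this in an isolated sandbox.
    """
    namespace: dict = {}
    try:
        exec(candidate_src, namespace)  # toy in-process execution
    except Exception:
        return 0.0
    fn = namespace.get("solve")
    if fn is None:
        return 0.0
    for args, expected in tests:
        try:
            if fn(*args) != expected:
                return 0.0
        except Exception:
            return 0.0
    return 1.0

tests = [((1, 2), 3), ((0, 0), 0)]
good = verify("def solve(a, b):\n    return a + b", tests)  # passes all tests
bad = verify("def solve(a, b):\n    return a - b", tests)   # fails a test
```

Contrast this with domains like law or medicine, where grading an agent's output requires expensive human evaluation rather than a test harness.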
Superintelligent software developer = sufficient proof of AGI completeness.
- Recipe scales to other domains once cracked in code.
Agents amplify successes, reduce failures — simple mechanism, massive scale.
- First superintelligent systems will be “jagged”: superhuman in some, limited in others.
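The "amplify successes, reduce failures" mechanism can be illustrated with a toy multiplicative-weights bandit: attempts that a verifier confirms get up-weighted, failures get dampened, and the policy concentrates on what works. This is a deliberately simplified sketch of the general reinforcement idea, not the article's or Reflection AI's actual algorithm:

```python
import random

def sample(weights: list[float]) -> int:
    """Toy stochastic policy: pick an action proportionally to its weight."""
    r = random.uniform(0, sum(weights))
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r <= acc:
            return i
    return len(weights) - 1

def reinforce(weights, verifier, steps=200, up=1.5, down=0.9):
    """Amplify verified successes, suppress failures (hypothetical toy loop)."""
    for _ in range(steps):
        action = sample(weights)
        if verifier(action):
            weights[action] *= up    # success amplified
        else:
            weights[action] *= down  # failure dampened
    return weights

random.seed(0)
w = reinforce([1.0, 1.0, 1.0], verifier=lambda a: a == 2)
best = max(range(3), key=lambda i: w[i])  # policy concentrates on the verified action
```

The mechanism is simple; the article's claim is that the same loop, run at massive scale with a real verifier (test suites, compilers), is what drives agent capability.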
X discourse
- @LiamFedus: “Next frontier: AI systems operating well under uncertainty, like RL against verifiable” (798 likes)
- @So8res: “Someone building superintelligence with substantial world-ending risk — ‘Eh wha’” (266 likes)
- @ahall_research: “Need explicit agenda for ‘political superintelligence’ to remake institutions as AI scales.” (264 likes)
- @olliezliu: “Open-weight superintelligence presents new safety constraints and technical challenges.” (145 likes)
- @Raullen: “AI’s superpower: relentless trial and error. Real gains in atoms, not bits.” (134 likes)
Anthony Wing Kosner, Editorial Director, Sequoia Capital — profiling Ioannis Antonoglou (ex-DeepMind, AlphaGo/Gemini) and Misha Laskin (ex-DeepMind, Berkeley postdoc, Decision Transformer co-author) · 2025-04-03 · Read on sequoiacap.com
- Type: Link
- Added: Apr 3, 2025
- Modified: Apr 17, 2026