A Robust Safeguard for Generative AI
https://sequoiacap.com/article/robust-intelligence-spotlight/
Harvard professor Yaron Singer founded Robust Intelligence after identifying a trust gap in AI.
- ML algorithms treat inaccurate model outputs as ground truth, creating systemic risk.
- Cofounder Kojin Oshiba: former Singer undergrad, Tokyo-raised, third-culture kid.
JP Morgan rejected the pitch in 2019; Sequoia’s Bill Coughran was the first believer.
- First customer Expedia closed off a Figma mockup — no working product yet.
Jan 2023 pivot: LLM firewall prototype built in six weeks after market shift.
- The entire company retooled to secure LLMs within six months.
- Demonstrated that ChatGPT leaks full copyrighted articles, findings that supported the NYT v. OpenAI suit.
Cisco acquired the company for $400M (summer 2024); BMW’s AI car was among the first new clients.
- Thesis: every AI app running on the internet can be secured at the network layer.
X discourse
- @sleepinyourhat: “We trust the model enough to use it heavily, but in the handful of cases where it misbehaves in significant ways, it’s d…” (275 likes)
- @dabit3: “Not long ago most teams wouldn’t trust AI-generated code in production. Soon, it will be a liability to trust anything O…” (199 likes)
- @IOHK_Charles: “Tune models for use cases and put in guardrails. It would vastly outperform humans, but they want headlines instead of p…” (353 likes)
- @repligate: “Intense positive reactions to this paper, but it’s underwhelming, conservative. Raise standards for more.” (450 likes)
- @libshipwreck: “Generative AI driving a breakdown in trust. Harder to trust things you see and people around you. Puts everyone on edge.” (8827 likes)
- @johnsemley3000: “Polling shows overwhelming public distaste for generative AI. People simply do not like this stuff.” (8028 likes)
Lacy Warner (documentary director, nonfiction writer) · Sequoia Capital spotlight · 2025-01-21
| Field | Value |
| --- | --- |
| Type | Link |
| Added | Jan 21, 2025 |
| Modified | Apr 17, 2026 |