American AI startup Poolside launches free, high-performing open model Laguna XS.2 for local agentic coding

· ai ai-agents coding

TLDR

  • Poolside releases two MoE models for agentic coding: open-weight Laguna XS.2 (33B) and proprietary Laguna M.1 (225B).

Key Facts

  • Laguna XS.2 is a 33B-parameter MoE with 3B active parameters, Apache 2.0 licensed, runnable on a single GPU with 24-32GB VRAM.
  • Laguna M.1 is a proprietary 225B-parameter MoE with 23B active parameters, targeted at enterprise and government environments.
  • Both models were trained from scratch on 30 trillion tokens; on SWE-bench Pro, XS.2 scores 44.5% and M.1 scores 46.9%.
  • M.1 is temporarily available free via API through OpenRouter, Ollama, and Baseten; XS.2's weights are on Hugging Face.

Why It Matters

  • Despite having only 3B active parameters, XS.2 outperforms Claude Haiku 4.5 and Gemma 4 31B on SWE-bench Pro.
  • Poolside also released two developer tools: pool, a terminal-based coding agent, and shimmer, a cloud-based, mobile-friendly coding environment.

Carl Franzen, VentureBeat · 2026-04-28