OpenWarp

TLDR

  • Community fork of Warp adding BYOP (bring-your-own-provider) support: custom Base URL, API Key, model, and minijinja system prompt templates, with credentials stored locally.

Key Takeaways

  • Supports any OpenAI Chat Completions-compatible endpoint: OpenAI, Anthropic gateways, DeepSeek, Qwen, Groq, Ollama, LM Studio.
  • System prompts use minijinja templates with context variables like cwd, locale, and user.role for dynamic rendering.
  • Credentials are stored locally in ~/.config/openwarp.toml and never leave the device; no telemetry, no cloud upload.
  • Merges Warp upstream continuously, preserving blocks, workflows, AI commands, keymaps, and themes.
  • Early in development with no formal release yet; dual-licensed AGPL/MIT, matching upstream Warp.
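
The takeaways above can be sketched as a config file. The project does not publish a schema, so the key names below are illustrative guesses; only the file path, the general shape (Base URL, API key, model, minijinja system prompt), and the context variables cwd, locale, and user.role come from the summary:

```toml
# Hypothetical ~/.config/openwarp.toml — key names are assumptions,
# not confirmed by the project.
[provider]
base_url = "http://localhost:11434/v1"  # any Chat Completions-compatible endpoint, e.g. Ollama
api_key  = "sk-placeholder"             # stored locally, never uploaded
model    = "qwen2.5-coder"

[prompt]
# minijinja template; cwd, locale, and user.role are among the
# context variables the summary mentions.
system = """
You are a terminal assistant working in {{ cwd }}.
Respond in the user's locale ({{ locale }}).
{% if user.role == "admin" %}Administrative commands are permitted.{% endif %}
"""
```

Because the endpoint only needs to speak the OpenAI Chat Completions protocol, pointing base_url at a local server (Ollama, LM Studio) or a hosted gateway should be interchangeable.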

Hacker News Comment Review

  • Warp founder confirmed BYOM (bring-your-own-model) is coming natively to Warp, making the fork’s primary rationale potentially short-lived.
  • A critical warning emerged: at least one user found that OpenWarp still requires a $20/month Warp account to use a custom provider, contradicting the project's own pitch and going undisclosed in the README.
  • Broader skepticism centers on three points: the fork arriving only 24 hours after Warp was open-sourced, reuse of the Warp name as a likely trademark issue, and the absence of any actual community behind it yet.

Notable Comments

  • @mark_l_watson: Installed from source; custom provider still gated behind a $20/month signup, not disclosed in README.
  • @zachlloyd: Warp founder confirms native BYOM is planned; links to GitHub discussion for input.
  • @SwellJoe: “how can there be a ‘community fork’ when there is no community?” flags trademark and timing issues.
