The Prompt API

TLDR

  • Chrome’s Prompt API brings on-device LLM inference to the browser, letting web pages run prompts locally without external API calls.

Key Takeaways

  • Part of Chrome’s “AI on Chrome” initiative, exposing a built-in model to JavaScript running in the browser.
  • Prompts run against a local model on the user’s device, keeping data off external servers by default.
  • No backend infrastructure or API keys required from the developer side; the browser handles model execution.
  • Targets a broad surface: any web page or extension can call the Prompt API if the user has Chrome with the model available.
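The calling pattern a page would use can be sketched as follows. This assumes the experimental `LanguageModel` global described in Chrome's documentation; the API is behind flags/origin trials and its surface has changed across releases, so treat the names here as illustrative rather than stable.

```javascript
// Sketch of an on-device prompt, assuming Chrome's experimental
// `LanguageModel` global. No API key or backend is involved; the
// browser runs the model locally.
async function summarize(text) {
  // Feature-detect: only Chrome builds with the model expose this global.
  if (typeof LanguageModel === 'undefined') {
    return null; // fall back to a server-side path or hide the feature
  }
  // Create a session against the built-in model.
  const session = await LanguageModel.create();
  // The prompt and response stay on the user's device.
  const result = await session.prompt(`Summarize in one sentence: ${text}`);
  session.destroy(); // free on-device resources
  return result;
}
```

Because the global is absent everywhere except supporting Chrome builds, the feature-detection branch is what keeps such code safe to ship to all browsers.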

Hacker News Comment Review

  • The dominant concern is on-device model size: users face a ~22 GB disk space requirement before any AI-on-Chrome feature works, which is a hard gate for many machines.
  • This storage barrier shifts the friction from API keys and latency to local hardware constraints, making the “no backend needed” pitch conditional on users having modern, high-capacity devices.
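One way a page can cope with that gate is to probe model availability before surfacing any AI feature. A sketch, assuming the `availability()` method and state strings from Chrome's experimental documentation (both may change between releases):

```javascript
// Sketch of gating UI on model availability, assuming Chrome's
// experimental `LanguageModel.availability()`. The reported states
// in Chrome's docs include 'unavailable', 'downloadable',
// 'downloading', and 'available'; the download itself is large
// (the ~22 GB free-space requirement discussed above).
async function promptApiState() {
  if (typeof LanguageModel === 'undefined') return 'unsupported';
  return await LanguageModel.availability();
}

// Usage: only enable the AI feature when the model is ready, and
// offer an explicit opt-in (with a size warning) otherwise.
promptApiState().then((state) => {
  console.log(state === 'available' ? 'AI ready' : `AI not ready: ${state}`);
});
```

Probing first lets a site degrade gracefully for users who cannot, or will not, spend the disk space, rather than hard-failing as the quoted comment fears.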

Notable Comments

  • @fg137: highlights the user-hostile implication of the storage gate: “sorry, to use our website, you must have at least 22 GB of free disk space.”
