Zed shifts AI billing from prompts to tokens, adds newer models and lowers Pro price
Zed is moving its AI billing from a prompt-count model to token-based pricing, with the change live for new users and a migration path for existing accounts. The move pairs a simplified price structure with additional hosted model options and new token credit allowances on the Pro and trial tiers.
What changed
- Billing model: Prompt-based limits replaced with token-based pricing. Additional usage is billed at provider list price + 10% for Zed-hosted models.
- Pro price: The Pro plan drops from $20/month to $10/month and now includes $5 of token credits.
- New hosted models: Zed Pro adds GPT-5 (including mini/nano variants), Grok 4, and Gemini 2.5 Pro/Flash alongside the existing Claude Sonnet and Claude Opus.
- Free tier: The free plan continues to include 2,000 accepted edit predictions. The 50 Zed-hosted prompts per month it previously included go away, since prompts no longer exist as a unit under the new model.
- Pro trial: The 14-day Pro trial now includes $20 of token credits.
Zed frames the change as an alignment of pricing with the actual cost of AI inference and infrastructure. The token model allows variable context sizes and clearer cost attribution for small fixes versus large multi-file operations.
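Priced this way, a request's cost follows directly from its token counts: the provider's list price for the tokens consumed, plus the 10% markup, with the Pro plan's $5 monthly credit absorbed first. A minimal sketch of that arithmetic in Python; the per-million-token list prices below are hypothetical placeholders, not Zed's published rates:

```python
# Sketch of how token-based billing composes: usage is priced at the
# provider's list price plus a 10% markup, and the Pro plan's $5 monthly
# credit is applied before anything is billed. Rates here are assumptions.

MARKUP = 0.10          # Zed-hosted models: provider list price + 10%
PRO_CREDIT_USD = 5.00  # monthly token credit included with Pro

def request_cost(input_tokens: int, output_tokens: int,
                 input_price_per_mtok: float, output_price_per_mtok: float) -> float:
    """Cost of one request in USD at list price + 10% markup."""
    list_price = (input_tokens / 1_000_000) * input_price_per_mtok \
               + (output_tokens / 1_000_000) * output_price_per_mtok
    return list_price * (1 + MARKUP)

# Hypothetical model priced at $3 / $15 per million input / output tokens.
small_fix = request_cost(8_000, 1_000, 3.0, 15.0)         # small single-file edit
large_refactor = request_cost(150_000, 12_000, 3.0, 15.0) # large multi-file operation

month_total = 40 * small_fix + 5 * large_refactor
billed = max(0.0, month_total - PRO_CREDIT_USD)  # credit absorbs the first $5
print(f"small fix: ${small_fix:.4f}, large refactor: ${large_refactor:.4f}")
print(f"month total: ${month_total:.2f}, billed after credit: ${billed:.2f}")
```

The point of the example is the shape of the cost curve: a small fix with a tight context costs fractions of a cent, while a large multi-file operation carrying hundreds of thousands of tokens costs orders of magnitude more, which is exactly the attribution the token model is meant to expose.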
Pricing snapshot
- Free: 2,000 accepted edit predictions (no hosted prompts included)
- Pro: $10/month, unlimited accepted edit predictions, $5 of token credits included, additional usage at list price + 10%, and access to the expanded hosted model lineup
- Pro trial: 14 days, unlimited accepted edit predictions, $20 token credit included
Migration timeline
- Pro customers have until December 17, 2025 to migrate to the new pricing structure; accounts that cancel will move to the new Free plan when the subscription ends. For earlier migration assistance, contact [email protected].
- Free users will be transitioned to the new Free plan on October 15, 2025 and will receive a fresh 14-day Pro trial with $20 in token credits, which can be started at any time.
- Trial users are being moved back to the old Free plan today, September 24th, and can follow the same migration path as Free users.
Alternatives and flexibility
Zed emphasizes several non-hosted ways to use AI to control cost or privacy:
- Bring-your-own API keys for providers like OpenAI, Anthropic, and xAI (Grok): https://zed.dev/docs/ai/llm-providers
- Local models, for example via Ollama (see the sketch after this list): https://zed.dev/docs/ai/llm-providers#ollama
- Third-party agents through ACP (Gemini CLI, Claude Code, and more planned): https://zed.dev/docs/ai/external-agents
- Payment hubs and integrations such as GitHub Copilot, OpenRouter, and AWS Bedrock: https://zed.dev/docs/ai/llm-providers#github-copilot-chat, https://zed.dev/docs/ai/llm-providers#openrouter, https://zed.dev/docs/ai/llm-providers#amazon-bedrock
- An option to disable AI features entirely: https://zed.dev/blog/disable-ai-features
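Of these, the local-model route is the most self-contained: Zed's Ollama integration talks to the HTTP API that Ollama serves locally, at http://localhost:11434 by default. A short Python sketch for confirming that server is reachable before pointing the editor at it; the model name ("llama3.2") and URL are assumptions to adjust for your setup:

```python
# Quick check that a local Ollama server is running and has a model pulled,
# using Ollama's documented REST API. Adjust OLLAMA_URL and the model name
# to match your installation; nothing here is Zed-specific.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"

def list_local_models() -> list[str]:
    """Return the names of models available on the local Ollama server."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

def generate(model: str, prompt: str) -> str:
    """Send a single non-streaming completion request."""
    body = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(f"{OLLAMA_URL}/api/generate", data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    print("local models:", list_local_models())
    print(generate("llama3.2", "Say hello in one short sentence."))
```

If the tags request succeeds and the completion returns text, the same endpoint can be used from the editor; usage against a local model incurs no hosted token charges at all.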
Context for developers
The main practical impacts are clearer unit pricing tied to actual token consumption and the inclusion of newer, typically higher-capability models under Zed’s hosted offering. The 10% markup is intended to cover Zed’s infrastructure, support, and enhanced rate limits for Pro users compared with BYOK usage. The token model also removes arbitrary prompt caps, letting workflows scale based on chosen context and cost tolerance.
Zed remains available to download for macOS and Linux: https://zed.dev/download. Hiring information: https://zed.dev/jobs.
Original source: https://zed.dev/blog/pricing-change-llm-usage-is-now-token-based