OpenAI Codex now supports VS Code integration and plan-based usage limits

OpenAI’s Codex now integrates with VS Code, supports cloud delegation, and aligns with ChatGPT plans, with enterprise controls, data options, and documented limits for local and cloud tasks.

TL;DR

  • VS Code panel for chat, edits, and previews with context from opened files or selected code; panel placement is configurable.
  • macOS integration: ChatGPT app can connect to VS Code; enabling “Work with VS Code” allows in-editor questions or simple edits.
  • IDE coverage beyond VS Code: Codex CLI runs in an IDE terminal; VS Code extension works with most VS Code forks.
  • Cloud delegation with continuity: Larger tasks run in the cloud with progress tracking; results can be opened locally, and Codex preserves context between cloud and local execution.
  • Availability and sign-in: Works with ChatGPT Plus, Pro, Team, Edu, and Enterprise; Enterprise/Edu CLI/IDE sign-in is still rolling out and not available yet.
  • Entry points: Codex CLI, Codex IDE extension, and Codex web (requires connecting ChatGPT to a GitHub account); documentation at the listed links.
  • Usage limits (by plan): Local tasks: 30–150 messages/5 hours (Plus/Team/Enterprise/Edu); 300–1,500 messages/5 hours (Pro); cloud task limits described as “generous limits for a limited time.”
  • Compliance and controls: RBAC supported; Compliance API covers web/cloud usage (local usage not exposed); Data Retention & Residency compliant.
  • Data use and privacy: Team/Enterprise/Edu defaults do not use inputs/outputs to improve models; Pro/Plus may be used to improve models unless disabled.
  • Models and customization: Default model selected automatically; FAQ recommends GPT-5 with medium or high reasoning; model can be customized in the app.
  • Documentation and guidance: Enterprise setup and general Codex docs available at the provided links.

Overview

OpenAI describes Codex as a coding agent that works alongside an IDE and can also take on larger workloads in the cloud. The product spans local workflows inside VS Code and remote execution for bigger jobs, while maintaining continuity between the two.

IDE integration and context

Codex can be added as a panel in VS Code for chatting, editing, and previewing changes. Because it draws context from open files and selected code, prompts can be shorter and outputs more relevant to the current selection. The panel's placement within the editor is configurable.

On macOS, the OpenAI Codex extension supports connecting the ChatGPT macOS app to VS Code. When enabled via “Work with VS Code” in the ChatGPT app, ChatGPT can answer questions or make simple edits inside the editor.

For IDE coverage beyond VS Code, the Codex CLI can run in an IDE’s terminal, and the VS Code extension is compatible with most VS Code forks.

Cloud delegation and continuity

Larger tasks can be delegated to Codex in the cloud. Progress can be tracked and results reviewed without leaving the IDE. For finishing touches, cloud tasks can be opened locally, and Codex keeps the context consistent between cloud and local execution.

Access, clients, and setup

Codex works with ChatGPT Plus, Pro, Team, Edu, and Enterprise plans. Sign-in requires an active ChatGPT subscription, with upgrades managed in ChatGPT account settings. OpenAI notes that logging into the Codex CLI and Codex IDE with ChatGPT Enterprise or Edu accounts is still rolling out and is currently not available.

Codex is available through multiple entry points:

  • Codex CLI
  • Codex IDE extension
  • Codex web, which requires connecting a ChatGPT account to GitHub

Documentation and setup instructions are published at https://developers.openai.com/codex/ and https://developers.openai.com/codex/ide. Enterprise setup guidance is available at https://developers.openai.com/codex/enterprise.

For users who previously accessed the Codex CLI via API key, OpenAI's instructions are to update the package, then run "codex logout" followed by "codex" to switch to subscription-based access.
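
A minimal sketch of that sequence, assuming the CLI was installed globally via npm under the package name @openai/codex (an assumption; use whichever package manager you originally installed it with):

  # Update the Codex CLI package (npm shown here as an assumed install method)
  npm update -g @openai/codex
  # Clear the stored API-key credentials
  codex logout
  # Relaunch Codex and sign in with a ChatGPT account
  codex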

Plans and usage limits

OpenAI states that Codex usage limits vary by plan and by where tasks run (local vs. cloud). The number of messages per window also depends on the size and complexity of work.

For Plus, Team, Enterprise, and Edu plans:

  • Local tasks: Average users can send 30–150 messages every 5 hours, with a weekly limit.
  • Cloud tasks: Described as “generous limits for a limited time.”
  • The plan description labels these tiers as “best for developers looking to power a few focused coding sessions each week.”

For Pro:

  • Local tasks: Average users can send 300–1,500 messages every 5 hours, with a weekly limit.
  • Cloud tasks: Also listed as “generous limits for a limited time.”
  • The plan description states “best for developers looking to power their full workday across multiple projects.”

For Enterprise and Edu plans using flexible pricing, Codex usage draws down from a workspace’s shared credit pool. OpenAI notes that for a limited time, Codex credits are waived for flexible Enterprise and Edu pricing plans.

When a usage limit is reached, Codex access is paused until the usage window resets. To obtain more capacity, it is possible to use an API key to run additional local tasks, with API credits billed accordingly.
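
As a sketch, assuming the CLI reads an API key from the standard OPENAI_API_KEY environment variable (an assumption; check the Codex CLI documentation for how your version accepts an API key), running extra local tasks on API credits could look like this:

  # Placeholder key; usage below is billed against API credits, not the plan
  export OPENAI_API_KEY="sk-..."
  # Local tasks started now draw on API credits instead of the ChatGPT plan
  codex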

Enterprise controls and compliance

OpenAI publishes an Enterprise Admin Guide at https://developers.openai.com/codex/enterprise. RBAC is supported, allowing access to be granted to specific user roles; setup guidance is available at https://help.openai.com/en/articles/11750701-rbac.

For compliance and reporting, Codex usage on the web or when delegated to the cloud is available in the Compliance API at https://chatgpt.com/admin/api-reference#tag/Codex-Tasks. Usage in local environments is not available through this API. OpenAI states that Codex is compliant with Data Retention & Residency policies.

Data use and privacy

OpenAI's FAQ states the following:

  • For Team, Enterprise, and Edu plans, inputs and outputs are not used to improve models by default.
  • For Plus and Pro plans, inputs and outputs may be used to improve models unless this is disabled.

Models and customization

The Codex CLI and IDE extension automatically select a default model. The FAQ currently recommends GPT-5 with medium or high reasoning, and this choice can be customized in the app.
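
As an illustration, persisting that choice could look like the following, assuming the CLI reads a config file at ~/.codex/config.toml with model and model_reasoning_effort keys (the path and both key names are assumptions; consult the Codex documentation for the settings your version supports):

  # Append the FAQ-recommended model settings to an assumed Codex CLI config file
  mkdir -p ~/.codex
  printf 'model = "gpt-5"\nmodel_reasoning_effort = "high"\n' >> ~/.codex/config.toml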

Documentation

Further product and setup information is available at:

  • https://developers.openai.com/codex/
  • https://developers.openai.com/codex/ide
  • https://developers.openai.com/codex/enterprise
