OpenAI Adds Filesystem 'Skills' to ChatGPT and Codex CLI

Following the growing adoption of Claude Code Skills, OpenAI has added filesystem-based 'skills' to ChatGPT's Code Interpreter and the open-source Codex CLI, letting these tools discover folders described by Markdown files. PDFs are rendered as per-page PNGs and fed to vision-enabled models to preserve layout and images.


TL;DR

OpenAI adds “skills” support to ChatGPT and the Codex CLI

Elias Judin recently discovered that OpenAI has begun exposing a simple, filesystem-based skills mechanism across both ChatGPT’s Code Interpreter and the open-source Codex CLI. The implementation mirrors the approach introduced by Anthropic earlier in the year: a skill is essentially a folder with a Markdown descriptor and optional resources or scripts that an LLM-enabled tool can discover and use.
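As a concrete illustration, a minimal skill might be authored like this. The folder layout, the SKILL.md filename (borrowed from the Anthropic convention) and the script are all hypothetical, not taken from OpenAI's shipped skills:

```shell
# Sketch of a minimal skill folder; names are illustrative, not OpenAI's.
mkdir -p my-skill/scripts

# The Markdown descriptor tells the model what the skill does and how to use it.
cat > my-skill/SKILL.md <<'EOF'
# my-skill
Summarize CSV files by printing basic per-file statistics.

Usage: python scripts/summarize.py <file.csv>
EOF

# Optional supporting script the model can invoke.
cat > my-skill/scripts/summarize.py <<'EOF'
import csv, sys
with open(sys.argv[1]) as f:
    rows = list(csv.reader(f))
print(f"{len(rows) - 1} data rows, {len(rows[0])} columns")
EOF

find my-skill -type f
```

Everything the tool needs lives under one directory, which is what makes the format easy to zip up, version and share.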

Skills in ChatGPT’s Code Interpreter

ChatGPT’s Code Interpreter now exposes a /home/oai/skills folder that can be accessed via prompts such as “Create a zip file of /home/oai/skills”.

Simon Willison recently shared examples and a downloadable snapshot of the folder:

The skills found so far focus on common document tasks: spreadsheets, docx files and PDFs. For PDFs and other layout-rich documents, the chosen strategy is to render pages as per-page PNGs and feed those images to vision-enabled GPT models rather than relying solely on text extraction, preserving layout, figures and other visual context that plain text would lose.
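The render-then-inspect loop can be sketched as follows. This is an independent illustration, not OpenAI's implementation: it assumes poppler's pdftoppm CLI as one common way to rasterize PDF pages, and the step that sends each PNG to a vision model is left as a comment.

```python
import shutil
import subprocess
from pathlib import Path


def render_cmd(pdf: Path, out_dir: Path, dpi: int = 150) -> list[str]:
    """Build a pdftoppm invocation that writes one PNG per page.

    pdftoppm is poppler's rasterizer; it is an illustrative choice here,
    not necessarily what Code Interpreter uses internally.
    """
    return [
        "pdftoppm",
        "-png",                 # emit one PNG per page
        "-r", str(dpi),         # render resolution
        str(pdf),
        str(out_dir / "page"),  # produces page-1.png, page-2.png, ...
    ]


def render_pdf_pages(pdf: Path, out_dir: Path) -> list[Path]:
    """Rasterize a PDF and return the per-page PNGs in page order."""
    out_dir.mkdir(parents=True, exist_ok=True)
    subprocess.run(render_cmd(pdf, out_dir), check=True)
    return sorted(out_dir.glob("page-*.png"))


if __name__ == "__main__":
    if shutil.which("pdftoppm"):
        pages = render_pdf_pages(Path("briefing.pdf"), Path("pages"))
        # Each PNG would then be sent to a vision-enabled model, which can
        # spot layout problems (e.g. missing glyphs) and suggest fixes.
        print(f"rendered {len(pages)} pages")
    else:
        print("pdftoppm not installed; command would be:",
              " ".join(render_cmd(Path("briefing.pdf"), Path("pages"))))
```

The iterate-and-fix behaviour described below (render, look, switch fonts, render again) is just this loop run repeatedly with the model in the middle.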

Practical testing shows the system can author and iterate on substantial outputs. A PDF briefing on rimu mast and kākāpō breeding was produced via a prompt and took over ten minutes as the model rendered, inspected, and fixed issues such as missing macron support by switching fonts.

Skills in the Codex CLI

OpenAI’s Codex CLI recently merged experimental support for skills (PR: https://github.com/openai/codex/pull/7412) and documents the approach in docs/skills.md (https://github.com/openai/codex/blob/main/docs/skills.md). The Codex CLI treats folders in ~/.codex/skills as discoverable skills.

The Codex code that assembles the prompt driving the skills system can be inspected in the repository (notably codex-rs/core/src/skills/render.rs), and a more readable version of the prompt has been posted as a gist: https://gist.github.com/simonw/25f2c3a9e350274bc2b76a79bc8ae8b2.

An authored skill for generating Datasette plugins demonstrates the workflow.

To test locally, the Codex CLI must be run with --enable skills. An example run used codex --enable skills -m gpt-5.2 and produced a functioning Datasette plugin that serves an ASCII cowsay page.

What this means for developers

  • The skills model relies on a minimal, filesystem-centered spec that is easy to author and distribute.
  • The Codex CLI’s acceptance of local skill folders makes it straightforward to add project-specific capabilities that the assistant can invoke.

The approach is lightweight and interoperable — skills authored for one platform look very similar to those from another. There is a clear case for formalizing a small specification to improve discoverability and interoperability across tools; the Agentic AI Foundation has been suggested as a possible place to host that effort: https://aaif.io/.

Original source: https://simonwillison.net/2025/Dec/12/openai-skills/
