New Plugin Lets You Access GPT-5.5 Through Your Codex Subscription
Simon Willison just released **llm-openai-via-codex 0.1a0**, an open-source plugin that bridges his popular LLM command-line tool with OpenAI's models — using your existing Codex CLI credentials instead of requiring a separate API key.
The concept is simple but clever: if you're already paying for and authenticated with OpenAI's Codex CLI, this plugin "borrows" those credentials to let you call any available OpenAI model through the LLM tool. One command gets you GPT-5.5:
```
llm -m openai-codex/gpt-5.5 'Generate an SVG of a pelican riding a bicycle'
```
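For those already using the LLM tool, installation follows its standard plugin convention (`llm install <plugin-name>`). The commands below are a sketch based on that convention and the package name from the release; they require an authenticated Codex CLI on the machine:

```shell
# Install the plugin into the LLM CLI's environment
llm install llm-openai-via-codex

# List available models; the plugin's entries are prefixed openai-codex/
llm models | grep openai-codex
```

Because the plugin discovers models dynamically, the second command's output depends on what your Codex subscription can access.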
**Why it matters:**
- **Zero extra setup** — install the plugin, and your Codex auth handles the rest
- **Dynamic model discovery** — automatically lists all models your Codex subscription can access
- **Scriptable AI** — enables automation pipelines using GPT-5.5 from the terminal
- **Officially tolerated** — Willison notes the Codex team has acknowledged this approach is acceptable
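The credential-borrowing idea behind the first two bullets can be sketched in a few lines of Python. Note that the file path, JSON layout, and function names below are illustrative assumptions for explanation, not the plugin's actual implementation:

```python
import json
from pathlib import Path


def load_codex_token(auth_path: Path) -> str:
    """Read an access token from a Codex-style auth file.

    The path and JSON shape here are hypothetical stand-ins for
    wherever the Codex CLI stores its saved credentials.
    """
    data = json.loads(auth_path.read_text())
    return data["tokens"]["access_token"]


def auth_headers(token: str) -> dict:
    # Reuse the existing credential as a bearer token on API requests,
    # instead of requiring the user to provision a separate API key.
    return {"Authorization": f"Bearer {token}"}


if __name__ == "__main__":
    import tempfile

    # Simulate a credential file previously written by a CLI login flow.
    with tempfile.NamedTemporaryFile("w", suffix=".json", delete=False) as f:
        json.dump({"tokens": {"access_token": "sk-demo"}}, f)

    token = load_codex_token(Path(f.name))
    print(auth_headers(token))
```

The design point is that authentication state already on disk from one tool can be reused by another, which is exactly the friction reduction the plugin delivers.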
The plugin is Apache 2.0 licensed, written entirely in Python, and picked up 50 GitHub stars within days of release.
This is part of a broader trend: developers finding creative ways to reduce friction in accessing frontier models. Rather than managing multiple API keys and billing accounts, tools like this consolidate access through credentials users already have.
For anyone already in the LLM CLI ecosystem, this plugin instantly adds whatever OpenAI models your Codex subscription covers to your toolkit.
📄 Source: Simon Willison