# OpenAI Codex

Adapter for OpenAI's Codex CLI.
## Install

`amux install codex`

Minimum CLI version: 1.0.0. Supported on macOS, Linux, and Windows.
## Auth

- API key only — set `OPENAI_API_KEY` in your environment.
- Config file: `~/.codex/config.json`.
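The exact schema of `config.json` isn't documented here; a minimal sketch, assuming keys for the two settings this page mentions (the default model and the `approvalMode` value referenced under Notable flags):

```json
{
  "model": "o4-mini",
  "approvalMode": "yolo"
}
```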
## Minimal run

`amux run codex --prompt "Write a unit test for utils.ts"`
## Notable flags

- `--model <id>` — default `o4-mini`; `codex-mini-latest` is also available.
- `--full-auto` — emitted when `approvalMode: 'yolo'`.
- `--quiet <prompt>` — used by the adapter to stream prompt output.
## Session files

- Location: `~/.codex/sessions/*.jsonl`
- Parsed via the standard JSONL session reader.
- Resume/fork supported by the adapter layer.
## Plugins

Plugin support: yes. Codex has a plugin directory with a `@plugin-creator` skill.
### Plugin Management

- `amux plugin install codex <plugin>`
- `amux plugin list codex`
## MCP Servers

- `amux mcp install codex <mcp-server>`
- `amux mcp list codex`

Registry for MCP servers: https://modelcontextprotocol.io
## Capabilities

Thinking models (`o4-mini`) with low/medium/high effort levels, tool calling with parallel calls, JSON/structured output, and text streaming.
## Known limitations

- No image input/output, no file attachments.
- Only two bundled models; other Codex variants must be specified explicitly via `--model`.
- Project-level config is not supported (`supportsProjectConfig: false`); configuration is global.