Wraps any OpenAI API interface as a Responses API with MCP support so it works with Codex, adding any missing stateful features. Ollama and vLLM compatible.
Updated May 24, 2025 · Python
An MCP server that scales development into controllable, agentic, recursive flows and builds a feature from the bottom up.
This repository demonstrates how to run the OpenAI Codex CLI inside a Docker container with a local Windows folder mounted for development. It uses a .env file to securely provide your API key, and includes .gitignore and .dockerignore files to keep secrets and unnecessary files out of version control.
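The setup described above could be invoked along these lines; this is a hedged sketch only, since the repository's actual image name, mount path, and .env contents are not shown here. The image name `codex-cli` and the Windows path are hypothetical placeholders:

```shell
# Sketch of running a Codex CLI container with a mounted Windows folder.
# Assumes a .env file in the current directory containing OPENAI_API_KEY,
# and a locally built image tagged "codex-cli" (hypothetical name).
docker run -it --rm \
  --env-file .env \
  -v "C:\Users\you\project:/workspace" \
  -w /workspace \
  codex-cli
```

Passing the key via `--env-file` keeps it out of the image and out of shell history, which is the point of the .env approach the repository describes.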