Problem
Docs/README.md does not describe how to configure `config.toml` so that OpenAI's Codex CLI can use this MCP server.
Proposed solution
Add a description (or a link) to README.md so that Codex CLI users can use this MCP server from Codex CLI.
Additional context
Assume the server is already running locally (e.g. via VS Code or Docker), or that a streamable transport is used. To my knowledge, Codex CLI does not include a built-in MCP server runner.
Codex appears to support streamable MCP servers as an experimental feature.
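For illustration, a minimal sketch of what such an entry in `~/.codex/config.toml` might look like, based on Codex CLI's `[mcp_servers.*]` table for stdio-launched servers. The server name `example-server` and the launch command are placeholders, not taken from this repo's docs; the exact schema should be verified against the Codex CLI documentation:

```toml
# Hypothetical entry in ~/.codex/config.toml.
# "example-server" and the command/args below are placeholders --
# substitute the actual launch command for this MCP server.
[mcp_servers.example-server]
command = "npx"
args = ["-y", "example-mcp-server"]
# Environment variables the server needs, if any:
# env = { "API_KEY" = "your-key-here" }
```

The README addition could show a concrete entry like this, plus a note on the experimental streamable-HTTP option for servers that are already running.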