## Why use the LLM Proxy
- Let someone use Codex CLI or Claude Code through MeshAgent even if they do not already have their own OpenAI or Anthropic account set up for that CLI.
- Centralize provider access in MeshAgent instead of configuring raw provider credentials on each developer machine or CI job.
- Route Codex CLI, Claude Code, and similar tools through the same MeshAgent endpoint pattern.
- Keep token usage reporting inside MeshAgent while the CLI continues to work like a normal OpenAI- or Anthropic-compatible client.
## How it works
The LLM Proxy exposes stable MeshAgent endpoints for both OpenAI-compatible and Anthropic-compatible clients:

- OpenAI-compatible clients such as Codex CLI: `https://api.meshagent.com/openai/v1`
- Anthropic-compatible clients such as Claude Code: `https://api.meshagent.com/anthropic`
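As a quick smoke test, the OpenAI-compatible endpoint can be exercised with plain `curl`. This is a sketch: the model name and the `MESHAGENT_API_KEY` variable (holding the participant token JWT created below) are assumptions.

```shell
# Send a minimal chat completion request through the MeshAgent proxy
curl https://api.meshagent.com/openai/v1/chat/completions \
  -H "Authorization: Bearer $MESHAGENT_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o", "messages": [{"role": "user", "content": "Hello"}]}'
```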
## Setup: Shared first steps
These steps apply regardless of which CLI agent you are using.

### 1. Authenticate to MeshAgent and create an API key
If you haven’t set up MeshAgent yet, start with `meshagent setup`. Then create an API key. Copy the printed value; it is shown only once.
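A sketch of this step. `meshagent setup` is taken from the text above; the API-key subcommand name is an assumption, so check `meshagent --help` for the exact form:

```shell
# Authenticate the CLI with your MeshAgent account
meshagent setup

# Create an API key; the secret is printed once, so copy it now
# (the subcommand name here is a guess)
meshagent api-key create
```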
### 2. Create a token spec and generate a participant token JWT
The MeshAgent LLM Proxy requires a participant token (JWT) as the bearer credential. The API key from Step 1 is used to sign it. The JWT is scoped to a specific room, and it must include the `api.llm` grant or the proxy will reject requests.
Create a token spec file (it contains no secrets, so it is safe to commit):

`token-spec.yaml`
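The exact spec schema comes from the MeshAgent documentation; a sketch of the shape implied by the text above, with field names that are assumptions, might look like:

```yaml
# Hypothetical token spec: scoped to one room, with the api.llm grant
# the proxy requires. Field names are assumptions.
room: my-room
grants:
  - api.llm
```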
Replace `my-room` with your room name, then generate the JWT:
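A hypothetical invocation for this step, signing the spec with the API key from Step 1 (the subcommand name is an assumption; consult `meshagent --help` for the real one):

```shell
# Hypothetical subcommand: generate a participant token from the spec file
meshagent token generate --spec token-spec.yaml
```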
The command prints an `eyJ...` JWT. Store it somewhere secure; you will use it as the API key value in your CLI agent’s configuration.
Tokens do not expire by default. Regenerate at any time by re-running this command with the same spec file.
## Codex CLI
Codex CLI supports custom model providers in `~/.codex/config.toml`, including custom base URLs and API key environment variables, making it a natural fit for the MeshAgent LLM Proxy.
### Step 1: Install Codex
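Codex CLI is distributed as an npm package (this assumes you have Node.js installed):

```shell
npm install -g @openai/codex
```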
### Step 2: Store the JWT as an environment variable
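A sketch of this step; the variable name `MESHAGENT_API_KEY` is an assumption and must match the `env_key` you configure in Step 3:

```shell
# The JWT generated in the shared setup, used as the bearer credential
export MESHAGENT_API_KEY="<your-participant-token-jwt>"
```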
Add the JWT from the shared setup to your shell environment (for example in `~/.zshrc` or `~/.bashrc`).

### Step 3: Configure `~/.codex/config.toml`
Open the config file at `~/.codex/config.toml`:
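A sketch of the provider and profile entries, following Codex CLI's `model_providers` and `profiles` config sections (the `meshagent` names and the model choice are assumptions):

```toml
# Custom provider pointing Codex at the MeshAgent LLM Proxy
[model_providers.meshagent]
name = "MeshAgent"
base_url = "https://api.meshagent.com/openai/v1"
env_key = "MESHAGENT_API_KEY"

# Profile selecting that provider and a model
[profiles.meshagent]
model_provider = "meshagent"
model = "gpt-4o"
```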
| Key | Description |
|---|---|
| `base_url` | The MeshAgent OpenAI-compatible LLM proxy endpoint. |
| `env_key` | The env var name holding the JWT from the shared setup. |
| `model` | Any model available through the OpenAI API. |
### Step 4: Run Codex with the MeshAgent profile
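Codex can be started with a named profile via its `--profile` flag; the profile name `meshagent` is an assumption and must match whatever you defined in `~/.codex/config.toml`:

```shell
codex --profile meshagent
```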
### Step 5: Verify Codex is using MeshAgent
After Codex starts, check its status output: it should reference the `meshagent` profile you configured in `~/.codex/config.toml`.
## Claude Code
Claude Code is Anthropic’s CLI coding agent. It uses the Anthropic API and can be pointed at a custom base URL via environment variables.

### Step 1: Install Claude Code
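Claude Code is distributed as an npm package (this assumes you have Node.js installed):

```shell
npm install -g @anthropic-ai/claude-code
```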
### Step 2: Configure environment variables
Add the following to your shell profile, such as `~/.zshrc` or `~/.bashrc`:
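Claude Code reads its base URL and credential from environment variables; a sketch, with `ANTHROPIC_AUTH_TOKEN` carrying the participant token JWT from the shared setup:

```shell
# Point Claude Code at the MeshAgent Anthropic-compatible endpoint
export ANTHROPIC_BASE_URL="https://api.meshagent.com/anthropic"
# The participant token JWT from the shared setup
export ANTHROPIC_AUTH_TOKEN="<your-participant-token-jwt>"
```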
### Step 3: Run Claude Code
If you have previously logged into Claude with another account, run `claude /logout` before connecting through MeshAgent so cached credentials do not override this configuration.
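With the environment variables set, start the CLI as usual; it will route requests through the MeshAgent proxy:

```shell
claude
```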