meshagent llm proxy starts a local proxy server on your machine. It exposes temporary OpenAI-compatible and Anthropic-compatible localhost endpoints that forward requests to the hosted MeshAgent LLM Proxy, so usage is still routed through your MeshAgent project and user.
The local endpoints and keys only work while the command is running, and by default the terminal shows live usage as requests come through. Use it for local tools that are easiest to configure as if they were talking to OpenAI or Anthropic directly, or that can't send custom headers like Meshagent-Project-Id.
For Codex or Claude, see Use Codex and Claude with MeshAgent: meshagent setup can configure those tools directly against the hosted MeshAgent endpoints.
Start the local proxy
Run meshagent setup once to authenticate with MeshAgent and select a project, then start the local proxy:
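For example:

```shell
meshagent setup      # one-time: authenticate with MeshAgent and select a project
meshagent llm proxy  # start the local proxy; it prints the endpoints and a session key
```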
The proxy exposes:
- http://127.0.0.1:8766/openai/v1 for OpenAI-compatible clients
- http://127.0.0.1:8766/anthropic for Anthropic-compatible clients
meshagent llm proxy shows live usage by model and recent request activity as traffic passes through the proxy.
How It Works
- Your tool sends requests to the localhost OpenAI-compatible or Anthropic-compatible URL printed by meshagent llm proxy.
- The local proxy authenticates that request with the temporary local proxy key it printed for the session.
- The local proxy then forwards the request to the hosted MeshAgent LLM Proxy using your active MeshAgent project and an upstream MeshAgent token from your CLI session or MESHAGENT_TOKEN.
- MeshAgent handles the real project-level routing, usage attribution, billing, and provider credentials.
Use the Local Proxy
OpenAI-compatible curl request
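A minimal sketch; LOCAL_PROXY_KEY stands for the temporary key the proxy printed, and the /chat/completions path, request body, and model name follow the standard OpenAI API shape rather than anything MeshAgent-specific:

```shell
curl http://127.0.0.1:8766/openai/v1/chat/completions \
  -H "Authorization: Bearer $LOCAL_PROXY_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```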
Anthropic-compatible curl request
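A minimal sketch under the same assumptions; the /v1/messages path, x-api-key and anthropic-version headers, and the model name follow Anthropic's standard Messages API conventions:

```shell
curl http://127.0.0.1:8766/anthropic/v1/messages \
  -H "x-api-key: $LOCAL_PROXY_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```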
OpenAI SDK examples
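A sketch with the official OpenAI Python SDK, assuming the proxy is running locally; the model name and the LOCAL_PROXY_KEY placeholder are illustrative, not MeshAgent-specific values:

```python
from openai import OpenAI

# Point the SDK at the local proxy instead of api.openai.com.
# The API key is the temporary key printed by `meshagent llm proxy`.
client = OpenAI(
    base_url="http://127.0.0.1:8766/openai/v1",
    api_key="LOCAL_PROXY_KEY",
)

resp = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```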
Anthropic SDK examples
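A sketch with the official Anthropic Python SDK under the same assumptions; the SDK appends its own /v1/messages path to the base URL, and the model name is a placeholder:

```python
import anthropic

# Point the SDK at the local proxy instead of api.anthropic.com.
client = anthropic.Anthropic(
    base_url="http://127.0.0.1:8766/anthropic",
    api_key="LOCAL_PROXY_KEY",  # key printed by `meshagent llm proxy`
)

msg = client.messages.create(
    model="claude-sonnet-4-20250514",  # placeholder model name
    max_tokens=256,
    messages=[{"role": "user", "content": "Hello"}],
)
print(msg.content[0].text)
```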
Using Agent Frameworks with the local proxy
Any framework that lets you override an OpenAI- or Anthropic-compatible base URL and API key can use the local proxy, LangChain included. If your framework can send MeshAgent authentication and project headers directly, you can also call the hosted MeshAgent LLM Proxy. Use the local proxy when you want the framework to talk to a local provider-compatible endpoint during development.
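As one example, a minimal LangChain sketch; it assumes the langchain-openai package, and the model name and LOCAL_PROXY_KEY placeholder are illustrative:

```python
from langchain_openai import ChatOpenAI

# Override the OpenAI base URL and key so LangChain talks to the local proxy.
llm = ChatOpenAI(
    base_url="http://127.0.0.1:8766/openai/v1",
    api_key="LOCAL_PROXY_KEY",
    model="gpt-4o-mini",  # placeholder model name
)
print(llm.invoke("Hello").content)
```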
When to use the local proxy vs the hosted proxy
The local proxy is a local adapter on top of the hosted MeshAgent LLM Proxy. Requests still route through MeshAgent and use the active MeshAgent project. Use the hosted proxy when your app, service, SDK, or framework can call MeshAgent directly with a MeshAgent OAuth token and Meshagent-Project-Id.
Use the local proxy when you’re working with a tool on your machine that’s easiest to configure with provider-style base URLs and API keys. It’s especially useful when the tool can’t reliably send custom headers like Meshagent-Project-Id, when you’d rather use local proxy keys than a MeshAgent OAuth token, or when you want live usage visible in your terminal as requests come through.
Advanced options
The local proxy signs upstream MeshAgent requests with:
- MESHAGENT_TOKEN, if it is set
- otherwise, your current MeshAgent CLI auth session
Additional flags:
- --project-id <project_id> to override the active project
- --token-from-env <ENV_NAME> to forward a different MeshAgent token environment variable upstream
- --bearer <token> to set the local bearer token explicitly
- --insecure to disable local bearer-token enforcement
- --no-tui to skip the live usage dashboard
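A sketch combining the flags above; the project ID and environment variable name are placeholders:

```shell
# Pin a specific project, forward a custom token env var upstream,
# and skip the live usage dashboard.
meshagent llm proxy \
  --project-id my-project-id \
  --token-from-env MY_MESHAGENT_TOKEN \
  --no-tui
```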
If you pass neither --bearer nor --insecure, MeshAgent reuses a stored local bearer token or generates one on first run. That local bearer token protects only the localhost proxy; it is separate from the upstream MeshAgent credential used to talk to the hosted proxy.