## Proxied endpoints

### OpenAI
Base URL: `https://api.meshagent.com/openai/v1` (also accepts `https://api.meshagent.com/v1` for OpenAI-compatible clients).
Supported endpoints:

- `/v1/chat/completions`
- `/v1/responses`
- `/v1/responses/compact`
- `/v1/responses/input_tokens`
- `/v1/embeddings`
- `/v1/audio/speech`
- `/v1/audio/transcriptions`
- `/v1/audio/translations`
- `/v1/images/*`
- `/v1/models` and `/v1/models/*`
- `/v1/realtime` and `/v1/realtime/*`
### Anthropic

Base URL for raw HTTP requests: `https://api.meshagent.com/anthropic/v1`.

When you use the MeshAgent Anthropic SDK helper or the injected `ANTHROPIC_BASE_URL`, the base URL is `https://api.meshagent.com/anthropic`; the Anthropic SDK appends the `/v1` paths itself.
Supported endpoints:

- `/v1/messages`
- `/v1/messages/count_tokens`
- `/v1/messages/batches*`
- `/v1/complete`
- `/v1/models` and `/v1/models/*`
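The difference between the two Anthropic base URLs can be sketched as follows; the commented-out SDK construction is illustrative, and `token` is a placeholder for your participant token:

```python
# Raw HTTP requests include the /v1 prefix themselves:
raw_base = "https://api.meshagent.com/anthropic/v1"
raw_messages_url = raw_base + "/messages"

# The Anthropic SDK (or the injected ANTHROPIC_BASE_URL) uses the base
# without /v1, because the SDK appends the /v1 paths on its own:
#   from anthropic import Anthropic
#   client = Anthropic(base_url="https://api.meshagent.com/anthropic",
#                      api_key=token)
sdk_base = "https://api.meshagent.com/anthropic"
```

Either way, the request ultimately targets the same `/v1/...` path on the proxy.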
## Using the proxy

Use the proxy helpers so the provider SDK resolves the injected MeshAgent proxy environment variables automatically. `meshagent room connect` sets `OPENAI_API_KEY`, `OPENAI_BASE_URL`, `ANTHROPIC_API_KEY`, and `ANTHROPIC_BASE_URL` automatically. For OpenAI-compatible SDKs, you can also set `OPENAI_BASE_URL` to `https://api.meshagent.com/openai/v1` and pass the MeshAgent participant token as `api_key`.
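As a sketch of the OpenAI-compatible configuration (the placeholder token value and the commented-out SDK construction are illustrative):

```python
import os

# Point any OpenAI-compatible SDK at the MeshAgent proxy. Inside a room these
# variables are injected for you; set them yourself when connecting manually.
os.environ["OPENAI_BASE_URL"] = "https://api.meshagent.com/openai/v1"
os.environ["OPENAI_API_KEY"] = "<participant-token>"  # placeholder value

# The OpenAI SDK picks both variables up automatically:
#   from openai import OpenAI
#   client = OpenAI()  # no explicit base_url / api_key needed

# Equivalent raw request target and auth header:
url = os.environ["OPENAI_BASE_URL"] + "/chat/completions"
headers = {"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"}
```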
If you are calling the proxy from your own application instead of from a room participant, you can also use an OAuth access token from a registered OAuth client. In that case, send the OAuth access token in the normal auth header for the route and include `Meshagent-Project-Id: <project_id>`. The token must include the `llm_proxy` scope, and the user behind that token must have the `can_use_llm_proxy` project permission (admins also qualify) for the project named by that header.
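A minimal sketch of the headers an OAuth-authenticated request would carry; the function and variable names here are illustrative, not part of any SDK:

```python
def oauth_proxy_headers(access_token: str, project_id: str) -> dict:
    """Headers for calling the LLM proxy with an OAuth access token.

    The token must carry the llm_proxy scope, and the user must hold the
    can_use_llm_proxy permission on the project identified below.
    """
    return {
        "Authorization": f"Bearer {access_token}",
        "Meshagent-Project-Id": project_id,
    }

headers = oauth_proxy_headers("<oauth-access-token>", "<project-id>")
```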
## Local CLI proxy

If you want to point local SDKs and tools at MeshAgent without exposing your OAuth access token directly, run the local CLI proxy. The local proxy:

- signs upstream requests with `MESHAGENT_TOKEN` when that env var is set, or with your OAuth access token otherwise
- lets you force a specific token env var with `--token-from-env <ENV_NAME>`
- refreshes the OAuth token automatically when OAuth is being used
- proxies both OpenAI and Anthropic traffic, including supported websockets
- prints the local `OPENAI_*` and `ANTHROPIC_*` environment variables to use
- stores a randomly generated local bearer token in your meshagent config on first run unless you pass `--insecure` or `--bearer`
- shows a fullscreen live usage and pricing dashboard when attached to a TTY
## Automatic usage tracking

The proxy inspects responses and streams to record usage and cost:

- OpenAI streaming tracks `response.completed` (Responses API) or the final `usage` chunk (Chat Completions).
- Anthropic streaming deduplicates cumulative usage fields to avoid double counting.
- Image usage is inferred from the response count plus the request `model`/`size`/`quality`.
- Audio speech usage counts input characters.
- Models without pricing data are rejected so costs remain accurate.

Proxy usage is billed as an `llm_proxy_surcharge` line item on top of the underlying provider usage charges.
## Authentication and provider key protection

- Clients authenticate with either a MeshAgent room participant token or an OAuth access token issued by a registered OAuth client.
- The OpenAI proxy requires `Authorization: Bearer <token>`.
- The Anthropic proxy accepts either `Authorization: Bearer <token>` or `x-api-key: <token>`.
- OAuth-authenticated LLM requests must also send `Meshagent-Project-Id: <project_id>`.
- The OAuth token must include the `llm_proxy` scope.
- The OAuth user must have the `can_use_llm_proxy` project permission for the project named by `Meshagent-Project-Id` (admins also qualify).
- The proxy strips provider auth headers and injects provider keys from server-side env vars.
- Provider keys never need to exist inside room containers or client code.
- Optional HTTP logging clients redact auth headers by default.
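The two accepted Anthropic header styles, as a sketch (the token value is a placeholder; whichever style you send, the proxy strips it before forwarding and injects the real provider key server-side):

```python
token = "<participant-or-oauth-token>"  # placeholder value

# Both forms authenticate against the Anthropic proxy:
bearer_headers = {"Authorization": f"Bearer {token}"}
x_api_key_headers = {"x-api-key": token}
```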
## Environment variables

- Provider keys (router/proxy): you can set your own default `OPENAI_API_KEY`, `OPENAI_BASE_URL`, `ANTHROPIC_API_KEY`, and `ANTHROPIC_BASE_URL` from MeshAgent Studio or the REST API to use your own API keys across your MeshAgent rooms.
- Room service defaults: certain environment variables are auto-injected for services running inside a room. You can override them in a service template/container env if you need a different base URL. Examples: `OPENAI_BASE_URL`, `ANTHROPIC_BASE_URL`.
- SDK defaults: when you construct the client/adapter (for example `OpenAIResponsesAdapter(model=...)`) you can override the model or base URL explicitly, or set defaults with `OPENAI_MODEL`, `OPENAI_BASE_URL`, `ANTHROPIC_MODEL`, and `ANTHROPIC_BASE_URL`.