The MeshAgent LLM proxy is an HTTP proxy that lets services and external clients call OpenAI and Anthropic without shipping provider API keys in your code. The proxy can authenticate with either MeshAgent room participant tokens or OAuth access tokens issued by a registered OAuth client. Usage is attributed to the room or project context established by that credential.

Proxied endpoints

OpenAI

Base URL: https://api.meshagent.com/openai/v1 (also accepts https://api.meshagent.com/v1 for OpenAI-compatible clients). Supported endpoints:
  • /v1/chat/completions
  • /v1/responses
  • /v1/responses/compact
  • /v1/responses/input_tokens
  • /v1/embeddings
  • /v1/audio/speech
  • /v1/audio/transcriptions
  • /v1/audio/translations
  • /v1/images/*
  • /v1/models and /v1/models/*
  • /v1/realtime and /v1/realtime/*
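For OpenAI-compatible HTTP clients, a request to one of these endpoints looks like a normal OpenAI call with a MeshAgent token in place of the provider key. A minimal stdlib sketch (the model name and token value are placeholders, and the helper function is illustrative, not part of the MeshAgent SDK):

```python
import json
import urllib.request

OPENAI_PROXY_BASE = "https://api.meshagent.com/openai/v1"

def build_chat_request(messages, model, token):
    """Build an urllib Request for /v1/chat/completions via the proxy."""
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        f"{OPENAI_PROXY_BASE}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {token}",  # MeshAgent participant token
            "Content-Type": "application/json",
        },
        method="POST",
    )

# With a valid token, the request can be sent with urllib.request.urlopen(req)
# and parsed as a standard OpenAI chat completion response.
```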

Anthropic

Base URL for raw HTTP requests: https://api.meshagent.com/anthropic/v1. When you use the MeshAgent Anthropic SDK helper or the injected ANTHROPIC_BASE_URL, the base URL is https://api.meshagent.com/anthropic; the Anthropic SDK appends the /v1 paths itself. Supported endpoints:
  • /v1/messages
  • /v1/messages/count_tokens
  • /v1/messages/batches*
  • /v1/complete
  • /v1/models and /v1/models/*
Requests to other paths are rejected.
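A raw HTTP call to the Anthropic route follows the same pattern; as described under authentication below, the proxy accepts the token as either Authorization: Bearer or x-api-key. An illustrative stdlib sketch (the model name is a placeholder; the anthropic-version value shown is the standard Anthropic API version header):

```python
import json
import urllib.request

ANTHROPIC_PROXY_BASE = "https://api.meshagent.com/anthropic/v1"

def build_messages_request(messages, model, token, use_bearer=True):
    """Build an urllib Request for /v1/messages via the proxy."""
    auth = (
        {"Authorization": f"Bearer {token}"}
        if use_bearer
        else {"x-api-key": token}
    )
    body = json.dumps({"model": model, "max_tokens": 256, "messages": messages})
    return urllib.request.Request(
        f"{ANTHROPIC_PROXY_BASE}/messages",
        data=body.encode("utf-8"),
        headers={
            **auth,
            "anthropic-version": "2023-06-01",
            "Content-Type": "application/json",
        },
        method="POST",
    )
```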

Using the proxy

Use the proxy helpers so the provider SDK resolves the injected MeshAgent proxy environment variables automatically:
Python

```python
from meshagent.openai.proxy import get_client as get_openai_client
from meshagent.anthropic.proxy import get_client as get_anthropic_client

openai_client = get_openai_client()
anthropic_client = get_anthropic_client()
```
Room containers and meshagent room connect set OPENAI_API_KEY, OPENAI_BASE_URL, ANTHROPIC_API_KEY, and ANTHROPIC_BASE_URL automatically. For OpenAI-compatible SDKs, you can also set OPENAI_BASE_URL to https://api.meshagent.com/openai/v1 and pass the MeshAgent participant token as the api_key.

If you are calling the proxy from your own application rather than from a room participant, you can authenticate with an OAuth access token from a registered OAuth client. Send the access token in the normal auth header for the route and include a Meshagent-Project-Id: <project_id> header. The token must include the llm_proxy scope, and the user behind it must have the can_use_llm_proxy project permission (admins also qualify) for the project named by that header.
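For OAuth-authenticated calls, the only difference on the wire is the extra project header. A minimal sketch of the headers involved (the helper function is hypothetical; the token and project id are placeholders):

```python
def oauth_proxy_headers(access_token: str, project_id: str) -> dict:
    """Headers for an OAuth-authenticated call to the LLM proxy.

    The token must carry the llm_proxy scope, and its user must hold the
    can_use_llm_proxy permission on the named project (admins also qualify).
    """
    return {
        "Authorization": f"Bearer {access_token}",
        "Meshagent-Project-Id": project_id,
        "Content-Type": "application/json",
    }
```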

Local CLI proxy

If you want to point local SDKs and tools at MeshAgent without exposing your OAuth access token directly, run:
meshagent llm proxy --project-id <project_id>
The command:
  • signs upstream requests with MESHAGENT_TOKEN when that env var is set, or with your OAuth access token otherwise
  • lets you force a specific token env var with --token-from-env <ENV_NAME>
  • refreshes the OAuth token automatically when OAuth is being used
  • proxies both OpenAI and Anthropic traffic, including supported websockets
  • prints the local OPENAI_* and ANTHROPIC_* environment variables to use
  • stores a randomly generated local bearer token in your meshagent config on first run unless you pass --insecure or --bearer
  • shows a fullscreen live usage and pricing dashboard when attached to a TTY
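Once the local proxy is running and has printed its environment variables, any OpenAI-compatible tool can pick them up. A sketch, assuming the printed OPENAI_* variables have been exported into your shell (the fallback URL here is illustrative):

```python
import os

def local_proxy_config(env=None):
    """Resolve the base URL and key printed by `meshagent llm proxy`.

    Falls back to the hosted proxy URL if the local variables are unset.
    """
    env = os.environ if env is None else env
    return {
        "base_url": env.get("OPENAI_BASE_URL", "https://api.meshagent.com/openai/v1"),
        "api_key": env.get("OPENAI_API_KEY", ""),
    }

# cfg = local_proxy_config()
# client = openai.OpenAI(**cfg)  # if the openai package is installed
```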

Automatic usage tracking

The proxy inspects responses and streams to record usage and cost:
  • OpenAI streaming tracks response.completed (Responses API) or the final usage chunk (Chat Completions).
  • Anthropic streaming deduplicates cumulative usage fields to avoid double counting.
  • Image usage is inferred from the number of images returned plus the request's model, size, and quality.
  • Audio speech usage counts input characters.
  • Models without pricing data are rejected so costs remain accurate.
Usage is emitted as OpenTelemetry metrics and traces and also appears in MeshAgent's built-in dashboards (the usage and billing tabs in MeshAgent Studio). If you configure your own OTEL endpoint, the same data can be exported to your provider. LLM proxy billing also adds a separate llm_proxy_surcharge line item on top of the underlying provider usage charges.
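The Anthropic deduplication step can be pictured as keeping only the latest cumulative counter rather than summing every stream event. An illustrative sketch, not the proxy's actual implementation:

```python
def dedup_cumulative_usage(events):
    """Reduce a stream of cumulative usage snapshots to final totals.

    Each event reports totals-so-far; summing the events would double
    count, so only the last snapshot of each counter is kept.
    """
    totals = {}
    for event in events:
        for counter, value in event.items():
            totals[counter] = value  # later snapshots overwrite earlier ones
    return totals

# Cumulative snapshots from a hypothetical Anthropic stream:
events = [
    {"input_tokens": 12, "output_tokens": 5},
    {"input_tokens": 12, "output_tokens": 40},
    {"input_tokens": 12, "output_tokens": 87},
]
print(dedup_cumulative_usage(events))  # {'input_tokens': 12, 'output_tokens': 87}
```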

Authentication and provider key protection

  • Clients authenticate with either a MeshAgent room participant token or an OAuth access token issued by a registered OAuth client.
  • OpenAI proxy requires Authorization: Bearer <token>.
  • Anthropic proxy accepts either Authorization: Bearer <token> or x-api-key: <token>.
  • OAuth-authenticated LLM requests must also send Meshagent-Project-Id: <project_id>.
  • The OAuth token must include the llm_proxy scope.
  • The OAuth user must have the can_use_llm_proxy project permission for the project named by Meshagent-Project-Id (admins also qualify).
  • The proxy strips provider auth headers and injects provider keys from server-side env vars.
  • Provider keys never need to exist inside room containers or client code.
  • Optional HTTP logging clients redact auth headers by default.
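The key-protection step amounts to a header transform: client auth headers are removed before the request is forwarded upstream, and the server-side provider key is injected in their place. An illustrative sketch (the function and constant are hypothetical, not the proxy's actual code):

```python
CLIENT_AUTH_HEADERS = {"authorization", "x-api-key"}

def forward_headers(incoming: dict, provider_key: str) -> dict:
    """Strip client auth headers and inject the server-side provider key."""
    cleaned = {
        name: value
        for name, value in incoming.items()
        if name.lower() not in CLIENT_AUTH_HEADERS
    }
    cleaned["Authorization"] = f"Bearer {provider_key}"
    return cleaned

out = forward_headers(
    {"Authorization": "Bearer meshagent-token", "Content-Type": "application/json"},
    provider_key="sk-provider-key",
)
# The MeshAgent token never reaches the provider:
assert out["Authorization"] == "Bearer sk-provider-key"
```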

Environment variables

  • Provider keys (router/proxy): Set your own default OPENAI_API_KEY, OPENAI_BASE_URL, ANTHROPIC_API_KEY, and ANTHROPIC_BASE_URL from MeshAgent Studio or the REST API to route proxy traffic through your own provider accounts across your MeshAgent rooms.
  • Room service defaults: Certain environment variables are auto-injected for services running inside a room. You can override them in a service template/container env if you need a different base URL. Examples: OPENAI_BASE_URL, ANTHROPIC_BASE_URL.
  • SDK defaults: When you construct the client/adapter (for example OpenAIResponsesAdapter(model=...)) you can override the model or base URL explicitly, or set defaults with OPENAI_MODEL, OPENAI_BASE_URL, ANTHROPIC_MODEL, and ANTHROPIC_BASE_URL.
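The precedence implied above (explicit constructor argument, then environment variable, then a built-in default) can be sketched as follows; the helper function and the fallback model name are illustrative, not the SDK's actual code:

```python
import os

def resolve_model(explicit=None, env=None, default="gpt-4o-mini"):
    """Pick the model: explicit argument wins, then OPENAI_MODEL, then default."""
    env = os.environ if env is None else env
    return explicit or env.get("OPENAI_MODEL") or default

assert resolve_model("gpt-4.1", {"OPENAI_MODEL": "gpt-4o"}) == "gpt-4.1"
assert resolve_model(None, {"OPENAI_MODEL": "gpt-4o"}) == "gpt-4o"
assert resolve_model(None, {}) == "gpt-4o-mini"
```

The same pattern applies to the base URL and to the Anthropic variables.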