The MeshAgent LLM Proxy is an HTTP proxy that lets OpenAI- and Anthropic-compatible clients send requests through MeshAgent. Instead of pointing your app, SDK, or framework directly at OpenAI or Anthropic, you point it at MeshAgent. MeshAgent authenticates the request, applies project-level routing, sends the request to the selected provider, and records usage against the right project and user. With the LLM Proxy you can:
  • Manage provider configuration centrally at the project level in MeshAgent.
  • Track usage, billing, and budget controls across tools and team members in one place.
  • Use the same integration pattern for raw HTTP requests, official SDKs, and higher-level frameworks.

Before you start

  • MeshAgent provides managed OpenAI and Anthropic access by default. To use your own provider credentials, configure them per project in MeshAgent Studio under Integrations.
  • For localhost URLs and temporary local credentials on your machine, use the Local CLI Proxy.
  • For Codex or Claude, meshagent setup can configure them to use MeshAgent directly.
  • To inspect your own LLM usage for the current project, open the LLM Proxy page in MeshAgent Studio and check the My Usage tab.
  • For fuller usage reporting, billing management, and control over which models are allowed, use MeshAgent Accounts.

How it works

  1. Your client sends a normal OpenAI-compatible or Anthropic-compatible request to MeshAgent.
  2. Your client authenticates with a MeshAgent participant token, API key, or OAuth access token.
  3. MeshAgent validates the provider path and model, then forwards the request.
  4. MeshAgent returns the provider-compatible response and records usage for the project resolved from the credential.
This is the same route used by meshagent ask and the MeshAgent Codex and Claude integrations.
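To make steps 1–2 concrete: the client sends an ordinary provider-style HTTP request whose base URL points at MeshAgent and whose bearer token is a MeshAgent credential. The sketch below builds such a request with Python's standard library; the base URL and token shown are hypothetical placeholders (in a room-connected process the real values arrive via OPENAI_BASE_URL and OPENAI_API_KEY):

```python
import json
import urllib.request

# Hypothetical placeholder values; meshagent room connect supplies the real
# base URL and participant token through OPENAI_BASE_URL and OPENAI_API_KEY.
base_url = "https://example.meshagent.test/rooms/my-room/openai/v1"
token = "MESHAGENT_PARTICIPANT_TOKEN"

# A normal OpenAI-compatible Responses request, authenticated with the
# MeshAgent credential instead of a provider API key.
request = urllib.request.Request(
    f"{base_url}/responses",
    data=json.dumps(
        {"model": "gpt-5.4", "input": "Tell me a fun fact about AI."}
    ).encode(),
    headers={
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    },
    method="POST",
)

# urllib.request.urlopen(request) would send it through MeshAgent,
# which validates the credential, forwards to the provider, and records usage.
print(request.full_url)
```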

How to use the LLM Proxy

meshagent room connect runs a local command with the same environment variables it will have in a MeshAgent room. It connects to the room, starts your local command, and sets MESHAGENT_TOKEN to a participant token with access to that room. It also sets OPENAI_BASE_URL, OPENAI_API_KEY, ANTHROPIC_BASE_URL, ANTHROPIC_API_KEY, MESHAGENT_PROJECT_ID, and MESHAGENT_ROOM. Use those environment variables directly. Do not hardcode the proxy address in local code; let meshagent room connect provide the OpenAI and Anthropic base URLs for the room you are testing.

Connect a local command to a room

meshagent setup 
meshagent project list 
meshagent project activate PROJECT_ID # optional: switch the active project
meshagent setup signs you in and stores an OAuth session locally. meshagent project list shows the projects you can use and their IDs; the current project is marked with *. Use meshagent project activate PROJECT_ID to switch the active project before connecting. Then run your local command through meshagent room connect:
meshagent room connect --room=my-room --identity=sample-participant -- <your command>
The signed-in user must have permission to connect to the room and use the LLM proxy for the selected project.
When --identity is set, meshagent room connect mints the participant token locally using the active API key for the selected project. See API Keys for the CLI commands used to create, activate, rotate, and remove project API keys.

Make a Raw HTTP Request

Send an OpenAI-compatible request

meshagent room connect --room=my-room --identity=sample-participant -- bash -c '
curl "$OPENAI_BASE_URL/responses" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  --data @-
' <<'JSON'
{
  "model": "gpt-5.4",
  "input": "Tell me a fun fact about AI."
}
JSON

Send an Anthropic-compatible request

meshagent room connect --room=my-room --identity=sample-participant -- bash -c '
curl "$ANTHROPIC_BASE_URL/v1/messages" \
  -H "Authorization: Bearer $ANTHROPIC_API_KEY" \
  -H "anthropic-version: 2023-06-01" \
  -H "content-type: application/json" \
  --data @-
' <<'JSON'
{
  "model": "claude-sonnet-4-6",
  "max_tokens": 512,
  "messages": [
    {"role": "user", "content": "Tell me a fun fact about AI."}
  ]
}
JSON
For Anthropic-compatible raw HTTP requests, append /v1 to the ANTHROPIC_BASE_URL value that meshagent room connect provides. The Anthropic SDK examples below use ANTHROPIC_BASE_URL as-is.
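That base-URL distinction can be made explicit in code. A minimal sketch, assuming a hypothetical placeholder room URL as the fallback (the real value comes from meshagent room connect):

```python
import os

# Placeholder fallback for illustration only; in a room-connected process,
# ANTHROPIC_BASE_URL is set by meshagent room connect.
anthropic_base = os.environ.get(
    "ANTHROPIC_BASE_URL", "https://example.meshagent.test/rooms/my-room/anthropic"
)

# The Anthropic SDK takes the base URL as-is and adds /v1 to its request paths.
sdk_base_url = anthropic_base

# Raw HTTP clients must append /v1 themselves before the endpoint path.
messages_url = f"{anthropic_base}/v1/messages"

print(sdk_base_url)
print(messages_url)
```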

Use the OpenAI and Anthropic SDKs

Run these examples through meshagent room connect so the SDKs receive the room-scoped base URLs and the participant token as their API keys.

OpenAI SDK

# meshagent room connect --room=my-room --identity=sample-participant -- python3 llm-proxy-openai-sdk.py

import os
from openai import OpenAI

client = OpenAI(
    base_url=os.environ["OPENAI_BASE_URL"],
    api_key=os.environ["OPENAI_API_KEY"],
)

response = client.responses.create(
    model="gpt-5.4",
    input="Tell me a fun fact about AI.",
)

print(response.output_text)

Anthropic SDK

# meshagent room connect --room=my-room --identity=sample-participant -- python3 llm-proxy-anthropic-sdk.py

import os
from anthropic import Anthropic

client = Anthropic(
    base_url=os.environ["ANTHROPIC_BASE_URL"],
    api_key=os.environ["ANTHROPIC_API_KEY"],
)

message = client.messages.create(
    model="claude-sonnet-4-6",
    max_tokens=512,
    messages=[
        {
            "role": "user",
            "content": "Tell me a fun fact about AI.",
        }
    ],
)

print(message.content[0].text)

Use Other Frameworks

Configure OpenAI- or Anthropic-compatible frameworks with the MeshAgent base URL and a MeshAgent credential: a participant token, API key, or OAuth access token. For local testing, meshagent room connect supplies the room-scoped base URLs and participant token as environment variables. For OpenAI-compatible transports:
  • Set the base URL from OPENAI_BASE_URL
  • Set the API key from OPENAI_API_KEY
For Anthropic-compatible transports:
  • Set the base URL from ANTHROPIC_BASE_URL for Anthropic SDKs
  • Set the base URL to $ANTHROPIC_BASE_URL/v1 for raw HTTP transports
  • Set the API key or bearer token from ANTHROPIC_API_KEY
For example, LangChain can use the room-scoped OpenAI-compatible route directly:
# meshagent room connect --room=my-room --identity=sample-participant -- python3 llm-proxy-langchain.py

import os
from langchain_openai import ChatOpenAI

llm = ChatOpenAI(
    model="gpt-5.4",
    base_url=os.environ["OPENAI_BASE_URL"],
    api_key=os.environ["OPENAI_API_KEY"],
)

result = llm.invoke("Tell me a fun fact about AI.")
print(result.content)

For frameworks or local tools that require localhost provider endpoints, use Local CLI Proxy. The local proxy gives you temporary localhost OpenAI and Anthropic endpoints that forward through MeshAgent while the meshagent llm proxy process is running.

Use OAuth Credentials Directly

A client that authenticates with OAuth, such as the MeshAgent CLI, can call the proxy with OAuth credentials directly. OAuth proxy requests do not require a running room; because they are not authenticated with a room participant token, usage is attributed to the selected project and user rather than to a room in the usage dashboards. OAuth clients can access multiple projects, so OAuth proxy requests must include the Meshagent-Project-Id header to select the project for routing and usage. The OAuth token must include the llm_proxy scope, and the user must have direct LLM proxy access enabled in the MeshAgent Accounts account management console.
export MESHAGENT_ACCESS_TOKEN="$(meshagent auth token)"
export MESHAGENT_PROJECT_ID=<project_id from meshagent project list>

curl "https://api.meshagent.com/openai/v1/responses" \
  -H "Authorization: Bearer $MESHAGENT_ACCESS_TOKEN" \
  -H "Meshagent-Project-Id: $MESHAGENT_PROJECT_ID" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.4",
    "input": "Tell me a fun fact about AI."
  }'

Authentication Notes

  • meshagent room connect sets MESHAGENT_TOKEN, OPENAI_API_KEY, and ANTHROPIC_API_KEY to a participant token with access to the room.
  • The participant token must carry the room grant and LLM API grant.
  • The signed-in user must have permission to connect to the room and use the LLM proxy for that project.
  • The room-scoped base URLs are {room_url}/openai/v1 for OpenAI-compatible requests and {room_url}/anthropic for Anthropic SDK requests.
  • OAuth proxy requests require an access token with the llm_proxy scope and Meshagent-Project-Id.

Supported Provider Paths

MeshAgent proxies both OpenAI and Anthropic endpoints through the following provider-compatible paths.

OpenAI-Compatible

  • /v1/chat/completions
  • /v1/responses
  • /v1/responses/compact
  • /v1/responses/input_tokens
  • /v1/embeddings
  • /v1/audio/speech
  • /v1/audio/transcriptions
  • /v1/audio/translations
  • /v1/models and /v1/models/*
  • /v1/images/*
  • /v1/realtime and /v1/realtime/*
Supported OpenAI websocket paths are:
  • /v1/realtime
  • /v1/responses

Anthropic-Compatible

  • /v1/messages
  • /v1/messages/count_tokens
  • /v1/messages/batches*
  • /v1/complete
  • /v1/models and /v1/models/*