- Manage provider configuration centrally at the project level in MeshAgent.
- Track usage, billing, and budget controls across tools and team members in one place.
- Use the same integration pattern for raw HTTP requests, official SDKs, and higher-level frameworks.
Before you start
MeshAgent provides managed OpenAI and Anthropic access by default. To use your own provider credentials, configure them per project in MeshAgent Studio under Integrations. For localhost URLs and temporary local credentials on your machine, use the Local CLI Proxy. For Codex or Claude, meshagent setup can configure them to use MeshAgent directly.
To inspect your own LLM usage for the current project, open the LLM Proxy page in MeshAgent Studio and check the My Usage tab. For detailed usage reporting, billing management, and control over which models are allowed, use MeshAgent Accounts.
How it works
- Your client sends a normal OpenAI-compatible or Anthropic-compatible request to MeshAgent.
- Your client authenticates with a MeshAgent participant token, API key, or OAuth access token.
- MeshAgent validates the provider path and model, then forwards the request.
- MeshAgent returns the provider-compatible response and records usage for the project resolved from the credential.
This is the same path used by meshagent ask and the MeshAgent Codex and Claude integrations.
How to use the LLM Proxy
meshagent room connect runs a local command with the same environment variables it will have in a MeshAgent room. It connects to the room, starts your local command, and sets MESHAGENT_TOKEN to a participant token with access to that room. It also sets OPENAI_BASE_URL, OPENAI_API_KEY, ANTHROPIC_BASE_URL, ANTHROPIC_API_KEY, MESHAGENT_PROJECT_ID, and MESHAGENT_ROOM.
Use those environment variables directly. Do not hardcode the proxy address in local code; let meshagent room connect provide the OpenAI and Anthropic base URLs for the room you are testing.
Connect a local command to a room
meshagent setup signs you in and stores an OAuth session locally. meshagent project list shows the projects you can use and their IDs; the current project is marked with *. If you need to switch projects, run meshagent project activate PROJECT_ID first.
Then run your local command through meshagent room connect.
The signed-in user must have permission to connect to the room and use the LLM proxy for the selected project.
When --identity is set, meshagent room connect mints the participant token locally using the active API key for the selected project. See API Keys for the CLI commands used to create, activate, rotate, and remove project API keys.
Make a Raw HTTP Request
Send an OpenAI-compatible request
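A raw OpenAI-compatible request needs only the base URL and a bearer credential. The stdlib sketch below builds (but does not send) such a request; the default URL, token, and model name are placeholders so it runs outside a room, and meshagent room connect supplies the real values.

```python
import json
import os
import urllib.request

# meshagent room connect sets these; the defaults are placeholders so the
# sketch runs outside a room.
base_url = os.environ.get("OPENAI_BASE_URL", "https://example.invalid/room/openai/v1")
api_key = os.environ.get("OPENAI_API_KEY", "dummy-participant-token")

payload = json.dumps({
    "model": "gpt-4o-mini",  # illustrative model name
    "messages": [{"role": "user", "content": "Hello from the LLM Proxy"}],
}).encode()

req = urllib.request.Request(
    f"{base_url}/chat/completions",
    data=payload,
    headers={
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# Inside a room, urllib.request.urlopen(req) returns the provider-compatible
# JSON response, and MeshAgent records usage for the project.
print(req.full_url)
```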
Send an Anthropic-compatible request
For raw HTTP requests, append /v1 to the ANTHROPIC_BASE_URL value that meshagent room connect provides. The Anthropic SDK examples below use ANTHROPIC_BASE_URL as-is.
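For the raw HTTP case, the /v1 suffix is added by the client. A stdlib sketch with placeholder URL, token, and model name (meshagent room connect supplies the real values); the request is built but not sent:

```python
import json
import os
import urllib.request

# ANTHROPIC_BASE_URL from meshagent room connect has no /v1 suffix; raw
# HTTP requests append it themselves. Defaults are placeholders.
base_url = os.environ.get("ANTHROPIC_BASE_URL", "https://example.invalid/room/anthropic")
api_key = os.environ.get("ANTHROPIC_API_KEY", "dummy-participant-token")

payload = json.dumps({
    "model": "claude-sonnet-4-5",  # illustrative model name
    "max_tokens": 64,
    "messages": [{"role": "user", "content": "Hello"}],
}).encode()

req = urllib.request.Request(
    f"{base_url}/v1/messages",  # note the /v1 added for raw HTTP
    data=payload,
    headers={
        "x-api-key": api_key,
        "anthropic-version": "2023-06-01",
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.full_url)
```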
Use The OpenAI And Anthropic SDKs
Run these examples through meshagent room connect so the SDKs receive the room-scoped base URLs and the participant token as their API keys.
OpenAI SDK
Anthropic SDK
Use Other Frameworks
Configure OpenAI- or Anthropic-compatible frameworks with the MeshAgent base URL and a MeshAgent credential: a participant token, API key, or OAuth access token. For local testing, meshagent room connect supplies the room-scoped base URLs and participant token as environment variables.
For OpenAI-compatible transports:
- Set the base URL from OPENAI_BASE_URL
- Set the API key from OPENAI_API_KEY

For Anthropic-compatible transports:
- Set the base URL from ANTHROPIC_BASE_URL for Anthropic SDKs
- Set the base URL to $ANTHROPIC_BASE_URL/v1 for raw HTTP transports
- Set the API key or bearer token from ANTHROPIC_API_KEY
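The mapping above can be collected into whatever settings object your framework expects. A sketch with purely hypothetical key names (no particular framework) and placeholder defaults for running outside a room:

```python
import os

# The keys in this dict are illustrative, not from any real framework;
# the values come from the variables meshagent room connect sets.
anthropic_base = os.environ.get("ANTHROPIC_BASE_URL", "https://example.invalid/room/anthropic")

settings = {
    "openai_base_url": os.environ.get("OPENAI_BASE_URL", "https://example.invalid/room/openai/v1"),
    "openai_api_key": os.environ.get("OPENAI_API_KEY", "dummy-participant-token"),
    # Anthropic SDKs take the base URL as-is...
    "anthropic_sdk_base_url": anthropic_base,
    # ...while raw HTTP transports append /v1 themselves.
    "anthropic_http_base_url": f"{anthropic_base}/v1",
    "anthropic_api_key": os.environ.get("ANTHROPIC_API_KEY", "dummy-participant-token"),
}
print(settings["anthropic_http_base_url"])
```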
If you use the Local CLI Proxy instead, these settings work only while the meshagent llm proxy process is running.
Use OAuth Credentials Directly
A client that authenticates with OAuth, such as the MeshAgent CLI, can call the proxy directly; OAuth proxy requests do not require a running room. Because they are not authenticated with a room participant token, cost is attributed to the selected project and user rather than to a room in the usage dashboards. OAuth clients can access multiple projects, so each OAuth proxy request must include a Meshagent-Project-Id header to choose the project for routing and usage. The OAuth token must include the llm_proxy scope, and the user must have direct LLM proxy access enabled in the MeshAgent Accounts account management console.
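An OAuth-authenticated call differs from a room-scoped one only in the credential and the extra project header. A stdlib sketch that builds (but does not send) such a request; the proxy URL, token, environment variable names, and model name are placeholders:

```python
import json
import os
import urllib.request

# Placeholders: the env var names here are illustrative, and the token
# must carry the llm_proxy scope.
access_token = os.environ.get("MESHAGENT_OAUTH_TOKEN", "dummy-oauth-token")
project_id = os.environ.get("MESHAGENT_PROJECT_ID", "dummy-project-id")

payload = json.dumps({
    "model": "gpt-4o-mini",  # illustrative model name
    "messages": [{"role": "user", "content": "Hello"}],
}).encode()

req = urllib.request.Request(
    "https://example.invalid/openai/v1/chat/completions",  # placeholder proxy URL
    data=payload,
    headers={
        "Authorization": f"Bearer {access_token}",
        "Meshagent-Project-Id": project_id,  # required on OAuth proxy requests
        "Content-Type": "application/json",
    },
    method="POST",
)
print(req.get_header("Meshagent-project-id"))
```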
Authentication Notes
- meshagent room connect sets MESHAGENT_TOKEN, OPENAI_API_KEY, and ANTHROPIC_API_KEY to a participant token with access to the room.
- The participant token must carry the room grant and LLM API grant.
- The signed-in user must have permission to connect to the room and use the LLM proxy for that project.
- The room-scoped base URLs are {room_url}/openai/v1 for OpenAI-compatible requests and {room_url}/anthropic for Anthropic SDK requests.
- OAuth proxy requests require an access token with the llm_proxy scope and a Meshagent-Project-Id header.
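The room-scoped base URLs in the notes above are simple derivations from the room URL. A sketch, with a placeholder room URL standing in for whatever your client resolves:

```python
# Placeholder room URL; substitute the room URL your client resolves.
room_url = "https://example.invalid/rooms/my-room"

openai_base = f"{room_url}/openai/v1"     # OpenAI-compatible requests
anthropic_base = f"{room_url}/anthropic"  # Anthropic SDK requests (no /v1)

print(openai_base)
print(anthropic_base)
```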
Supported Provider Paths
MeshAgent exposes these provider-compatible paths, proxying both OpenAI and Anthropic endpoints.
OpenAI-Compatible
- /v1/chat/completions
- /v1/responses
- /v1/responses/compact
- /v1/responses/input_tokens
- /v1/embeddings
- /v1/audio/speech
- /v1/audio/transcriptions
- /v1/audio/translations
- /v1/models and /v1/models/*
- /v1/images/*
- /v1/realtime and /v1/realtime/*
Anthropic-Compatible
- /v1/messages
- /v1/messages/count_tokens
- /v1/messages/batches*
- /v1/complete
- /v1/models and /v1/models/*
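As a quick sanity check before pointing a tool at the proxy, you can compare the paths it calls against the lists above. A sketch using shell-style wildcard matching, which is a simplification of however MeshAgent matches paths internally:

```python
from fnmatch import fnmatch

# OpenAI-compatible paths from the list above; * is a shell-style wildcard.
OPENAI_PATTERNS = [
    "/v1/chat/completions", "/v1/responses", "/v1/responses/compact",
    "/v1/responses/input_tokens", "/v1/embeddings", "/v1/audio/speech",
    "/v1/audio/transcriptions", "/v1/audio/translations",
    "/v1/models", "/v1/models/*", "/v1/images/*",
    "/v1/realtime", "/v1/realtime/*",
]

def is_openai_path(path: str) -> bool:
    """Rough check that a request path matches a proxied OpenAI path."""
    return any(fnmatch(path, pattern) for pattern in OPENAI_PATTERNS)

print(is_openai_path("/v1/chat/completions"))  # True
print(is_openai_path("/v1/fine_tuning/jobs"))  # False: not a proxied path
```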