The Anthropic Messages adapter is an LLMAdapter implementation for Anthropic's Messages API. It enables MeshAgent agents to use Claude models, handle tool calls, and stream responses while preserving MeshAgent toolkit behavior.
Key features
- Model defaults: Reads the model name from the constructor (`model=`) or the `ANTHROPIC_MODEL` environment variable. Override per call by passing `model` to `next()`.
- Max token defaults: Reads `max_tokens` from the constructor or `ANTHROPIC_MAX_TOKENS`.
- Session context defaults: Creates `AgentSessionContext(system_role=None)` so system/developer prompts come from the caller.
- Tool calling: Converts the supplied toolkits into Anthropic `tools` definitions and executes `tool_use` requests.
- Tool result formatting: Uses `AnthropicMessagesToolResponseAdapter` to return `tool_result` blocks back to the model.
- Streaming support: Uses Anthropic's streaming API and emits events to `event_handler`.
- Structured output (best-effort): When `output_schema` is provided, the adapter prompts for JSON and validates it (with retries).
- MCP connector support: MCP toolkits inject `mcp_servers`, MCP toolset entries into `tools`, and required `betas` flags into the request.
Constructor parameters
- `model` - default model name; can be overridden per call.
- `max_tokens` - cap on output tokens per response.
- `client` - reuse an existing Anthropic client; otherwise the adapter builds one via `meshagent.anthropic.proxy.get_client`.
- `base_url` - override the provider base URL used when the adapter creates its own client. Defaults to `ANTHROPIC_BASE_URL` when omitted.
- `message_options` - extra parameters passed to `messages.create` (for example `temperature`, `top_p`, `tools`, `betas`).
- `provider` - label emitted in telemetry and logs.
- `log_requests` - when true, uses a logging HTTP client for debugging.
- `context_management` - `auto` enables Anthropic beta context management compaction edits only for Claude model versions newer than 4.5 (for example `claude-sonnet-4-6`); `none` disables automatic compaction configuration.
- `compaction_threshold` - input token threshold for triggering `compact_20260112` when `context_management="auto"` (minimum `50000`).
- `compaction_pause_after` - when true, compaction can return a `compaction` block and pause before continuing.
- `compaction_instructions` - optional summarization instructions passed to compaction.
When `context_management="auto"` is enabled, the adapter uses Anthropic's `compact-2026-01-12` beta on `client.beta.messages.*` and injects a `compact_20260112` edit into the request's `context_management`.
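As a rough sketch, the injected fragment might look like the dictionary below. Only the edit type `compact_20260112` and the 50,000-token minimum come from this page; the exact field shapes of the compaction beta, and the `build_context_management` helper itself, are assumptions for illustration:

```python
def build_context_management(threshold: int = 50_000) -> dict:
    # Hypothetical helper showing roughly what the adapter injects into
    # the request's context_management when context_management="auto".
    if threshold < 50_000:
        raise ValueError("compaction_threshold must be at least 50000")
    return {
        "edits": [
            {
                "type": "compact_20260112",
                # Trigger shape is an assumed detail, not confirmed above.
                "trigger": {"type": "input_tokens", "value": threshold},
            }
        ]
    }

print(build_context_management(120_000))
```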
MCP connector support
Anthropic's official MCP connector can be enabled by adding an MCP toolkit to your toolkits list. The adapter applies MCP middleware to the request, which injects `mcp_servers` and an MCP toolset into the top-level `tools` array (and ensures the required `betas` flag is present).
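A minimal sketch of what the request payload might contain after the MCP middleware runs. The server URL and name are placeholders, and the beta flag value is an assumption; only the top-level keys (`mcp_servers`, `tools`, `betas`) come from the description above:

```python
# Placeholder values throughout -- illustrative request shape only.
request = {
    "model": "claude-sonnet-4-6",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "List the MCP server's tools."}],
    "mcp_servers": [
        {"type": "url", "url": "https://example.com/mcp", "name": "example"}
    ],
    "tools": [],  # MCP toolset entry plus toolkit-derived tools would go here
    "betas": ["mcp-client-2025-04-04"],  # assumed beta flag value
}
print(sorted(request))
```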
Handling a turn
When `next()` is called, it:
- Bundles tools - Converts toolkits into Anthropic tool definitions.
- Builds the request - Converts the chat context into Anthropic blocks and constructs the request payload.
- Calls the model - Sends the request (streaming if `event_handler` is provided).
- Handles tool calls - Executes requested tools and formats `tool_result` blocks.
- Loops - Continues until the model returns a final response.
- Returns result - Returns text or validated JSON if `output_schema` was supplied.
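The loop above can be sketched with stubbed model and tool calls; `run_turn` and the stub names are illustrative, not the adapter's actual internals:

```python
def run_turn(call_model, execute_tool, messages):
    # Schematic of the next() turn loop: call the model, execute any
    # requested tools, feed tool_result blocks back, and repeat until
    # the model returns a response with no tool_use blocks.
    while True:
        response = call_model(messages)
        tool_uses = [b for b in response["content"] if b["type"] == "tool_use"]
        if not tool_uses:
            return response  # final response: text (or validated JSON)
        messages.append({"role": "assistant", "content": response["content"]})
        messages.append({
            "role": "user",
            "content": [
                {"type": "tool_result", "tool_use_id": b["id"],
                 "content": execute_tool(b["name"], b["input"])}
                for b in tool_uses
            ],
        })

# Stub model: requests one tool call, then finishes.
state = {"calls": 0}
def fake_model(messages):
    state["calls"] += 1
    if state["calls"] == 1:
        return {"content": [{"type": "tool_use", "id": "t1",
                             "name": "echo", "input": {"text": "hi"}}]}
    return {"content": [{"type": "text", "text": "done"}]}

final = run_turn(fake_model, lambda name, args: args["text"], [])
print(final["content"][0]["text"])  # -> done
```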
Related Topics
- Adapters Overview: Understand LLMAdapters and ToolResponseAdapters
- MeshAgent LLM Proxy: How requests are routed and metered inside rooms
- OpenAI Responses Adapter: A detailed reference implementation
- Anthropic Tool Response Adapter: How tool outputs are rendered back into the chat transcript
- Process Agents Overview: Shows where adapters are used in the recommended runtime