LLMAdapter implementation for Anthropic’s Messages API. It enables MeshAgent agents to use Claude models, handle tool calls, and stream responses while preserving MeshAgent toolkit behavior.
Key features
- Model defaults: Reads the model name from the constructor (`model=`) or the `ANTHROPIC_MODEL` environment variable. Override per call by passing `model` to `next()` (for example via `ChatBot`).
- Max token defaults: Reads `max_tokens` from the constructor or `ANTHROPIC_MAX_TOKENS`.
- Chat context defaults: Creates `AgentChatContext(system_role=None)` so system/developer prompts come from the caller.
- Tool calling: Converts the supplied toolkits into Anthropic `tools` definitions and executes `tool_use` requests.
- Tool result formatting: Uses `AnthropicMessagesToolResponseAdapter` to return `tool_result` blocks back to the model.
- Streaming support: Uses Anthropic's streaming API and emits events to `event_handler`.
- Structured output (best-effort): When `output_schema` is provided, the adapter prompts for JSON and validates it (with retries).
- MCP connector support: MCP toolkits inject `mcp_servers`, MCP toolset entries into `tools`, and required `betas` flags into the request.
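The model and max-token defaulting described above can be sketched as a small resolution helper. This is a hypothetical illustration of the precedence (explicit argument first, then environment variable), not the adapter's actual internals; `resolve_model` is an invented name.

```python
import os

def resolve_model(model=None):
    """Hypothetical helper illustrating the default order described above:
    an explicit constructor/call argument wins, then the ANTHROPIC_MODEL
    environment variable; otherwise configuration is missing."""
    if model:
        return model
    env_model = os.environ.get("ANTHROPIC_MODEL")
    if env_model:
        return env_model
    raise ValueError("no model configured: pass model= or set ANTHROPIC_MODEL")
```

The same pattern applies to `max_tokens` and `ANTHROPIC_MAX_TOKENS`.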
Constructor parameters
- `model` - default model name; can be overridden per call.
- `max_tokens` - cap on output tokens per response.
- `client` - reuse an existing Anthropic client; otherwise the adapter builds one via `meshagent.anthropic.client.get_client`.
- `message_options` - extra parameters passed to `messages.create` (for example `temperature`, `top_p`, `tools`, `betas`).
- `provider` - label emitted in telemetry and logs.
- `log_requests` - when true, uses a logging HTTP client for debugging.
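As a rough sketch of how these parameters could flow into the final request, constructor defaults form the base payload and `message_options` is layered on top. The function name and merge order here are illustrative assumptions, not the adapter's documented behavior.

```python
def build_request(model, max_tokens, messages, message_options=None):
    """Hypothetical sketch: merge constructor defaults with per-adapter
    message_options into a messages.create-style payload."""
    payload = {"model": model, "max_tokens": max_tokens, "messages": messages}
    # Extra options (temperature, top_p, tools, betas, ...) are layered on top.
    payload.update(message_options or {})
    return payload
```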
MCP connector support
Anthropic’s official MCP connector can be enabled by adding an MCP toolkit to your toolkits list. The adapter applies MCP middleware to the request, which injects `mcp_servers` and an MCP toolset into the top-level `tools` array (and ensures the required `betas` flag is present).
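The resulting request might look roughly like the dictionary below. The server URL and names are placeholders, the toolset entry shape is assumed from the description above, and the `mcp-client-2025-04-04` beta flag follows Anthropic's MCP connector beta and may change.

```python
# Illustrative shape of a request after the MCP middleware runs.
request = {
    "model": "claude-sonnet-4-20250514",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "List my open tickets"}],
    "mcp_servers": [
        {
            "type": "url",
            "url": "https://example.com/mcp",  # hypothetical MCP server
            "name": "tickets",
        }
    ],
    "tools": [
        # Assumed shape of the injected MCP toolset entry, alongside any
        # regular tool definitions from other toolkits.
        {"type": "mcp_toolset", "mcp_server_name": "tickets"},
    ],
    "betas": ["mcp-client-2025-04-04"],
}
```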
Handling a turn
When `next()` is called it:

- Bundles tools - Converts toolkits into Anthropic tool definitions.
- Builds the request - Converts the chat context into Anthropic blocks and constructs the request payload.
- Calls the model - Sends the request (streaming if `event_handler` is provided).
- Handles tool calls - Executes requested tools and formats `tool_result` blocks.
- Loops - Continues until the model returns a final response.
- Returns result - Returns text or validated JSON if `output_schema` was supplied.
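The tool-call loop above can be sketched with simplified message shapes. This is a minimal illustration, assuming a synchronous client with a `create` method and plain-dict content blocks; the real adapter also streams events and validates structured output.

```python
def run_turn(client, request, execute_tool):
    """Hypothetical sketch of the turn loop: call the model, execute any
    requested tools, feed tool_result blocks back, and repeat until the
    model stops asking for tools."""
    while True:
        response = client.create(**request)
        if response.stop_reason != "tool_use":
            # Final response: return the concatenated text blocks.
            return "".join(
                b["text"] for b in response.content if b["type"] == "text"
            )
        # Echo the assistant turn, then answer each tool_use block with a
        # matching tool_result block in a user message.
        request["messages"].append(
            {"role": "assistant", "content": response.content}
        )
        results = [
            {
                "type": "tool_result",
                "tool_use_id": b["id"],
                "content": execute_tool(b["name"], b["input"]),
            }
            for b in response.content
            if b["type"] == "tool_use"
        ]
        request["messages"].append({"role": "user", "content": results})
```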
Related Topics
- Adapters Overview: Understand LLMAdapters and ToolResponseAdapters
- LLM Proxy: How requests are routed and metered inside rooms
- OpenAI Responses Adapter: A detailed reference implementation
- Anthropic Tool Response Adapter: How tool outputs are rendered back into the chat transcript
- ChatBot Overview: Shows where adapters are used