The Anthropic Messages adapter is an LLMAdapter implementation for Anthropic’s Messages API. It enables MeshAgent agents to use Claude models, handle tool calls, and stream responses while preserving MeshAgent toolkit behavior.

Key features

  • Model defaults: Reads the model name from the constructor (model=) or the ANTHROPIC_MODEL environment variable. Override per call by passing model to next() (for example via ChatBot).
  • Max token defaults: Reads max_tokens from the constructor or ANTHROPIC_MAX_TOKENS.
  • Session context defaults: Creates AgentSessionContext(system_role=None) so system/developer prompts come from the caller.
  • Tool calling: Converts the supplied toolkits into Anthropic tool definitions and executes tool_use requests.
  • Tool result formatting: Uses AnthropicMessagesToolResponseAdapter to return tool_result blocks back to the model.
  • Streaming support: Uses Anthropic’s streaming API and emits events to event_handler.
  • Structured output (best-effort): When output_schema is provided, the adapter prompts for JSON and validates it (with retries).
  • MCP connector support: MCP toolkits inject mcp_servers, MCP toolset entries in tools, and the required betas flag into the request.

Constructor parameters

Python
AnthropicMessagesAdapter(
    model: str = "claude-3-5-sonnet-latest",
    max_tokens: int = 1024,
    client: Optional[Any] = None,
    message_options: Optional[dict] = None,
    provider: str = "anthropic",
    log_requests: bool = False,
    context_management: Literal["auto", "none"] = "none",
    compaction_threshold: int = 150000,
    compaction_pause_after: bool = False,
    compaction_instructions: Optional[str] = None,
)
  • model - default model name; can be overridden per call.
  • max_tokens - cap on output tokens per response.
  • client - reuse an existing Anthropic client; otherwise the adapter builds one via meshagent.anthropic.client.get_client.
  • message_options - extra parameters passed to messages.create (for example temperature, top_p, tools, betas).
  • provider - label emitted in telemetry and logs.
  • log_requests - when true, uses a logging HTTP client for debugging.
  • context_management - auto enables Anthropic's beta context-management compaction edits, but only for Claude models newer than 4.5 (for example, claude-sonnet-4-6); none disables automatic compaction configuration.
  • compaction_threshold - input token threshold for triggering compact_20260112 when context_management="auto" (minimum 50000).
  • compaction_pause_after - when true, compaction can return a compaction block and pause before continuing.
  • compaction_instructions - optional summarization instructions passed to compaction.
When context_management="auto" is enabled, the adapter uses Anthropic’s compact-2026-01-12 beta on client.beta.messages.* and injects a compact_20260112 edit into the request’s context_management.
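The documented minimum for compaction_threshold can be expressed as a simple guard. This is an illustrative sketch of the rule stated above, not the adapter's actual validation code:

```python
# Illustrative guard for the documented compaction_threshold minimum
# (50000 input tokens) used with the compact_20260112 edit.
def validate_compaction_threshold(threshold: int) -> int:
    MIN_THRESHOLD = 50000
    if threshold < MIN_THRESHOLD:
        raise ValueError(
            f"compaction_threshold must be >= {MIN_THRESHOLD}, got {threshold}"
        )
    return threshold
```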

MCP connector support

Anthropic’s official MCP connector can be enabled by adding an MCP toolkit to your toolkits list. The adapter applies MCP middleware to the request, which injects mcp_servers and an MCP toolset into the top-level tools array (and ensures the required betas flag is present).
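The effect of the MCP middleware on a request can be sketched as below. The helper and the exact fields it sets are assumptions for illustration; the entry shape ({"type": "url", ...}) and the mcp-client-2025-04-04 beta flag follow Anthropic's MCP connector documentation, and the adapter's middleware may differ in detail:

```python
# Hypothetical sketch of what MCP middleware injects into a Messages API
# request: an mcp_servers entry plus the required beta flag.
def with_mcp_server(request: dict, name: str, url: str) -> dict:
    request.setdefault("mcp_servers", []).append(
        {"type": "url", "url": url, "name": name}
    )
    betas = request.setdefault("betas", [])
    if "mcp-client-2025-04-04" not in betas:
        betas.append("mcp-client-2025-04-04")
    return request
```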

Handling a turn

When next() is called, the adapter:
  1. Bundles tools - Converts toolkits into Anthropic tool definitions.
  2. Builds the request - Converts the chat context into Anthropic blocks and constructs the request payload.
  3. Calls the model - Sends the request (streaming if event_handler is provided).
  4. Handles tool calls - Executes requested tools and formats tool_result blocks.
  5. Loops - Continues until the model returns a final response.
  6. Returns result - Returns text or validated JSON if output_schema was supplied.
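The tool-calling loop in steps 3-5 can be sketched as a simplified Python function. The helper names (call_model, execute_tool) are hypothetical stand-ins, and this omits streaming, structured-output validation, and error handling; it only illustrates the loop shape, using Anthropic-style tool_use and tool_result content blocks:

```python
# Simplified sketch of the turn loop: call the model, run any requested
# tools, feed back tool_result blocks, and repeat until a final response.
def run_turn(call_model, execute_tool, messages: list) -> str:
    while True:
        response = call_model(messages)
        tool_uses = [b for b in response["content"] if b["type"] == "tool_use"]
        if not tool_uses:
            # Final response: concatenate the text blocks.
            return "".join(
                b["text"] for b in response["content"] if b["type"] == "text"
            )
        # Echo the assistant turn, then answer each tool_use with a tool_result.
        messages.append({"role": "assistant", "content": response["content"]})
        results = [
            {
                "type": "tool_result",
                "tool_use_id": t["id"],
                "content": execute_tool(t["name"], t["input"]),
            }
            for t in tool_uses
        ]
        messages.append({"role": "user", "content": results})
```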