ChatBot builds on SingleRoomAgent to deliver a ready-to-use conversational agent. It joins a MeshAgent room, manages per-participant chat threads, streams responses from your chosen LLM, and persists conversation history so users can pick up where they left off. Toolkit integration, thread storage, and auto-greeting are all handled for you, which makes ChatBot the fastest path to creating a helpful, interactive chat-based agent. ChatBot supports both static toolkits and dynamic toolkits built per message, so you can define tools the agent should always have access to as well as add and remove tools on demand as a conversation progresses.

When to Use It

  • You want to build an agent that is seamlessly connected to the room and responds conversationally to user messages.
  • You’re building a text-based agent that uses an LLM to generate responses.
  • You need to synchronize message history with room documents or threads.
  • You want automatic integration with toolkits and schemas for dynamic LLM behavior.
  • You plan to add tools, dynamic UI interactions, or other agent-to-room interactions during a chat.
  • You prefer to focus on prompts, tools, and business logic for your agent instead of plumbing for storage, messaging, and scale.
If your agent only needs to perform background tasks or doesn’t interact with participants directly, consider subclasses like Worker or TaskRunner instead.

Constructor Parameters

ChatBot accepts everything that SingleRoomAgent does (name, title, description, requires, labels) and adds chat-specific parameters.
| Parameter | Type | Description |
| --- | --- | --- |
| name | str | Unique identifier for the agent within the room. |
| title | str \| None | Human-friendly name shown in UX. Defaults to name. |
| description | str \| None | Optional short description. |
| requires | list[Requirement] \| None | Dependencies such as RequiredSchema or RequiredToolkit. Automatically ensures the "thread" schema exists. |
| llm_adapter | LLMAdapter | Required. Adapter that talks to your model provider (for example OpenAIResponsesAdapter). It supplies chat contexts and translates responses into MeshAgent events. |
| tool_adapter | ToolResponseAdapter \| None | Optional adapter for translating tool call outputs. |
| toolkits | list[Toolkit] \| None | Extra toolkits that are always available to the ChatBot beyond what requires installs. Defaults to []. |
| rules | list[str] \| None | List of system or behavior rules appended to each chat context. |
| auto_greet_message | str \| None | Optional greeting message sent automatically by the agent when a new thread starts. |
| empty_state_title | str \| None | Title text shown in the Studio chat pane before the first user message. Defaults to "How can I help you?". |
| labels | list[str] \| None | Optional tags for discovery and filtering. |
If you do not pass a requires list, ChatBot automatically injects RequiredSchema(name="thread") so it can store transcripts. You can still add additional toolkits or schemas by supplying your own list.
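The default-injection behavior described above can be sketched as follows. This is an illustrative stand-in, not the meshagent implementation: the RequiredSchema stand-in and the build_requires helper are hypothetical names used only to show the rule that an omitted requires list gets the "thread" schema injected automatically.

```python
from dataclasses import dataclass


# Hypothetical stand-in for meshagent's RequiredSchema requirement type.
@dataclass
class RequiredSchema:
    name: str


def build_requires(user_requires):
    """Mimic ChatBot's default: when no requires list is supplied,
    inject the "thread" schema so transcripts can be stored."""
    if user_requires is None:
        return [RequiredSchema(name="thread")]
    # A caller-supplied list is used as-is; include the "thread"
    # schema yourself if you still want transcript storage.
    return list(user_requires)


defaults = build_requires(None)
custom = build_requires([RequiredSchema(name="thread"), RequiredSchema(name="notes")])
```

In a real agent you would simply omit requires for the default behavior, or pass your own list containing RequiredSchema and RequiredToolkit entries.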

Lifecycle Overview

ChatBot inherits the lifecycle from SingleRoomAgent but extends it to manage ongoing conversations and message threads.
  • await start(room: RoomClient): Connects to the room, installs requirements, and registers message handlers. Spawns per-thread tasks to handle participant messages, LLM responses, and tool invocations.
  • await stop(): Cancels active chat threads and clears cached state before disconnecting.
  • room property: Accesses the active RoomClient just like in SingleRoomAgent.
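The start/stop lifecycle above can be sketched with stand-in classes (illustrative names only, not meshagent's internals): start connects and would register handlers, while stop cancels any active per-thread tasks and clears state.

```python
import asyncio


class SketchChatBot:
    """Minimal lifecycle sketch; a real ChatBot also installs
    requirements and registers message handlers in start()."""

    def __init__(self):
        self.room = None
        self._thread_tasks = []

    async def start(self, room):
        # Connect to the room; per-thread tasks are spawned later,
        # as participant messages arrive.
        self.room = room

    async def stop(self):
        # Cancel active chat threads and clear cached state.
        for task in self._thread_tasks:
            task.cancel()
        self._thread_tasks.clear()
        self.room = None


async def demo():
    bot = SketchChatBot()
    await bot.start(room="room-client")
    connected = bot.room
    await bot.stop()
    return connected, bot.room


started, after_stop = asyncio.run(demo())
```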

Conversation Flow

When a participant sends a message:
  1. The ChatBot receives the message via room.messaging.on("message").
  2. It identifies or spawns a chat thread corresponding to that conversation path.
  3. A chat context (AgentChatContext) is initialized using the llm_adapter.
  4. The message history is synchronized with the room’s "thread" document.
  5. ChatBot resolves this turn’s toolkits: it collects ToolkitBuilders from get_thread_toolkit_builders(...), merges per-turn ToolkitConfigs (from MeshAgent Studio/CLI/App UI), calls make_tools(...) to create any dynamic toolkits, and combines the result with any static toolkits.
  6. The LLM is invoked with the prepared chat context and the resolved toolkits; tool calls are routed back into those toolkits, tokens are streamed, and the results are appended to the thread document.
All of this happens automatically once your bot is started.
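The per-thread handling in steps 1–2 can be sketched as a dispatcher that gives each conversation path its own queue and worker task, so messages within one thread are processed in order while separate threads proceed independently. All names here are illustrative, not the meshagent internals.

```python
import asyncio


class ThreadDispatcher:
    """Sketch of per-thread message dispatch: one queue and one
    worker task per conversation path."""

    def __init__(self, handle_message):
        self.handle_message = handle_message
        self.queues = {}
        self.tasks = {}

    def dispatch(self, thread_path, message):
        # Identify or spawn the worker for this conversation path.
        if thread_path not in self.queues:
            self.queues[thread_path] = asyncio.Queue()
            self.tasks[thread_path] = asyncio.create_task(self._worker(thread_path))
        self.queues[thread_path].put_nowait(message)

    async def _worker(self, thread_path):
        while True:
            message = await self.queues[thread_path].get()
            await self.handle_message(thread_path, message)
            self.queues[thread_path].task_done()


async def main():
    log = []

    async def handler(path, msg):
        # In a real bot this is where the chat context is built
        # and the LLM is invoked.
        log.append((path, msg))

    dispatcher = ThreadDispatcher(handler)
    dispatcher.dispatch("threads/alice", "hi")
    dispatcher.dispatch("threads/bob", "hello")
    await asyncio.gather(*(q.join() for q in dispatcher.queues.values()))
    return log


result = asyncio.run(main())
```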

Key Behaviors and Hooks

  • Thread management: Every conversation lives inside a MeshDocument thread. The ChatBot opens the thread as soon as a user sends a message and keeps it in sync with the document store.
  • Context building: init_chat_context() asks the LLM adapter for a fresh context and appends any rules you provided. Override this if you need to preload the context with extra data.
  • Static tool resolution: get_thread_toolkits() gathers toolkits declared in requires, and any always-on toolkits passed through toolkits.
  • Dynamic tool providers: get_thread_toolkit_builders() lists the ToolkitBuilder factories the ChatBot is willing to spin up on demand (MCP, storage uploads, local shell, image generation, etc.). The chat UI calls this hook first to learn which tool toggles to display, and later sends back matching configs so the ChatBot can hydrate those toolkits with make_tools().
  • Message-specific toolkits: When the user sends a message with a tools payload, make_tools() combines the selected configs with the provider list to hydrate just the toolkits needed for that turn before the LLM is invoked. (For example, in MeshAgent Studio, when the user selects which tools to add to an agent, these tools are added to the tools payload so the agent can use them.)
  • Thread helpers: Use open_thread() and close_thread() to work with the underlying document, or load_thread_context() to replay the stored messages into an existing chat context (for example, when resuming a conversation).
  • LLM customization: Override prepare_llm_context() if you need to mutate the context right before calling the LLM (for example to provide additional background context for the agent outside of the system prompt).
  • Participant utilities: get_online_participants() returns the participants included in the thread so you can tailor responses or access their attributes.
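The context-building and prepare_llm_context hooks above can be sketched with stand-in classes (AgentChatContext's interface and the append_system method are assumptions for illustration): the base bot appends its rules when creating a context, and a subclass overrides prepare_llm_context to inject extra background right before the LLM call.

```python
import asyncio


class AgentChatContext:
    """Stand-in for meshagent's chat context."""

    def __init__(self):
        self.messages = []

    def append_system(self, text):
        self.messages.append({"role": "system", "content": text})


class SketchChatBot:
    def __init__(self, rules=None):
        self.rules = rules or []

    async def init_chat_context(self):
        # Fresh context with the configured rules appended.
        ctx = AgentChatContext()
        for rule in self.rules:
            ctx.append_system(rule)
        return ctx

    async def prepare_llm_context(self, ctx):
        return ctx  # default: no changes


class BackgroundAwareBot(SketchChatBot):
    async def prepare_llm_context(self, ctx):
        # Inject extra background outside the system prompt,
        # right before the LLM is invoked.
        ctx.append_system("Background: the user is in Seattle.")
        return ctx


async def demo():
    bot = BackgroundAwareBot(rules=["Be concise."])
    ctx = await bot.init_chat_context()
    ctx = await bot.prepare_llm_context(ctx)
    return [m["content"] for m in ctx.messages]


contents = asyncio.run(demo())
```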

Key Methods

| Method | Description |
| --- | --- |
| async def init_chat_context() | Creates a new chat context using the configured llm_adapter and applies rules. |
| async def get_thread_toolkits(thread_context, participant) | Resolves all always-on toolkits for a participant, including built-in reasoning tools. |
| async def get_thread_toolkit_builders(thread_context, participant) | Returns the ToolkitBuilder list the UI can offer for message-level tool selection. |
| async def open_thread(path) / async def close_thread(path) | Opens or closes a thread document in the room. |
| async def load_thread_context(thread_context) | Loads historical messages into the current chat context from the thread document. |
| async def prepare_llm_context(thread_context) | Hook for modifying the chat context before invoking the LLM. Override as needed. |
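Thread replay via load_thread_context can be sketched like this; the dictionary shape of the stored thread document is an assumption for illustration, not meshagent's actual MeshDocument format.

```python
class AgentChatContext:
    """Stand-in for meshagent's chat context."""

    def __init__(self):
        self.messages = []


def load_thread_context(ctx, thread_document):
    """Replay messages stored in a thread document into a chat
    context, e.g. when resuming a conversation."""
    for message in thread_document["messages"]:
        ctx.messages.append(message)
    return ctx


stored = {
    "messages": [
        {"role": "user", "content": "hi"},
        {"role": "assistant", "content": "hello!"},
    ]
}
resumed = load_thread_context(AgentChatContext(), stored)
```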

Built-in Toolkits and Behaviors

ChatBot automatically provides and manages several toolkits and utilities:
  • Reasoning Toolkit: Allows structured “chain-of-thought” reasoning within the chat context.
  • Built-in Tool Support (OpenAI-style): May include web_search, local_shell, or image_gen, depending on the LLM adapter.
  • UI Toolkit Integration: Supports sending messages, showing toasts, or interacting with user interfaces via the “ui” toolkit.
These can be extended or replaced with your own toolkit factories.

Beginner-Friendly Examples

import asyncio
from meshagent.otel import otel_config
from meshagent.api.services import ServiceHost
from meshagent.agents.chat import ChatBot
from meshagent.openai import OpenAIResponsesAdapter

# Enable OpenTelemetry logging and tracing for the agent
otel_config(service_name="chatbot")

# Create a service host
service = ServiceHost()  # passing a port is optional; MeshAgent automatically assigns an available one if none is provided

# Register an agent at a specific path
@service.path(path="/chat", identity="chatbot")
class SimpleChatbot(ChatBot):
    def __init__(self):
        super().__init__(
            name="chatbot",
            title="Simple Chatbot",
            description="A helpful chatbot for room participants",
            rules=[
                "Always respond to the user first then include a fun fact at the end of your response."
            ],
            llm_adapter=OpenAIResponsesAdapter(),
        )

# Start the service
asyncio.run(service.run())

This subclass uses the OpenAI Responses API and applies a system prompt. You can call the ChatBot into a room directly from your terminal by running:
meshagent service run "main.py" --room=gettingstarted
Next, from MeshAgent Studio, go to the Sessions tab, open the gettingstarted room, and begin chatting with your agent. To extend the ChatBot with tools, see the next article on building a chat agent.

MeshAgent Studio and the Flutter SDK include built-in chat UI components (such as ChatThreadLoader) that understand ChatBot threads. This means any conversation started in the Studio can be displayed properly or continued inside your own application without extra integration work.

Next Steps

ChatBot builds directly on SingleRoomAgent, inheriting its lifecycle management, requirement installation, and toolkit resolution. While SingleRoomAgent focuses on room connection and environment setup, ChatBot handles conversation orchestration — managing threads, participants, messages, and reasoning. To continue learning about MeshAgent agents check out:
  • Build a ChatBot: Step-by-step guide to configuring adapters, adding toolkits, and running your ChatBot in a MeshAgent room.
  • Adapters: Understand how LLM adapters and tool response adapters plug into the chat loop.
  • VoiceBot: Adds streaming audio input/output for speech interactions.
  • Worker and TaskRunner: Perform background or task-based actions.
  • MailWorker: Interact with an agent via email.
Use ChatBot whenever your agent's primary interface is text-based conversation and you need a foundation for more specialized conversational agents.