In addition to MeshAgent Tools, agents can also use external tools through OpenAI Connectors and Model Context Protocol (MCP) servers.
  • OpenAI Connectors: OpenAI-maintained MCP wrappers for third-party services that don’t have official MCP servers (Gmail, Google Drive, Outlook, Microsoft Teams, Dropbox, etc.). Using these connectors requires OAuth client registration and authorization with the provider.
  • Remote MCP servers: Any MCP server operated by you or a third party. You provide the server URL and any required authentication details, and the agent can call the server’s MCP tools.

Using OpenAI Connectors & MCP Tools with Agents

The simplest way to give your agent access to MCP tools is to pass them directly to the toolkits parameter when creating the agent (see the example below). This ensures the agent always has access to the tools. For more advanced use cases, you can also let users dynamically enable/disable specific connectors per message; see Dynamic MCP Tools for details. In both cases, MeshAgent talks to OpenAI via the OpenAIResponsesAdapter. This adapter gathers the toolkits available to the agent on that turn, manages tool execution, streams responses, and returns the final result.

Example: Create an Agent with MCP Tools

Let’s create a ChatBot that can use the public DeepWiki MCP Server, which does not require an access token or OAuth. For this ChatBot we’ll define:
  • A Toolkit: a collection of related tools the agent can use
  • An MCPTool: connects to an MCP server
  • MCPConfig: configuration for the MCP connection
  • MCPServer: the actual server details (URL, authentication details)

import asyncio
import logging
from meshagent.tools import Toolkit
from meshagent.agents.chat import ChatBot
from meshagent.openai import OpenAIResponsesAdapter
from meshagent.openai.tools.responses_adapter import MCPConfig, MCPServer, MCPTool
from meshagent.api.services import ServiceHost
from meshagent.otel import otel_config

otel_config(service_name="mcp-deepwiki-chatbot")
log = logging.getLogger(__name__)
service = ServiceHost()

@service.path(path="/agent", identity="mcp-deepwiki-chatbot")
class MCPChatbot(ChatBot):
    def __init__(self):
        super().__init__(
            name="mcp-deepwiki-chatbot",
            title="ChatBot that can use DeepWiki MCP Server",
            description="An agent that can use the DeepWiki MCP server",
            rules=["you are an assistant for trying out MCP servers"],
            llm_adapter=OpenAIResponsesAdapter(
                parallel_tool_calls=True
            ),
            auto_greet_message="What can I help you with?",
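            # Toolkits passed here are available to the agent on every turn.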
            toolkits=[
                Toolkit(
                    name="mcp-deepwiki",
                    tools=[
                        MCPTool(
                            config=MCPConfig(
                                name="mcp",
                                servers=[
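                                    # Public DeepWiki MCP server; no access token or OAuth required.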
                                    MCPServer(
                                        server_label="mcp",
                                        server_url="https://mcp.deepwiki.com/mcp"
                                    )
                                ]
                            )
                        )
                    ]
                )
            ],
            labels=["tasks", "mcp"],
        )

asyncio.run(service.run())

Run it locally and try it in MeshAgent Studio:
bash
meshagent service run "main.py" --room=myroom
From MeshAgent Studio, enter myroom, and start talking to the agent. Ask it about public GitHub repositories and it will use the DeepWiki MCP tools to respond. For example, you can ask “What’s in the README on the Pydantic AI repository?”.
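
DeepWiki is public, so no credentials were needed above. For a remote MCP server that does require authentication, the server definition also needs to carry the credentials. The sketch below is a hedged variation of the toolkit from the example: the internal URL and token are placeholders, and the headers argument is an assumption about how MCPServer passes authentication to the underlying OpenAI MCP tool, so check the MeshAgent reference for the exact parameter names before relying on it.

# Hypothetical sketch: a toolkit for an MCP server that requires a bearer token.
# Imports are the same as in the example above. The URL and token are placeholders,
# and the `headers` argument is an assumption; verify which authentication
# parameters MCPServer actually accepts before deploying this.
authenticated_toolkit = Toolkit(
    name="mcp-internal",
    tools=[
        MCPTool(
            config=MCPConfig(
                name="mcp",
                servers=[
                    MCPServer(
                        server_label="internal-kb",
                        server_url="https://mcp.example.com/mcp",  # placeholder URL
                        headers={"Authorization": "Bearer YOUR_TOKEN"},  # assumed parameter
                    )
                ],
            )
        )
    ],
)

You would then pass authenticated_toolkit in the toolkits list exactly as in the DeepWiki example.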

Package and deploy the agent

Once satisfied with the agent, we can package and deploy it as a project service so it’s automatically available to all the rooms in a project, or as a room service so it’s always available in a specific room. To do so, we’ll create a Dockerfile, build and push the image, create a YAML file that defines the service, and then deploy it with the MeshAgent CLI.

Step 1: Create the Dockerfile
Dockerfile
FROM meshagent/python-sdk-slim:latest
COPY . /src
WORKDIR /src
ENTRYPOINT [ "python3", "main.py" ]
Step 2: Build and Push the Image
docker buildx build . \
  -t "$IMAGE:$TAG" \
  --platform linux/amd64 \
  --push
Step 3: Define the Service
yaml
kind: Service
version: v1
metadata:
  name: mcp-deepwiki-chatbot
  description: "Expose a ChatBot with access to DeepWiki MCP server"
ports:
- num: "*" 
  type: http
  liveness: "/"
  endpoints:
  - path: /mcp-deepwiki-chatbot
    meshagent:
      identity: "mcp-deepwiki-chatbot"
container:
  image: "IMAGE:TAG"
Step 4: Deploy
bash
meshagent service create --file "meshagent.yaml" # optional --room=myroom 
Step 5: Use Your Deployed Agent
Next, open MeshAgent Studio, go to a room where the service is deployed, and begin talking to the ChatBot.

Conclusion

OpenAI Connectors and MCP Servers extend your agents with third-party capabilities. When adding these tools, consider:
  • What data you’re sharing with external services
  • Who operates the MCP servers you connect to
  • Whether agents should require approval before using certain tools (see the sketch after this list)
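
For the approval point, the underlying OpenAI MCP tool supports a require_approval setting that makes the model pause for approval before calling a server’s tools. Whether MeshAgent’s MCPServer exposes it under the same name is an assumption; the sketch below only illustrates the idea, so verify the parameter name and accepted values against the MeshAgent reference.

# Hypothetical sketch: ask for approval before any tool on this server is called.
# require_approval is a setting of OpenAI's hosted MCP tool; whether MCPServer
# forwards it under this name is an assumption to verify before relying on it.
guarded_server = MCPServer(
    server_label="mcp",
    server_url="https://mcp.deepwiki.com/mcp",
    require_approval="always",  # assumed parameter name; "never" would skip approvals
)
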
Use always-available toolkits (as in this example) when agents need consistent access to specific capabilities, such as a support agent with internal ticketing tools. Use dynamic toolkits when you want users to control which tools are available per message; this is particularly helpful if you are building a ChatGPT/Claude-style experience where users can toggle tools on and off. See Dynamic Toolkits for details.