This adapter is an `LLMAdapter` implementation. It enables MeshAgent agents to use the OpenAI Responses API, handling streaming, tool calls, and model-specific settings.
Key features
- Model defaults: Reads the model name from the constructor (`model=`) or the `OPENAI_MODEL` environment variable. Override per message by passing `model` in the chat payload; `ChatBot` forwards it to `next()`.
- System role selection: Adjusts the initial chat role (`system`, `developer`, etc.) based on the model name (o1, o3, computer-use, etc.).
- Tool bundling: Converts the supplied toolkits into OpenAI tool definitions (both standard JSON function tools and OpenAI-native tools like `computer_use_preview`, `web_search_preview`, `image_generation`).
- Streaming support: Consumes the streaming Responses API, emitting events such as reasoning summaries, partial content, and tool call updates.
- Parallel tool calls: Optionally enables OpenAI's `parallel_tool_calls` setting (disabled automatically for models that do not support it).
Constructor parameters
- `model` – default model name; can be overridden per message.
- `parallel_tool_calls` – request parallel tool execution when supported.
- `client` – reuse an existing `AsyncOpenAI` client; otherwise the adapter creates one via `meshagent.openai.proxy.get_client`.
- `response_options` – extra parameters passed to `responses.create`.
- `reasoning_effort` – populates the Responses API `reasoning` options.
- `provider` – label emitted in telemetry and logs.
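For orientation, here is a minimal construction sketch. The parameter names come from the list above; the import path and the `OpenAIResponsesAdapter` class name are assumptions to verify against your installed `meshagent-openai` package.

```python
from openai import AsyncOpenAI

# Assumed class name and import path; only the constructor parameters below
# come from the documented list.
from meshagent.openai import OpenAIResponsesAdapter

adapter = OpenAIResponsesAdapter(
    model="gpt-4.1",                        # default model; OPENAI_MODEL is used if omitted
    parallel_tool_calls=True,               # request parallel tool execution when supported
    client=AsyncOpenAI(),                   # optional; omit to let the adapter create one via get_client
    response_options={"max_output_tokens": 2048},  # extra kwargs forwarded to responses.create
    reasoning_effort="medium",              # populates the Responses API reasoning options
    provider="openai",                      # label emitted in telemetry and logs
)
```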
Tool provider integration
The adapter includes several builders and tools for OpenAI-native tools. Agents can use them directly, or override them with agent-specific wrappers that add persistence (for example, the ChatBot's thread-aware image generation builder that saves partial/final images to room storage and updates the thread document).

- Image generation – `ImageGenerationConfig`, `ImageGenerationToolkitBuilder`, `ImageGenerationTool`
- Local shell – `LocalShellConfig`, `LocalShellToolkitBuilder`, `LocalShellTool`
- MCP – `MCPConfig`, `MCPToolkitBuilder`, `MCPTool`
- Web search preview – `WebSearchConfig`, `WebSearchToolkitBuilder`, `WebSearchTool`
- File search – `FileSearchTool`
- Code interpreter – `CodeInterpreterTool`
- Reasoning – `ReasoningTool`
Note: The adapter doesn't "auto-register" these builders by default. Your agent decides which builders to expose each turn (e.g., `ChatBot.get_thread_toolkit_builders(...)`). This lets `ChatBot` substitute its thread-aware wrappers (like `ChatBotThreadOpenAIImageGenerationToolkitBuilder`).
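For example, a `ChatBot` subclass might expose the web-search builder on every turn. This is a hypothetical sketch: the import paths, the override signature, and the builder constructor arguments are assumptions to check against your MeshAgent version.

```python
# Hypothetical sketch; import paths and signatures are assumptions.
from meshagent.agents.chat import ChatBot
from meshagent.openai import WebSearchConfig, WebSearchToolkitBuilder

class SearchChatBot(ChatBot):
    async def get_thread_toolkit_builders(self, *args, **kwargs):
        # Start from whatever ChatBot would normally expose for this turn...
        builders = await super().get_thread_toolkit_builders(*args, **kwargs)
        # ...then add the OpenAI web_search_preview tool via its builder
        # (constructor arguments shown here are illustrative).
        return [*builders, WebSearchToolkitBuilder(WebSearchConfig())]
```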
Handling a turn
When `next()` is called, it:
- Bundles tools - Collects the tools from your toolkits and packages them for OpenAI’s API
 - Calls the model - Sends messages and tools to OpenAI’s API
 - Handles responses - Processes text, tool calls, or structured output
 - Executes tools - When the model requests tools, executes them and formats results
 - Loops - Continues calling the model with tool results until it produces a final answer
 - Returns the result - Gives you the final output (text or structured data)
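The adapter drives this loop for you. For intuition only, here is a stand-alone sketch of the same pattern written directly against the OpenAI Responses API; the model name and the `get_time` tool are illustrative, and the real adapter also streams events and handles OpenAI-native tools.

```python
import asyncio
from datetime import datetime, timezone
from openai import AsyncOpenAI

client = AsyncOpenAI()

# One plain JSON function tool; the adapter builds these definitions from your toolkits.
TOOLS = [{
    "type": "function",
    "name": "get_time",
    "description": "Return the current UTC time.",
    "parameters": {"type": "object", "properties": {}},
}]

async def run(prompt: str) -> str:
    messages = [{"role": "user", "content": prompt}]
    while True:
        response = await client.responses.create(model="gpt-4.1", input=messages, tools=TOOLS)
        calls = [item for item in response.output if item.type == "function_call"]
        if not calls:
            return response.output_text                   # final answer: no more tool calls
        messages += response.output                        # keep the model's tool-call items in context
        for call in calls:                                 # execute each requested tool
            result = datetime.now(timezone.utc).isoformat() if call.name == "get_time" else "unknown tool"
            messages.append({"type": "function_call_output",  # feed the result back to the model
                             "call_id": call.call_id,
                             "output": result})

print(asyncio.run(run("What time is it right now?")))
```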
 
Related Topics
- LLM Adapters: Base interface and lifecycle.
 - OpenAI Tool Response Adapter: How tool outputs are rendered back into the chat transcript.
 - ChatBot Overview: Shows where the adapter is invoked in the overall agent flow.