autogen_ext.tools.mcp#
- create_mcp_server_session(server_params: Annotated[StdioServerParams | SseServerParams | StreamableHttpServerParams, FieldInfo(annotation=NoneType, required=True, discriminator='type')], sampling_callback: SamplingFnT | None = None) AsyncGenerator[ClientSession, None][source]#
Create an MCP client session for the given server parameters.
- class McpSessionActor(server_params: Annotated[StdioServerParams | SseServerParams | StreamableHttpServerParams, FieldInfo(annotation=NoneType, required=True, discriminator='type')], model_client: ChatCompletionClient | None = None)[source]#
Bases: ComponentBase[BaseModel], Component[McpSessionActorConfig]
- component_type: ClassVar[ComponentType] = 'mcp_session_actor'#
The logical type of the component.
- component_config_schema#
Alias of McpSessionActorConfig
- component_provider_override: ClassVar[str | None] = 'autogen_ext.tools.mcp.McpSessionActor'#
Override the provider string for the component. This should be used to prevent internal module names from becoming part of the module name.
- server_params: Annotated[StdioServerParams | SseServerParams | StreamableHttpServerParams, FieldInfo(annotation=NoneType, required=True, discriminator='type')]#
- async call(type: str, args: McpActorArgs | None = None) Future[Coroutine[Any, Any, ListToolsResult] | Coroutine[Any, Any, CallToolResult] | Coroutine[Any, Any, ListPromptsResult] | Coroutine[Any, Any, ListResourcesResult] | Coroutine[Any, Any, ListResourceTemplatesResult] | Coroutine[Any, Any, ReadResourceResult] | Coroutine[Any, Any, GetPromptResult]][source]#
- class StdioMcpToolAdapter(server_params: StdioServerParams, tool: Tool, session: ClientSession | None = None)[source]#
Bases: McpToolAdapter[StdioServerParams], Component[StdioMcpToolAdapterConfig]
Allows you to wrap an MCP tool running over STDIO and make it available to AutoGen.
This adapter enables using MCP-compatible tools that communicate over standard input/output with AutoGen agents. Common use cases include wrapping command-line tools and local services that implement the Model Context Protocol (MCP).
Note
To use this class, you need to install the mcp extra for the autogen-ext package.
pip install -U "autogen-ext[mcp]"
- Parameters:
server_params (StdioServerParams) – Parameters for the MCP server connection, including the command to run and its arguments
tool (Tool) – The MCP tool to wrap
session (ClientSession, optional) – The MCP client session to use. If not provided, a new session will be created. This is useful for testing or when you want to manage the session lifecycle yourself.
See also
mcp_server_tools() for examples.
- component_config_schema#
Alias of StdioMcpToolAdapterConfig
- pydantic model StdioServerParams[source]#
Bases: StdioServerParameters
Parameters for connecting to an MCP server over STDIO.
Show JSON schema
{
  "title": "StdioServerParams",
  "description": "Parameters for connecting to an MCP server over STDIO.",
  "type": "object",
  "properties": {
    "command": {"title": "Command", "type": "string"},
    "args": {"items": {"type": "string"}, "title": "Args", "type": "array"},
    "env": {
      "anyOf": [
        {"additionalProperties": {"type": "string"}, "type": "object"},
        {"type": "null"}
      ],
      "default": null,
      "title": "Env"
    },
    "cwd": {
      "anyOf": [{"type": "string"}, {"format": "path", "type": "string"}, {"type": "null"}],
      "default": null,
      "title": "Cwd"
    },
    "encoding": {"default": "utf-8", "title": "Encoding", "type": "string"},
    "encoding_error_handler": {
      "default": "strict",
      "enum": ["strict", "ignore", "replace"],
      "title": "Encoding Error Handler",
      "type": "string"
    },
    "type": {
      "const": "StdioServerParams",
      "default": "StdioServerParams",
      "title": "Type",
      "type": "string"
    },
    "read_timeout_seconds": {"default": 5, "title": "Read Timeout Seconds", "type": "number"}
  },
  "required": ["command"]
}
- Fields:
read_timeout_seconds (float)
type (Literal['StdioServerParams'])
- class SseMcpToolAdapter(server_params: SseServerParams, tool: Tool, session: ClientSession | None = None)[source]#
Bases: McpToolAdapter[SseServerParams], Component[SseMcpToolAdapterConfig]
Allows you to wrap an MCP tool running over Server-Sent Events (SSE) and make it available to AutoGen.
This adapter enables using MCP-compatible tools that communicate over HTTP with SSE with AutoGen agents. Common use cases include integrating with remote MCP services, cloud-based tools, and web APIs that implement the Model Context Protocol (MCP).
Note
To use this class, you need to install the mcp extra for the autogen-ext package.
pip install -U "autogen-ext[mcp]"
- Parameters:
server_params (SseServerParams) – Parameters for the MCP server connection, including the URL, headers, and timeouts.
tool (Tool) – The MCP tool to wrap.
session (ClientSession, optional) – The MCP client session to use. If not provided, a new session will be created. This is useful for testing or when you want to manage the session lifecycle yourself.
Example
Create a tool for a remote translation service that implements MCP over SSE, allowing an AutoGen agent to perform translations:
import asyncio
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.mcp import SseMcpToolAdapter, SseServerParams
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.ui import Console
from autogen_core import CancellationToken


async def main() -> None:
    # Create server params for the remote MCP service
    server_params = SseServerParams(
        url="https://api.example.com/mcp",
        headers={"Authorization": "Bearer your-api-key", "Content-Type": "application/json"},
        timeout=30,  # Connection timeout in seconds
    )

    # Get the translation tool from the server
    adapter = await SseMcpToolAdapter.from_server_params(server_params, "translate")

    # Create an agent that can use the translation tool
    model_client = OpenAIChatCompletionClient(model="gpt-4")
    agent = AssistantAgent(
        name="translator",
        model_client=model_client,
        tools=[adapter],
        system_message="You are a helpful translation assistant.",
    )

    # Let the agent translate some text
    await Console(
        agent.run_stream(task="Translate 'Hello, how are you?' to Spanish", cancellation_token=CancellationToken())
    )


if __name__ == "__main__":
    asyncio.run(main())
- component_config_schema#
Alias of SseMcpToolAdapterConfig
- pydantic model SseServerParams[source]#
Bases: BaseModel
Parameters for connecting to an MCP server over SSE.
Show JSON schema
{
  "title": "SseServerParams",
  "description": "Parameters for connecting to an MCP server over SSE.",
  "type": "object",
  "properties": {
    "type": {
      "const": "SseServerParams",
      "default": "SseServerParams",
      "title": "Type",
      "type": "string"
    },
    "url": {"title": "Url", "type": "string"},
    "headers": {
      "anyOf": [{"type": "object"}, {"type": "null"}],
      "default": null,
      "title": "Headers"
    },
    "timeout": {"default": 5, "title": "Timeout", "type": "number"},
    "sse_read_timeout": {"default": 300, "title": "Sse Read Timeout", "type": "number"}
  },
  "required": ["url"]
}
- Fields:
headers (dict[str, Any] | None)
sse_read_timeout (float)
timeout (float)
type (Literal['SseServerParams'])
url (str)
- class StreamableHttpMcpToolAdapter(server_params: StreamableHttpServerParams, tool: Tool, session: ClientSession | None = None)[source]#
Bases: McpToolAdapter[StreamableHttpServerParams], Component[StreamableHttpMcpToolAdapterConfig]
Allows you to wrap an MCP tool running over Streamable HTTP and make it available to AutoGen.
This adapter enables using MCP-compatible tools that communicate over Streamable HTTP with AutoGen agents. Common use cases include integrating with remote MCP services, cloud-based tools, and web APIs that implement the Model Context Protocol (MCP).
Note
To use this class, you need to install the mcp extra for the autogen-ext package.
pip install -U "autogen-ext[mcp]"
- Parameters:
server_params (StreamableHttpServerParams) – Parameters for the MCP server connection, including the URL, headers, and timeouts.
tool (Tool) – The MCP tool to wrap.
session (ClientSession, optional) – The MCP client session to use. If not provided, a new session will be created. This is useful for testing or when you want to manage the session lifecycle yourself.
Example
Create a tool for a remote translation service that implements MCP over Streamable HTTP, allowing an AutoGen agent to perform translations:
import asyncio
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.mcp import StreamableHttpMcpToolAdapter, StreamableHttpServerParams
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.ui import Console
from autogen_core import CancellationToken


async def main() -> None:
    # Create server params for the remote MCP service
    server_params = StreamableHttpServerParams(
        url="https://api.example.com/mcp",
        headers={"Authorization": "Bearer your-api-key", "Content-Type": "application/json"},
        timeout=30.0,  # HTTP timeout in seconds
        sse_read_timeout=300.0,  # SSE read timeout in seconds (5 minutes)
        terminate_on_close=True,
    )

    # Get the translation tool from the server
    adapter = await StreamableHttpMcpToolAdapter.from_server_params(server_params, "translate")

    # Create an agent that can use the translation tool
    model_client = OpenAIChatCompletionClient(model="gpt-4")
    agent = AssistantAgent(
        name="translator",
        model_client=model_client,
        tools=[adapter],
        system_message="You are a helpful translation assistant.",
    )

    # Let the agent translate some text
    await Console(
        agent.run_stream(task="Translate 'Hello, how are you?' to Spanish", cancellation_token=CancellationToken())
    )


if __name__ == "__main__":
    asyncio.run(main())
- component_config_schema#
Alias of StreamableHttpMcpToolAdapterConfig
- pydantic model StreamableHttpServerParams[source]#
Bases: BaseModel
Parameters for connecting to an MCP server over Streamable HTTP.
Show JSON schema
{
  "title": "StreamableHttpServerParams",
  "description": "Parameters for connecting to an MCP server over Streamable HTTP.",
  "type": "object",
  "properties": {
    "type": {
      "const": "StreamableHttpServerParams",
      "default": "StreamableHttpServerParams",
      "title": "Type",
      "type": "string"
    },
    "url": {"title": "Url", "type": "string"},
    "headers": {
      "anyOf": [{"type": "object"}, {"type": "null"}],
      "default": null,
      "title": "Headers"
    },
    "timeout": {"default": 30.0, "title": "Timeout", "type": "number"},
    "sse_read_timeout": {"default": 300.0, "title": "Sse Read Timeout", "type": "number"},
    "terminate_on_close": {"default": true, "title": "Terminate On Close", "type": "boolean"}
  },
  "required": ["url"]
}
- Fields:
headers (dict[str, Any] | None)
sse_read_timeout (float)
terminate_on_close (bool)
timeout (float)
type (Literal['StreamableHttpServerParams'])
url (str)
- async mcp_server_tools(server_params: Annotated[StdioServerParams | SseServerParams | StreamableHttpServerParams, FieldInfo(annotation=NoneType, required=True, discriminator='type')], session: ClientSession | None = None) list[StdioMcpToolAdapter | SseMcpToolAdapter | StreamableHttpMcpToolAdapter][source]#
Create a list of MCP tool adapters that can be used with AutoGen agents.
Warning
Only connect to trusted MCP servers, especially when using StdioServerParams, as it executes commands in your local environment.
This factory function connects to an MCP server and returns adapters for all available tools. The adapters can be assigned directly to an AutoGen agent's tools list.
Note
To use this function, you need to install the mcp extra for the autogen-ext package.
pip install -U "autogen-ext[mcp]"
- Parameters:
server_params (McpServerParams) – Connection parameters for the MCP server. Can be StdioServerParams for command-line tools, or SseServerParams or StreamableHttpServerParams for HTTP/SSE services.
session (ClientSession | None) – Optional existing session to use. Use this when you want to reuse an existing connection to the MCP server. The session will be reused when creating the MCP tool adapters.
- Returns:
list[StdioMcpToolAdapter | SseMcpToolAdapter | StreamableHttpMcpToolAdapter] – A list of tool adapters that can be used with AutoGen agents.
Examples
Local file system MCP service over standard I/O:
Install the filesystem server package from npm (requires Node.js 16+ and npm).
npm install -g @modelcontextprotocol/server-filesystem
Create an agent that can use all tools from the local filesystem MCP server.
import asyncio
from pathlib import Path
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.mcp import StdioServerParams, mcp_server_tools
from autogen_agentchat.agents import AssistantAgent
from autogen_core import CancellationToken


async def main() -> None:
    # Setup server params for local filesystem access
    desktop = str(Path.home() / "Desktop")
    server_params = StdioServerParams(
        command="npx.cmd", args=["-y", "@modelcontextprotocol/server-filesystem", desktop]
    )

    # Get all available tools from the server
    tools = await mcp_server_tools(server_params)

    # Create an agent that can use all the tools
    agent = AssistantAgent(
        name="file_manager",
        model_client=OpenAIChatCompletionClient(model="gpt-4"),
        tools=tools,  # type: ignore
    )

    # The agent can now use any of the filesystem tools
    await agent.run(task="Create a file called test.txt with some content", cancellation_token=CancellationToken())


if __name__ == "__main__":
    asyncio.run(main())
Local fetch MCP service over standard I/O:
Install the mcp-server-fetch package.
pip install mcp-server-fetch
Create an agent that can use the fetch tool from the local MCP server.
import asyncio
from autogen_agentchat.agents import AssistantAgent
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.mcp import StdioServerParams, mcp_server_tools


async def main() -> None:
    # Get the fetch tool from mcp-server-fetch.
    fetch_mcp_server = StdioServerParams(command="uvx", args=["mcp-server-fetch"])
    tools = await mcp_server_tools(fetch_mcp_server)

    # Create an agent that can use the fetch tool.
    model_client = OpenAIChatCompletionClient(model="gpt-4o")
    agent = AssistantAgent(name="fetcher", model_client=model_client, tools=tools, reflect_on_tool_use=True)  # type: ignore

    # Let the agent fetch the content of a URL and summarize it.
    result = await agent.run(task="Summarize the content of https://en.wikipedia.org/wiki/Seattle")
    print(result.messages[-1])


asyncio.run(main())
Sharing an MCP client session across multiple tools:
You can create a single MCP client session and share it across multiple tools. This is sometimes needed when the server maintains session state (e.g., browser state) that should be reused across multiple requests.
The following example shows how to create a single MCP client session to a local Playwright server and share it across multiple tools.
import asyncio
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.conditions import TextMentionTermination
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_agentchat.ui import Console
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.mcp import StdioServerParams, create_mcp_server_session, mcp_server_tools


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4o", parallel_tool_calls=False)  # type: ignore
    params = StdioServerParams(
        command="npx",
        args=["@playwright/mcp@latest"],
        read_timeout_seconds=60,
    )
    async with create_mcp_server_session(params) as session:
        await session.initialize()
        tools = await mcp_server_tools(server_params=params, session=session)
        print(f"Tools: {[tool.name for tool in tools]}")

        agent = AssistantAgent(
            name="Assistant",
            model_client=model_client,
            tools=tools,  # type: ignore
        )
        termination = TextMentionTermination("TERMINATE")
        team = RoundRobinGroupChat([agent], termination_condition=termination)
        await Console(
            team.run_stream(
                task="Go to https://ekzhu.com/, visit the first link in the page, then tell me about the linked page."
            )
        )


asyncio.run(main())
Remote MCP service over SSE:
from autogen_agentchat.agents import AssistantAgent
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.mcp import SseServerParams, mcp_server_tools


async def main() -> None:
    # Setup server params for remote service
    server_params = SseServerParams(url="https://api.example.com/mcp", headers={"Authorization": "Bearer token"})

    # Get all available tools
    tools = await mcp_server_tools(server_params)

    # Create an agent with all tools
    agent = AssistantAgent(name="tool_user", model_client=OpenAIChatCompletionClient(model="gpt-4"), tools=tools)  # type: ignore
For more examples and detailed usage, see the samples directory in the package repository.
- class McpWorkbench(server_params: Annotated[StdioServerParams | SseServerParams | StreamableHttpServerParams, FieldInfo(annotation=NoneType, required=True, discriminator='type')], tool_overrides: Dict[str, ToolOverride] | None = None, model_client: ChatCompletionClient | None = None)[source]#
Bases: Workbench, Component[McpWorkbenchConfig]
A workbench that wraps an MCP server and provides an interface for listing and calling the tools the server provides.
Warning
Only connect to trusted MCP servers, especially when using StdioServerParams, as it executes commands in your local environment.
This workbench should be used as a context manager to ensure proper initialization and cleanup of the underlying MCP session.
MCP Support#
MCP Capability       Supported Features
Tools                list_tools, call_tool
Resources            list_resources, read_resource
ResourceTemplates    list_resource_templates, read_resource_template
Prompts              list_prompts, get_prompt
Sampling             Optionally supported via model_client
Roots                Not supported
Elicitation          Not supported
- Parameters:
server_params (McpServerParams) – Parameters for connecting to the MCP server. Can be StdioServerParams, SseServerParams, or StreamableHttpServerParams.
tool_overrides (Optional[Dict[str, ToolOverride]]) – Optional mapping from original tool names to override configurations for the tool's name and/or description. This allows customizing how server tools appear to consumers while preserving the underlying tool functionality.
model_client – Optional chat completion client used to handle sampling requests from MCP servers that support the sampling capability. This allows MCP servers to request text generation from a language model during tool execution. If not provided, sampling requests will return an error.
- Raises:
ValueError – If there are conflicts in tool override names.
Examples
Here is a simple example of using the workbench with the mcp-server-fetch server:
import asyncio
from autogen_ext.tools.mcp import McpWorkbench, StdioServerParams


async def main() -> None:
    params = StdioServerParams(
        command="uvx",
        args=["mcp-server-fetch"],
        read_timeout_seconds=60,
    )

    # You can also use `start()` and `stop()` to manage the session.
    async with McpWorkbench(server_params=params) as workbench:
        tools = await workbench.list_tools()
        print(tools)
        result = await workbench.call_tool(tools[0]["name"], {"url": "https://github.com/"})
        print(result)


asyncio.run(main())
Example with tool overrides:
import asyncio
from autogen_ext.tools.mcp import McpWorkbench, StdioServerParams
from autogen_core.tools import ToolOverride


async def main() -> None:
    params = StdioServerParams(
        command="uvx",
        args=["mcp-server-fetch"],
        read_timeout_seconds=60,
    )

    # Override the fetch tool's name and description
    overrides = {
        "fetch": ToolOverride(name="web_fetch", description="Enhanced web fetching tool with better error handling")
    }

    async with McpWorkbench(server_params=params, tool_overrides=overrides) as workbench:
        tools = await workbench.list_tools()
        # The tool will now appear as "web_fetch" with the new description
        print(tools)
        # Call the overridden tool
        result = await workbench.call_tool("web_fetch", {"url": "https://github.com/"})
        print(result)


asyncio.run(main())
Example of using the workbench with the GitHub MCP server:
import asyncio
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.ui import Console
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.mcp import McpWorkbench, StdioServerParams


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")
    server_params = StdioServerParams(
        command="docker",
        args=[
            "run",
            "-i",
            "--rm",
            "-e",
            "GITHUB_PERSONAL_ACCESS_TOKEN",
            "ghcr.io/github/github-mcp-server",
        ],
        env={
            "GITHUB_PERSONAL_ACCESS_TOKEN": "ghp_XXXXXXXXXXXXXXXXXXXXXXXXXXXXXX",
        },
    )
    async with McpWorkbench(server_params) as mcp:
        agent = AssistantAgent(
            "github_assistant",
            model_client=model_client,
            workbench=mcp,
            reflect_on_tool_use=True,
            model_client_stream=True,
        )
        await Console(agent.run_stream(task="Is there a repository named Autogen"))


asyncio.run(main())
Example of using the workbench with the Playwright MCP server:
# First run `npm install -g @playwright/mcp@latest` to install the MCP server.
import asyncio
from autogen_agentchat.agents import AssistantAgent
from autogen_agentchat.teams import RoundRobinGroupChat
from autogen_agentchat.conditions import TextMessageTermination
from autogen_agentchat.ui import Console
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_ext.tools.mcp import McpWorkbench, StdioServerParams


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4.1-nano")
    server_params = StdioServerParams(
        command="npx",
        args=[
            "@playwright/mcp@latest",
            "--headless",
        ],
    )
    async with McpWorkbench(server_params) as mcp:
        agent = AssistantAgent(
            "web_browsing_assistant",
            model_client=model_client,
            workbench=mcp,
            model_client_stream=True,
        )
        team = RoundRobinGroupChat(
            [agent],
            termination_condition=TextMessageTermination(source="web_browsing_assistant"),
        )
        await Console(team.run_stream(task="Find out how many contributors for the microsoft/autogen repository"))


asyncio.run(main())
- component_provider_override: ClassVar[str | None] = 'autogen_ext.tools.mcp.McpWorkbench'#
Override the provider string for the component. This should be used to prevent internal module names from becoming part of the module name.
- component_config_schema#
Alias of McpWorkbenchConfig
- property server_params: Annotated[StdioServerParams | SseServerParams | StreamableHttpServerParams, FieldInfo(annotation=NoneType, required=True, discriminator='type')]#
- async list_tools() List[ToolSchema][source]#
List the currently available tools in the workbench as ToolSchema objects.
The list of tools can be dynamic, and its contents may change after tool execution.
- async call_tool(name: str, arguments: Mapping[str, Any] | None = None, cancellation_token: CancellationToken | None = None, call_id: str | None = None) ToolResult[source]#
Call a tool in the workbench.
- Parameters:
name (str) – The name of the tool to call.
arguments (Mapping[str, Any] | None) – The arguments to pass to the tool. If None, the tool will be called with no arguments.
cancellation_token (CancellationToken | None) – An optional cancellation token to cancel the tool execution.
call_id (str | None) – An optional identifier for the tool call, used for tracing.
- Returns:
ToolResult – The result of the tool execution.