autogen_ext.models.openai#
- class OpenAIChatCompletionClient(**kwargs: Unpack)[source]#
Bases: BaseOpenAIChatCompletionClient, Component[OpenAIClientConfigurationConfigModel]
Chat completion client for OpenAI hosted models.
To use this client, you must install the openai extension:
pip install "autogen-ext[openai]"
You can also use this client for OpenAI-compatible ChatCompletion endpoints. Using this client for non-OpenAI models is not tested or guaranteed.
For non-OpenAI models, please first take a look at our community extensions for additional model clients.
- Parameters:
model (str) – Which OpenAI model to use.
api_key (optional, str) – The API key to use. Required if 'OPENAI_API_KEY' is not found in the environment variables.
organization (optional, str) – Which organization to use.
base_url (optional, str) – The base URL to use. Required if the model is not hosted on OpenAI.
timeout (optional, float) – The timeout for the request in seconds.
max_retries (optional, int) – The maximum number of retries to attempt.
model_info (optional, ModelInfo) – The capabilities of the model. Required if the model name is not a valid OpenAI model.
frequency_penalty (optional, float)
logit_bias (optional, dict[str, int])
max_tokens (optional, int)
n (optional, int)
presence_penalty (optional, float)
response_format (optional, Dict[str, Any]) –
The format of the response. Possible options are:
# Text response, this is the default.
{"type": "text"}

# JSON response, make sure to instruct the model to return JSON.
{"type": "json_object"}

# Structured output response, with a pre-defined JSON schema.
{
    "type": "json_schema",
    "json_schema": {
        "name": "name of the schema, must be an identifier.",
        "description": "description for the model.",
        # You can convert a Pydantic (v2) model to JSON schema
        # using the `model_json_schema()` method.
        "schema": "<the JSON schema itself>",
        # Whether to enable strict schema adherence when
        # generating the output. If set to true, the model will
        # always follow the exact schema defined in the
        # `schema` field. Only a subset of JSON Schema is
        # supported when `strict` is `true`.
        # To learn more, read
        # https://platform.openai.com/docs/guides/structured-outputs.
        "strict": False,  # or True
    },
}
It is recommended to use the json_output parameter of the create() or create_stream() methods instead of response_format for structured output. The json_output parameter is more flexible and allows you to specify a Pydantic model class directly.
seed (optional, int)
temperature (optional, float)
top_p (optional, float)
user (optional, str)
default_headers (optional, dict[str, str]) – Custom headers; useful for authentication or other custom requirements.
add_name_prefixes (optional, bool) – Whether to prepend the source value to each UserMessage content. E.g., "this is content" becomes "Reviewer said: this is content". This can be useful for models that do not support the name field in messages. Defaults to False.
stream_options (optional, dict) – Additional options for streaming. Currently only include_usage is supported.
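The add_name_prefixes and stream_options parameters are not covered by the snippets below, so here is a minimal sketch of setting them (the model name and values are illustrative, assuming OPENAI_API_KEY is set in the environment):

from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(
    model="gpt-4o-mini",
    # Prepend "<source> said:" to each UserMessage, for models without a name field.
    add_name_prefixes=True,
    # Ask streaming calls to report token usage in the final chunk.
    stream_options={"include_usage": True},
)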
Examples
The following code snippet shows how to use the client with an OpenAI model:
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_core.models import UserMessage

openai_client = OpenAIChatCompletionClient(
    model="gpt-4o-2024-08-06",
    # api_key="sk-...", # Optional if you have an OPENAI_API_KEY environment variable set.
)

result = await openai_client.create([UserMessage(content="What is the capital of France?", source="user")])  # type: ignore
print(result)

# Close the client when done.
# await openai_client.close()
To use the client with a non-OpenAI model, you need to provide the base URL of the model and the model info. For example, to use Ollama, you can use the following code snippet:
from autogen_ext.models.openai import OpenAIChatCompletionClient
from autogen_core.models import ModelFamily

custom_model_client = OpenAIChatCompletionClient(
    model="deepseek-r1:1.5b",
    base_url="http://localhost:11434/v1",
    api_key="placeholder",
    model_info={
        "vision": False,
        "function_calling": False,
        "json_output": False,
        "family": ModelFamily.R1,
        "structured_output": True,
    },
)

# Close the client when done.
# await custom_model_client.close()
To use streaming mode, you can use the following code snippet:
import asyncio

from autogen_core.models import UserMessage
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    # Similar for AzureOpenAIChatCompletionClient.
    model_client = OpenAIChatCompletionClient(model="gpt-4o")  # assuming OPENAI_API_KEY is set in the environment.

    messages = [UserMessage(content="Write a very short story about a dragon.", source="user")]

    # Create a stream.
    stream = model_client.create_stream(messages=messages)

    # Iterate over the stream and print the responses.
    print("Streamed responses:")
    async for response in stream:
        if isinstance(response, str):
            # A partial response is a string.
            print(response, flush=True, end="")
        else:
            # The last response is a CreateResult object with the complete message.
            print("\n\n------------\n")
            print("The complete response:", flush=True)
            print(response.content, flush=True)

    # Close the client when done.
    await model_client.close()


asyncio.run(main())
To use structured output as well as function calling, you can use the following code snippet:
import asyncio
from typing import Literal

from autogen_core.models import (
    AssistantMessage,
    FunctionExecutionResult,
    FunctionExecutionResultMessage,
    SystemMessage,
    UserMessage,
)
from autogen_core.tools import FunctionTool
from autogen_ext.models.openai import OpenAIChatCompletionClient
from pydantic import BaseModel


# Define the structured output format.
class AgentResponse(BaseModel):
    thoughts: str
    response: Literal["happy", "sad", "neutral"]


# Define the function to be called as a tool.
def sentiment_analysis(text: str) -> str:
    """Given a text, return the sentiment."""
    return "happy" if "happy" in text else "sad" if "sad" in text else "neutral"


# Create a FunctionTool instance with `strict=True`,
# which is required for structured output mode.
tool = FunctionTool(sentiment_analysis, description="Sentiment Analysis", strict=True)


async def main() -> None:
    # Create an OpenAIChatCompletionClient instance.
    model_client = OpenAIChatCompletionClient(model="gpt-4o-mini")

    # Generate a response using the tool.
    response1 = await model_client.create(
        messages=[
            SystemMessage(content="Analyze input text sentiment using the tool provided."),
            UserMessage(content="I am happy.", source="user"),
        ],
        tools=[tool],
    )
    print(response1.content)
    # Should be a list of tool calls.
    # [FunctionCall(name="sentiment_analysis", arguments={"text": "I am happy."}, ...)]

    assert isinstance(response1.content, list)
    response2 = await model_client.create(
        messages=[
            SystemMessage(content="Analyze input text sentiment using the tool provided."),
            UserMessage(content="I am happy.", source="user"),
            AssistantMessage(content=response1.content, source="assistant"),
            FunctionExecutionResultMessage(
                content=[
                    FunctionExecutionResult(
                        content="happy",
                        call_id=response1.content[0].id,
                        is_error=False,
                        name="sentiment_analysis",
                    )
                ]
            ),
        ],
        # Use the structured output format.
        json_output=AgentResponse,
    )
    print(response2.content)
    # Should be a structured output.
    # {"thoughts": "The user is happy.", "response": "happy"}

    # Close the client when done.
    await model_client.close()


asyncio.run(main())
To load the client from a configuration, you can use the load_component method:
from autogen_core.models import ChatCompletionClient

config = {
    "provider": "OpenAIChatCompletionClient",
    "config": {"model": "gpt-4o", "api_key": "REPLACE_WITH_YOUR_API_KEY"},
}

client = ChatCompletionClient.load_component(config)
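Conversely, an existing client instance can be serialized back into a configuration. This is a short sketch using dump_component() from the autogen_core component framework (exact serialization details, such as how secrets are handled, may vary):

from autogen_ext.models.openai import OpenAIChatCompletionClient

client = OpenAIChatCompletionClient(model="gpt-4o")  # Assumes OPENAI_API_KEY is set.

# Serialize the client to a ComponentModel that load_component() can consume.
component_model = client.dump_component()
print(component_model.model_dump_json())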
To view the full list of available configuration options, see the OpenAIClientConfigurationConfigModel class.
- component_type: ClassVar[ComponentType] = 'model'#
The logical type of the component.
- component_config_schema#
- component_provider_override: ClassVar[str | None] = 'autogen_ext.models.openai.OpenAIChatCompletionClient'#
Override the provider string for the component. This should be used to prevent internal module names from being part of the module name.
- _to_config() OpenAIClientConfigurationConfigModel [source]#
Dump the configuration that would be required to create a new instance of a component matching the configuration of this instance.
- Returns:
T – The configuration of the component.
- classmethod _from_config(config: OpenAIClientConfigurationConfigModel) Self [source]#
Create a new instance of the component from a configuration object.
- Parameters:
config (T) – The configuration object.
- Returns:
Self – The new instance of the component.
- class AzureOpenAIChatCompletionClient(**kwargs: Unpack)[source]#
Bases: BaseOpenAIChatCompletionClient, Component[AzureOpenAIClientConfigurationConfigModel]
Chat completion client for Azure OpenAI hosted models.
To use this client, you must install the azure and openai extensions:
pip install "autogen-ext[openai,azure]"
- Parameters:
model (str) – Which OpenAI model to use.
azure_endpoint (str) – The endpoint for the Azure model. Required for Azure models.
azure_deployment (str) – Deployment name for the Azure model. Required for Azure models.
api_version (str) – The API version to use. Required for Azure models.
azure_ad_token (str) – The Azure AD token to use. Provide this or azure_ad_token_provider for token-based authentication.
azure_ad_token_provider (optional, Callable[[], Awaitable[str]] | AzureTokenProvider) – The Azure AD token provider to use. Provide this or azure_ad_token for token-based authentication.
api_key (optional, str) – The API key to use; use this if you are using key-based authentication. It is optional if you are using Azure AD token-based authentication or the AZURE_OPENAI_API_KEY environment variable.
timeout (optional, float) – The timeout for the request in seconds.
max_retries (optional, int) – The maximum number of retries to attempt.
model_info (optional, ModelInfo) – The capabilities of the model. Required if the model name is not a valid OpenAI model.
frequency_penalty (optional, float)
logit_bias (optional, dict[str, int])
max_tokens (optional, int)
n (optional, int)
presence_penalty (optional, float)
response_format (optional, Dict[str, Any]) –
The format of the response. Possible options are:
# Text response, this is the default.
{"type": "text"}

# JSON response, make sure to instruct the model to return JSON.
{"type": "json_object"}

# Structured output response, with a pre-defined JSON schema.
{
    "type": "json_schema",
    "json_schema": {
        "name": "name of the schema, must be an identifier.",
        "description": "description for the model.",
        # You can convert a Pydantic (v2) model to JSON schema
        # using the `model_json_schema()` method.
        "schema": "<the JSON schema itself>",
        # Whether to enable strict schema adherence when
        # generating the output. If set to true, the model will
        # always follow the exact schema defined in the
        # `schema` field. Only a subset of JSON Schema is
        # supported when `strict` is `true`.
        # To learn more, read
        # https://platform.openai.com/docs/guides/structured-outputs.
        "strict": False,  # or True
    },
}
It is recommended to use the json_output parameter of the create() or create_stream() methods instead of response_format for structured output. The json_output parameter is more flexible and allows you to specify a Pydantic model class directly.
seed (optional, int)
temperature (optional, float)
top_p (optional, float)
user (optional, str)
default_headers (optional, dict[str, str]) – Custom headers; useful for authentication or other custom requirements.
To use the client, you need to provide your deployment name, Azure Cognitive Services endpoint, and API version. For authentication, you can either provide an API key or an Azure Active Directory (AAD) token credential.
The following code snippet shows how to use AAD authentication. The identity used must be assigned the Cognitive Services OpenAI User role.
from autogen_ext.auth.azure import AzureTokenProvider
from autogen_ext.models.openai import AzureOpenAIChatCompletionClient
from azure.identity import DefaultAzureCredential

# Create the token provider
token_provider = AzureTokenProvider(
    DefaultAzureCredential(),
    "https://cognitiveservices.azure.com/.default",
)

az_model_client = AzureOpenAIChatCompletionClient(
    azure_deployment="{your-azure-deployment}",
    model="{model-name, such as gpt-4o}",
    api_version="2024-06-01",
    azure_endpoint="https://{your-custom-endpoint}.openai.azure.com/",
    azure_ad_token_provider=token_provider,  # Optional if you choose key-based authentication.
    # api_key="sk-...", # For key-based authentication.
)
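For key-based authentication instead of AAD, a minimal sketch (endpoint, deployment, and key values are placeholders):

from autogen_ext.models.openai import AzureOpenAIChatCompletionClient

az_model_client = AzureOpenAIChatCompletionClient(
    azure_deployment="{your-azure-deployment}",
    model="{model-name, such as gpt-4o}",
    api_version="2024-06-01",
    azure_endpoint="https://{your-custom-endpoint}.openai.azure.com/",
    api_key="{your-api-key}",  # Alternatively, set the AZURE_OPENAI_API_KEY environment variable.
)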
See the OpenAIChatCompletionClient class for more usage examples.
To load the client that uses identity-based authentication from a configuration, you can use the load_component method:
from autogen_core.models import ChatCompletionClient

config = {
    "provider": "AzureOpenAIChatCompletionClient",
    "config": {
        "model": "gpt-4o-2024-05-13",
        "azure_endpoint": "https://{your-custom-endpoint}.openai.azure.com/",
        "azure_deployment": "{your-azure-deployment}",
        "api_version": "2024-06-01",
        "azure_ad_token_provider": {
            "provider": "autogen_ext.auth.azure.AzureTokenProvider",
            "config": {
                "provider_kind": "DefaultAzureCredential",
                "scopes": ["https://cognitiveservices.azure.com/.default"],
            },
        },
    },
}

client = ChatCompletionClient.load_component(config)
To view the full list of available configuration options, see the AzureOpenAIClientConfigurationConfigModel class.
Note
Currently only DefaultAzureCredential is supported, with no additional arguments passed to it.
Note
The Azure OpenAI client by default sets the User-Agent header to autogen-python/{version}. To override this, set the autogen_ext.models.openai.AZURE_OPENAI_USER_AGENT environment variable to an empty string.
See here for how to use the Azure client directly, or for more information.
- component_type: ClassVar[ComponentType] = 'model'#
The logical type of the component.
- component_config_schema#
- component_provider_override: ClassVar[str | None] = 'autogen_ext.models.openai.AzureOpenAIChatCompletionClient'#
Override the provider string for the component. This should be used to prevent internal module names from being part of the module name.
- _to_config() AzureOpenAIClientConfigurationConfigModel [source]#
Dump the configuration that would be required to create a new instance of a component matching the configuration of this instance.
- Returns:
T – The configuration of the component.
- classmethod _from_config(config: AzureOpenAIClientConfigurationConfigModel) Self [source]#
Create a new instance of the component from a configuration object.
- Parameters:
config (T) – The configuration object.
- Returns:
Self – The new instance of the component.
- class BaseOpenAIChatCompletionClient(client: AsyncOpenAI | AsyncAzureOpenAI, *, create_args: Dict[str, Any], model_capabilities: ModelCapabilities | None = None, model_info: ModelInfo | None = None, add_name_prefixes: bool = False)[source]#
Bases: ChatCompletionClient
- async create(messages: Sequence[Annotated[SystemMessage | UserMessage | AssistantMessage | FunctionExecutionResultMessage, FieldInfo(annotation=NoneType, required=True, discriminator='type')]], *, tools: Sequence[Tool | ToolSchema] = [], json_output: bool | type[BaseModel] | None = None, extra_create_args: Mapping[str, Any] = {}, cancellation_token: CancellationToken | None = None) CreateResult [source]#
Creates a single response from the model.
- Parameters:
messages (Sequence[LLMMessage]) – The messages to send to the model.
tools (Sequence[Tool | ToolSchema], optional) – The tools to use with the model. Defaults to [].
json_output (Optional[bool | type[BaseModel]], optional) – Whether to use JSON mode, structured output, or neither. Defaults to None. If set to a Pydantic BaseModel type, it will be used as the output type for structured output. If set to a boolean, it will be used to determine whether to use JSON mode or not. If set to True, make sure to instruct the model to produce JSON output in the instruction or prompt.
extra_create_args (Mapping[str, Any], optional) – Extra arguments to pass to the underlying client. Defaults to {}.
cancellation_token (Optional[CancellationToken], optional) – A token for cancellation. Defaults to None.
- Returns:
CreateResult – The result of the model call.
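As a minimal sketch of JSON mode (json_output=True, as opposed to passing a Pydantic class for structured output; the model name and prompts are placeholders, and OPENAI_API_KEY is assumed to be set):

import asyncio
import json

from autogen_core.models import SystemMessage, UserMessage
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4o-mini")
    result = await model_client.create(
        messages=[
            # With json_output=True, the instructions must ask for JSON explicitly.
            SystemMessage(content="Reply with a JSON object with keys 'city' and 'country'."),
            UserMessage(content="Where is the Eiffel Tower?", source="user"),
        ],
        json_output=True,  # JSON mode; no schema is enforced.
    )
    assert isinstance(result.content, str)
    print(json.loads(result.content))
    await model_client.close()


asyncio.run(main())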
- async create_stream(messages: Sequence[Annotated[SystemMessage | UserMessage | AssistantMessage | FunctionExecutionResultMessage, FieldInfo(annotation=NoneType, required=True, discriminator='type')]], *, tools: Sequence[Tool | ToolSchema] = [], json_output: bool | type[BaseModel] | None = None, extra_create_args: Mapping[str, Any] = {}, cancellation_token: CancellationToken | None = None, max_consecutive_empty_chunk_tolerance: int = 0) AsyncGenerator[str | CreateResult, None] [source]#
Creates a stream of string chunks from the model, ending with a CreateResult.
Extends autogen_core.models.ChatCompletionClient.create_stream() to support the OpenAI API.
In streaming, the default behaviour is not to return token usage counts. See: OpenAI API reference for possible args.
You can set extra_create_args={"stream_options": {"include_usage": True}} (if supported by the accessed API) to return a final chunk with usage set to a RequestUsage object containing prompt and completion token counts; all preceding chunks will have usage as None. See: OpenAI API reference for stream options.
- Examples of other supported arguments that can be included in extra_create_args (a usage sketch follows this list):
temperature (float): Controls the randomness of the output. Higher values (e.g., 0.8) make the output more random, while lower values (e.g., 0.2) make it more focused and deterministic.
max_tokens (int): The maximum number of tokens to generate in the completion.
top_p (float): An alternative to sampling with temperature, called nucleus sampling, where the model considers the results of the tokens with top_p probability mass.
frequency_penalty (float): A value between -2.0 and 2.0 that penalizes new tokens based on their existing frequency in the text so far, decreasing the likelihood of repeated phrases.
presence_penalty (float): A value between -2.0 and 2.0 that penalizes new tokens based on whether they appear in the text so far, encouraging the model to talk about new topics.
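The sketch below illustrates the include_usage option described above (the model name and prompt are placeholders; OPENAI_API_KEY is assumed to be set):

import asyncio

from autogen_core.models import UserMessage
from autogen_ext.models.openai import OpenAIChatCompletionClient


async def main() -> None:
    model_client = OpenAIChatCompletionClient(model="gpt-4o-mini")
    stream = model_client.create_stream(
        messages=[UserMessage(content="Name three colors.", source="user")],
        extra_create_args={"stream_options": {"include_usage": True}},
    )
    async for chunk in stream:
        if isinstance(chunk, str):
            print(chunk, end="", flush=True)
        else:
            # The final CreateResult carries token counts when include_usage is set.
            print("\nUsage:", chunk.usage)
    await model_client.close()


asyncio.run(main())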
- actual_usage() RequestUsage [source]#
- total_usage() RequestUsage [source]#
- count_tokens(messages: Sequence[Annotated[SystemMessage | UserMessage | AssistantMessage | FunctionExecutionResultMessage, FieldInfo(annotation=NoneType, required=True, discriminator='type')]], *, tools: Sequence[Tool | ToolSchema] = []) int [source]#
- remaining_tokens(messages: Sequence[Annotated[SystemMessage | UserMessage | AssistantMessage | FunctionExecutionResultMessage, FieldInfo(annotation=NoneType, required=True, discriminator='type')]], *, tools: Sequence[Tool | ToolSchema] = []) int [source]#
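A short sketch of the usage and token-counting helpers above (the model name and message are placeholders; OPENAI_API_KEY is assumed to be set):

from autogen_core.models import UserMessage
from autogen_ext.models.openai import OpenAIChatCompletionClient

model_client = OpenAIChatCompletionClient(model="gpt-4o-mini")
messages = [UserMessage(content="Hello!", source="user")]

# Estimate the prompt size and how much of the context window remains.
print("Prompt tokens:", model_client.count_tokens(messages))
print("Remaining tokens:", model_client.remaining_tokens(messages))

# Cumulative token usage across the calls made with this client so far.
print("Total usage:", model_client.total_usage())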
- property capabilities: ModelCapabilities#
- pydantic model AzureOpenAIClientConfigurationConfigModel[source]#
Bases: BaseOpenAIClientConfigurationConfigModel
Show JSON schema
{ "title": "AzureOpenAIClientConfigurationConfigModel", "type": "object", "properties": { "frequency_penalty": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Frequency Penalty" }, "logit_bias": { "anyOf": [ { "additionalProperties": { "type": "integer" }, "type": "object" }, { "type": "null" } ], "default": null, "title": "Logit Bias" }, "max_tokens": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Max Tokens" }, "n": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "N" }, "presence_penalty": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Presence Penalty" }, "response_format": { "anyOf": [ { "$ref": "#/$defs/ResponseFormat" }, { "type": "null" } ], "default": null }, "seed": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Seed" }, "stop": { "anyOf": [ { "type": "string" }, { "items": { "type": "string" }, "type": "array" }, { "type": "null" } ], "default": null, "title": "Stop" }, "temperature": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Temperature" }, "top_p": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Top P" }, "user": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "title": "User" }, "stream_options": { "anyOf": [ { "$ref": "#/$defs/StreamOptions" }, { "type": "null" } ], "default": null }, "model": { "title": "Model", "type": "string" }, "api_key": { "anyOf": [ { "format": "password", "type": "string", "writeOnly": true }, { "type": "null" } ], "default": null, "title": "Api Key" }, "timeout": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Timeout" }, "max_retries": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Max Retries" }, "model_capabilities": { "anyOf": [ { "$ref": "#/$defs/ModelCapabilities" }, { "type": "null" } ], "default": null }, "model_info": { "anyOf": [ { "$ref": "#/$defs/ModelInfo" }, { "type": "null" } ], "default": null }, "add_name_prefixes": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "default": null, "title": "Add Name Prefixes" }, "default_headers": { "anyOf": [ { "additionalProperties": { "type": "string" }, "type": "object" }, { "type": "null" } ], "default": null, "title": "Default Headers" }, "azure_endpoint": { "title": "Azure Endpoint", "type": "string" }, "azure_deployment": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "title": "Azure Deployment" }, "api_version": { "title": "Api Version", "type": "string" }, "azure_ad_token": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "title": "Azure Ad Token" }, "azure_ad_token_provider": { "anyOf": [ { "$ref": "#/$defs/ComponentModel" }, { "type": "null" } ], "default": null } }, "$defs": { "ComponentModel": { "description": "Model class for a component. 
Contains all information required to instantiate a component.", "properties": { "provider": { "title": "Provider", "type": "string" }, "component_type": { "anyOf": [ { "enum": [ "model", "agent", "tool", "termination", "token_provider", "workbench" ], "type": "string" }, { "type": "string" }, { "type": "null" } ], "default": null, "title": "Component Type" }, "version": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Version" }, "component_version": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Component Version" }, "description": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "title": "Description" }, "label": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "title": "Label" }, "config": { "title": "Config", "type": "object" } }, "required": [ "provider", "config" ], "title": "ComponentModel", "type": "object" }, "JSONSchema": { "properties": { "name": { "title": "Name", "type": "string" }, "description": { "title": "Description", "type": "string" }, "schema": { "title": "Schema", "type": "object" }, "strict": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "title": "Strict" } }, "required": [ "name" ], "title": "JSONSchema", "type": "object" }, "ModelCapabilities": { "deprecated": true, "properties": { "vision": { "title": "Vision", "type": "boolean" }, "function_calling": { "title": "Function Calling", "type": "boolean" }, "json_output": { "title": "Json Output", "type": "boolean" } }, "required": [ "vision", "function_calling", "json_output" ], "title": "ModelCapabilities", "type": "object" }, "ModelInfo": { "description": "ModelInfo is a dictionary that contains information about a model's properties.\nIt is expected to be used in the model_info property of a model client.\n\nWe are expecting this to grow over time as we add more features.", "properties": { "vision": { "title": "Vision", "type": "boolean" }, "function_calling": { "title": "Function Calling", "type": "boolean" }, "json_output": { "title": "Json Output", "type": "boolean" }, "family": { "anyOf": [ { "enum": [ "gpt-41", "gpt-45", "gpt-4o", "o1", "o3", "o4", "gpt-4", "gpt-35", "r1", "gemini-1.5-flash", "gemini-1.5-pro", "gemini-2.0-flash", "gemini-2.5-pro", "claude-3-haiku", "claude-3-sonnet", "claude-3-opus", "claude-3-5-haiku", "claude-3-5-sonnet", "claude-3-7-sonnet", "unknown" ], "type": "string" }, { "type": "string" } ], "title": "Family" }, "structured_output": { "title": "Structured Output", "type": "boolean" }, "multiple_system_messages": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "title": "Multiple System Messages" } }, "required": [ "vision", "function_calling", "json_output", "family", "structured_output" ], "title": "ModelInfo", "type": "object" }, "ResponseFormat": { "properties": { "type": { "enum": [ "text", "json_object", "json_schema" ], "title": "Type", "type": "string" }, "json_schema": { "anyOf": [ { "$ref": "#/$defs/JSONSchema" }, { "type": "null" } ] } }, "required": [ "type", "json_schema" ], "title": "ResponseFormat", "type": "object" }, "StreamOptions": { "properties": { "include_usage": { "title": "Include Usage", "type": "boolean" } }, "required": [ "include_usage" ], "title": "StreamOptions", "type": "object" } }, "required": [ "model", "azure_endpoint", "api_version" ] }
- Fields:
api_version (str)
azure_ad_token (str | None)
azure_ad_token_provider (autogen_core._component_config.ComponentModel | None)
azure_deployment (str | None)
azure_endpoint (str)
- field azure_ad_token_provider: ComponentModel | None = None#
- pydantic model OpenAIClientConfigurationConfigModel[source]#
Bases: BaseOpenAIClientConfigurationConfigModel
Show JSON schema
{ "title": "OpenAIClientConfigurationConfigModel", "type": "object", "properties": { "frequency_penalty": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Frequency Penalty" }, "logit_bias": { "anyOf": [ { "additionalProperties": { "type": "integer" }, "type": "object" }, { "type": "null" } ], "default": null, "title": "Logit Bias" }, "max_tokens": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Max Tokens" }, "n": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "N" }, "presence_penalty": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Presence Penalty" }, "response_format": { "anyOf": [ { "$ref": "#/$defs/ResponseFormat" }, { "type": "null" } ], "default": null }, "seed": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Seed" }, "stop": { "anyOf": [ { "type": "string" }, { "items": { "type": "string" }, "type": "array" }, { "type": "null" } ], "default": null, "title": "Stop" }, "temperature": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Temperature" }, "top_p": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Top P" }, "user": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "title": "User" }, "stream_options": { "anyOf": [ { "$ref": "#/$defs/StreamOptions" }, { "type": "null" } ], "default": null }, "model": { "title": "Model", "type": "string" }, "api_key": { "anyOf": [ { "format": "password", "type": "string", "writeOnly": true }, { "type": "null" } ], "default": null, "title": "Api Key" }, "timeout": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Timeout" }, "max_retries": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Max Retries" }, "model_capabilities": { "anyOf": [ { "$ref": "#/$defs/ModelCapabilities" }, { "type": "null" } ], "default": null }, "model_info": { "anyOf": [ { "$ref": "#/$defs/ModelInfo" }, { "type": "null" } ], "default": null }, "add_name_prefixes": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "default": null, "title": "Add Name Prefixes" }, "default_headers": { "anyOf": [ { "additionalProperties": { "type": "string" }, "type": "object" }, { "type": "null" } ], "default": null, "title": "Default Headers" }, "organization": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "title": "Organization" }, "base_url": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "title": "Base Url" } }, "$defs": { "JSONSchema": { "properties": { "name": { "title": "Name", "type": "string" }, "description": { "title": "Description", "type": "string" }, "schema": { "title": "Schema", "type": "object" }, "strict": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "title": "Strict" } }, "required": [ "name" ], "title": "JSONSchema", "type": "object" }, "ModelCapabilities": { "deprecated": true, "properties": { "vision": { "title": "Vision", "type": "boolean" }, "function_calling": { "title": "Function Calling", "type": "boolean" }, "json_output": { "title": "Json Output", "type": "boolean" } }, "required": [ "vision", "function_calling", "json_output" ], "title": "ModelCapabilities", "type": "object" }, "ModelInfo": { "description": "ModelInfo is a dictionary that contains information about a model's properties.\nIt is expected to be used in the model_info property of a 
model client.\n\nWe are expecting this to grow over time as we add more features.", "properties": { "vision": { "title": "Vision", "type": "boolean" }, "function_calling": { "title": "Function Calling", "type": "boolean" }, "json_output": { "title": "Json Output", "type": "boolean" }, "family": { "anyOf": [ { "enum": [ "gpt-41", "gpt-45", "gpt-4o", "o1", "o3", "o4", "gpt-4", "gpt-35", "r1", "gemini-1.5-flash", "gemini-1.5-pro", "gemini-2.0-flash", "gemini-2.5-pro", "claude-3-haiku", "claude-3-sonnet", "claude-3-opus", "claude-3-5-haiku", "claude-3-5-sonnet", "claude-3-7-sonnet", "unknown" ], "type": "string" }, { "type": "string" } ], "title": "Family" }, "structured_output": { "title": "Structured Output", "type": "boolean" }, "multiple_system_messages": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "title": "Multiple System Messages" } }, "required": [ "vision", "function_calling", "json_output", "family", "structured_output" ], "title": "ModelInfo", "type": "object" }, "ResponseFormat": { "properties": { "type": { "enum": [ "text", "json_object", "json_schema" ], "title": "Type", "type": "string" }, "json_schema": { "anyOf": [ { "$ref": "#/$defs/JSONSchema" }, { "type": "null" } ] } }, "required": [ "type", "json_schema" ], "title": "ResponseFormat", "type": "object" }, "StreamOptions": { "properties": { "include_usage": { "title": "Include Usage", "type": "boolean" } }, "required": [ "include_usage" ], "title": "StreamOptions", "type": "object" } }, "required": [ "model" ] }
- Fields:
base_url (str | None)
organization (str | None)
- pydantic model BaseOpenAIClientConfigurationConfigModel[source]#
Bases: CreateArgumentsConfigModel
Show JSON schema
{ "title": "BaseOpenAIClientConfigurationConfigModel", "type": "object", "properties": { "frequency_penalty": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Frequency Penalty" }, "logit_bias": { "anyOf": [ { "additionalProperties": { "type": "integer" }, "type": "object" }, { "type": "null" } ], "default": null, "title": "Logit Bias" }, "max_tokens": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Max Tokens" }, "n": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "N" }, "presence_penalty": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Presence Penalty" }, "response_format": { "anyOf": [ { "$ref": "#/$defs/ResponseFormat" }, { "type": "null" } ], "default": null }, "seed": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Seed" }, "stop": { "anyOf": [ { "type": "string" }, { "items": { "type": "string" }, "type": "array" }, { "type": "null" } ], "default": null, "title": "Stop" }, "temperature": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Temperature" }, "top_p": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Top P" }, "user": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "title": "User" }, "stream_options": { "anyOf": [ { "$ref": "#/$defs/StreamOptions" }, { "type": "null" } ], "default": null }, "model": { "title": "Model", "type": "string" }, "api_key": { "anyOf": [ { "format": "password", "type": "string", "writeOnly": true }, { "type": "null" } ], "default": null, "title": "Api Key" }, "timeout": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Timeout" }, "max_retries": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Max Retries" }, "model_capabilities": { "anyOf": [ { "$ref": "#/$defs/ModelCapabilities" }, { "type": "null" } ], "default": null }, "model_info": { "anyOf": [ { "$ref": "#/$defs/ModelInfo" }, { "type": "null" } ], "default": null }, "add_name_prefixes": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "default": null, "title": "Add Name Prefixes" }, "default_headers": { "anyOf": [ { "additionalProperties": { "type": "string" }, "type": "object" }, { "type": "null" } ], "default": null, "title": "Default Headers" } }, "$defs": { "JSONSchema": { "properties": { "name": { "title": "Name", "type": "string" }, "description": { "title": "Description", "type": "string" }, "schema": { "title": "Schema", "type": "object" }, "strict": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "title": "Strict" } }, "required": [ "name" ], "title": "JSONSchema", "type": "object" }, "ModelCapabilities": { "deprecated": true, "properties": { "vision": { "title": "Vision", "type": "boolean" }, "function_calling": { "title": "Function Calling", "type": "boolean" }, "json_output": { "title": "Json Output", "type": "boolean" } }, "required": [ "vision", "function_calling", "json_output" ], "title": "ModelCapabilities", "type": "object" }, "ModelInfo": { "description": "ModelInfo is a dictionary that contains information about a model's properties.\nIt is expected to be used in the model_info property of a model client.\n\nWe are expecting this to grow over time as we add more features.", "properties": { "vision": { "title": "Vision", "type": "boolean" }, "function_calling": { "title": "Function Calling", "type": "boolean" }, 
"json_output": { "title": "Json Output", "type": "boolean" }, "family": { "anyOf": [ { "enum": [ "gpt-41", "gpt-45", "gpt-4o", "o1", "o3", "o4", "gpt-4", "gpt-35", "r1", "gemini-1.5-flash", "gemini-1.5-pro", "gemini-2.0-flash", "gemini-2.5-pro", "claude-3-haiku", "claude-3-sonnet", "claude-3-opus", "claude-3-5-haiku", "claude-3-5-sonnet", "claude-3-7-sonnet", "unknown" ], "type": "string" }, { "type": "string" } ], "title": "Family" }, "structured_output": { "title": "Structured Output", "type": "boolean" }, "multiple_system_messages": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "title": "Multiple System Messages" } }, "required": [ "vision", "function_calling", "json_output", "family", "structured_output" ], "title": "ModelInfo", "type": "object" }, "ResponseFormat": { "properties": { "type": { "enum": [ "text", "json_object", "json_schema" ], "title": "Type", "type": "string" }, "json_schema": { "anyOf": [ { "$ref": "#/$defs/JSONSchema" }, { "type": "null" } ] } }, "required": [ "type", "json_schema" ], "title": "ResponseFormat", "type": "object" }, "StreamOptions": { "properties": { "include_usage": { "title": "Include Usage", "type": "boolean" } }, "required": [ "include_usage" ], "title": "StreamOptions", "type": "object" } }, "required": [ "model" ] }
- Fields:
add_name_prefixes (bool | None)
api_key (pydantic.types.SecretStr | None)
default_headers (Dict[str, str] | None)
max_retries (int | None)
model (str)
model_capabilities (autogen_core.models._model_client.ModelCapabilities | None)
model_info (autogen_core.models._model_client.ModelInfo | None)
timeout (float | None)
- field model_capabilities: ModelCapabilities | None = None#
- pydantic model CreateArgumentsConfigModel[source]#
Bases: BaseModel
Show JSON schema
{ "title": "CreateArgumentsConfigModel", "type": "object", "properties": { "frequency_penalty": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Frequency Penalty" }, "logit_bias": { "anyOf": [ { "additionalProperties": { "type": "integer" }, "type": "object" }, { "type": "null" } ], "default": null, "title": "Logit Bias" }, "max_tokens": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Max Tokens" }, "n": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "N" }, "presence_penalty": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Presence Penalty" }, "response_format": { "anyOf": [ { "$ref": "#/$defs/ResponseFormat" }, { "type": "null" } ], "default": null }, "seed": { "anyOf": [ { "type": "integer" }, { "type": "null" } ], "default": null, "title": "Seed" }, "stop": { "anyOf": [ { "type": "string" }, { "items": { "type": "string" }, "type": "array" }, { "type": "null" } ], "default": null, "title": "Stop" }, "temperature": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Temperature" }, "top_p": { "anyOf": [ { "type": "number" }, { "type": "null" } ], "default": null, "title": "Top P" }, "user": { "anyOf": [ { "type": "string" }, { "type": "null" } ], "default": null, "title": "User" }, "stream_options": { "anyOf": [ { "$ref": "#/$defs/StreamOptions" }, { "type": "null" } ], "default": null } }, "$defs": { "JSONSchema": { "properties": { "name": { "title": "Name", "type": "string" }, "description": { "title": "Description", "type": "string" }, "schema": { "title": "Schema", "type": "object" }, "strict": { "anyOf": [ { "type": "boolean" }, { "type": "null" } ], "title": "Strict" } }, "required": [ "name" ], "title": "JSONSchema", "type": "object" }, "ResponseFormat": { "properties": { "type": { "enum": [ "text", "json_object", "json_schema" ], "title": "Type", "type": "string" }, "json_schema": { "anyOf": [ { "$ref": "#/$defs/JSONSchema" }, { "type": "null" } ] } }, "required": [ "type", "json_schema" ], "title": "ResponseFormat", "type": "object" }, "StreamOptions": { "properties": { "include_usage": { "title": "Include Usage", "type": "boolean" } }, "required": [ "include_usage" ], "title": "StreamOptions", "type": "object" } } }
- Fields:
frequency_penalty (float | None)
logit_bias (Dict[str, int] | None)
max_tokens (int | None)
n (int | None)
presence_penalty (float | None)
response_format (autogen_ext.models.openai.config.ResponseFormat | None)
seed (int | None)
stop (str | List[str] | None)
stream_options (autogen_ext.models.openai.config.StreamOptions | None)
temperature (float | None)
top_p (float | None)
user (str | None)