```mermaid
graph LR
    Multi_Provider["Multi-Provider"]
    OpenAI_Provider["OpenAI Provider"]
    LiteLLM_Provider["LiteLLM Provider"]
    OpenAI_Chat_Completions_Model["OpenAI Chat Completions Model"]
    OpenAI_Responses_Model["OpenAI Responses Model"]
    LiteLLM_Model["LiteLLM Model"]
    Chat_Completion_Converter["Chat Completion Converter"]
    Chat_Completion_Stream_Handler["Chat Completion Stream Handler"]
    Multi_Provider -- "orchestrates" --> OpenAI_Provider
    Multi_Provider -- "orchestrates" --> LiteLLM_Provider
    OpenAI_Provider -- "delegates to" --> OpenAI_Chat_Completions_Model
    OpenAI_Provider -- "delegates to" --> OpenAI_Responses_Model
    LiteLLM_Provider -- "delegates to" --> LiteLLM_Model
    OpenAI_Chat_Completions_Model -- "utilizes" --> Chat_Completion_Converter
    OpenAI_Chat_Completions_Model -- "interacts with" --> Chat_Completion_Stream_Handler
    OpenAI_Responses_Model -- "utilizes" --> Chat_Completion_Converter
    LiteLLM_Model -- "utilizes" --> Chat_Completion_Converter
    LiteLLM_Model -- "interacts with" --> Chat_Completion_Stream_Handler
```
The LLM Integration subsystem provides a robust abstraction layer for interacting with various Large Language Models (LLMs), managing provider-specific details, and ensuring consistent model access and response processing within the openai-agents-python project.
Multi-Provider: Acts as the central orchestrator and facade for the LLM integration subsystem. It selects and returns the appropriate model from the underlying providers based on configuration or runtime context, and supports falling back between providers.
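The routing idea can be sketched as a small facade. This is a hypothetical illustration, not the SDK's actual API: the class and method names mirror the diagram, and the `litellm/` name-prefix convention and `FakeModel` stand-in are assumptions made for the example.

```python
class FakeModel:
    """Stand-in for a concrete model object (assumption for this sketch)."""
    def __init__(self, name):
        self.name = name

class OpenAIProvider:
    def get_model(self, model_name):
        return FakeModel(f"openai:{model_name}")

class LiteLLMProvider:
    def get_model(self, model_name):
        return FakeModel(f"litellm:{model_name}")

class MultiProvider:
    """Facade that routes a model name to an underlying provider."""
    def __init__(self):
        self._openai = OpenAIProvider()
        self._litellm = LiteLLMProvider()

    def get_model(self, model_name):
        # Names with a "litellm/" prefix route to the LiteLLM provider;
        # everything else falls back to the OpenAI provider.
        if model_name.startswith("litellm/"):
            return self._litellm.get_model(model_name.removeprefix("litellm/"))
        return self._openai.get_model(model_name)

provider = MultiProvider()
assert provider.get_model("gpt-4o").name == "openai:gpt-4o"
assert provider.get_model("litellm/anthropic/claude-3").name == "litellm:anthropic/claude-3"
```

The facade keeps callers provider-agnostic: they ask for a model by name and never touch provider-specific code.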
OpenAI Provider: Manages integration with OpenAI models. It serves as the primary entry point for OpenAI-specific LLM interactions, translating generic requests into OpenAI-compatible calls.
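Per the diagram, this provider delegates to one of two model classes. A minimal sketch of that selection, assuming a `use_responses` flag decides between the two (the flag name and constructors are illustrative, not the SDK's real signatures):

```python
class OpenAIResponsesModel:
    def __init__(self, model_name):
        self.model_name = model_name
        self.api = "responses"

class OpenAIChatCompletionsModel:
    def __init__(self, model_name):
        self.model_name = model_name
        self.api = "chat_completions"

class OpenAIProvider:
    def __init__(self, use_responses=True):
        # Pick which model class will back every model this provider hands out.
        self._model_cls = OpenAIResponsesModel if use_responses else OpenAIChatCompletionsModel

    def get_model(self, model_name):
        return self._model_cls(model_name)

assert OpenAIProvider().get_model("gpt-4o").api == "responses"
assert OpenAIProvider(use_responses=False).get_model("gpt-4o").api == "chat_completions"
```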
LiteLLM Provider: Provides a unified abstraction for interacting with many different LLMs via the LiteLLM library, translating generic requests into LiteLLM-compatible calls.
OpenAI Chat Completions Model: Handles direct interaction with OpenAI's Chat Completions API, including constructing requests and fetching raw responses.
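Request construction here amounts to flattening the agent's state into the wire format. The sketch below is an assumption modeled on the Chat Completions payload shape; the function name and parameters are invented for illustration:

```python
def build_chat_request(model, system_instructions, history, tools=None, stream=False):
    """Assemble a chat-completions request payload (illustrative sketch)."""
    messages = []
    if system_instructions:
        # System instructions become the leading system message.
        messages.append({"role": "system", "content": system_instructions})
    messages.extend(history)
    payload = {"model": model, "messages": messages, "stream": stream}
    if tools:
        # Tool definitions are passed alongside the conversation.
        payload["tools"] = tools
    return payload

req = build_chat_request(
    "gpt-4o",
    "You are a helpful agent.",
    [{"role": "user", "content": "Hi"}],
)
assert req["messages"][0]["role"] == "system"
assert req["messages"][1] == {"role": "user", "content": "Hi"}
```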
OpenAI Responses Model: Focuses on processing and converting responses received from OpenAI models, in particular extracting structured information such as tool definitions and usage statistics.
LiteLLM Model: Implements the core logic for sending requests to and processing responses from LiteLLM, handling the message-format conversions that LiteLLM requires.
Chat Completion Converter: A utility component that converts messages and run items to and from the format required by chat-completion models (e.g., OpenAI's or LiteLLM's message format).
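The item-to-message direction can be sketched as a plain mapping function. The simplified `item` dictionaries below are assumptions for the example; the real converter works over the SDK's own item types, but the shape of the output follows the standard chat-completion message format:

```python
def items_to_messages(items):
    """Convert simplified run items into chat-completion message dicts."""
    messages = []
    for item in items:
        if item["type"] == "message":
            messages.append({"role": item["role"], "content": item["content"]})
        elif item["type"] == "tool_call":
            # An assistant turn that invokes a function tool.
            messages.append({
                "role": "assistant",
                "content": None,
                "tool_calls": [{
                    "id": item["id"],
                    "type": "function",
                    "function": {"name": item["name"], "arguments": item["arguments"]},
                }],
            })
        elif item["type"] == "tool_result":
            # The tool's output, linked back to the call by id.
            messages.append({"role": "tool", "tool_call_id": item["id"], "content": item["output"]})
        else:
            raise ValueError(f"unknown item type: {item['type']}")
    return messages

msgs = items_to_messages([
    {"type": "message", "role": "user", "content": "What's 2+2?"},
    {"type": "tool_call", "id": "call_1", "name": "add", "arguments": '{"a": 2, "b": 2}'},
    {"type": "tool_result", "id": "call_1", "output": "4"},
])
assert msgs[1]["tool_calls"][0]["function"]["name"] == "add"
assert msgs[2] == {"role": "tool", "tool_call_id": "call_1", "content": "4"}
```

Centralizing this mapping is what lets both the OpenAI and LiteLLM model classes share one conversion path, as the diagram's "utilizes" edges show.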
Chat Completion Stream Handler: Manages the state and processing of streamed responses from chat-completion models, reconstructing the full response from incoming chunks.
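Reconstruction boils down to accumulating incremental deltas. A minimal sketch, assuming chunks shaped like OpenAI's streaming chat-completion deltas (the class name matches the diagram, but its methods are illustrative):

```python
class ChatCompletionStreamHandler:
    """Accumulate streamed delta chunks into the full assistant text."""
    def __init__(self):
        self._parts = []

    def on_chunk(self, chunk):
        # Each chunk carries an incremental `delta`; collect any text content.
        delta = chunk.get("choices", [{}])[0].get("delta", {})
        if delta.get("content"):
            self._parts.append(delta["content"])

    def final_text(self):
        return "".join(self._parts)

handler = ChatCompletionStreamHandler()
for chunk in (
    {"choices": [{"delta": {"role": "assistant"}}]},      # role-only opener
    {"choices": [{"delta": {"content": "Hel"}}]},
    {"choices": [{"delta": {"content": "lo!"}}]},
    {"choices": [{"delta": {}, "finish_reason": "stop"}]},  # terminal chunk
):
    handler.on_chunk(chunk)
assert handler.final_text() == "Hello!"
```

A fuller handler would also accumulate tool-call argument fragments and usage data the same way; only the text path is shown here.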