```mermaid
graph LR
CLI_User_Interface["CLI User Interface"]
AI_Agent_Orchestrator["AI Agent Orchestrator"]
LLM_Integration_Layer["LLM Integration Layer"]
Tooling_System["Tooling System"]
Configuration_Manager["Configuration Manager"]
Terminal_Renderer["Terminal Renderer"]
Unclassified["Unclassified"]
CLI_User_Interface -- "Initiates Task" --> AI_Agent_Orchestrator
CLI_User_Interface -- "Manages Settings" --> Configuration_Manager
AI_Agent_Orchestrator -- "Requests Inference" --> LLM_Integration_Layer
AI_Agent_Orchestrator -- "Invokes Tool" --> Tooling_System
AI_Agent_Orchestrator -- "Sends Output" --> Terminal_Renderer
LLM_Integration_Layer -- "Loads LLM Config" --> Configuration_Manager
LLM_Integration_Layer -- "Streams Output" --> Terminal_Renderer
Configuration_Manager -- "Provides Configuration" --> CLI_User_Interface
Configuration_Manager -- "Provides Environment Config" --> AI_Agent_Orchestrator
Configuration_Manager -- "Provides LLM Config" --> LLM_Integration_Layer
click CLI_User_Interface href "https://github.com/CodeBoarding/GeneratedOnBoardings/blob/main/OpenCopilot-PikoAi/CLI_User_Interface.md" "Details"
click AI_Agent_Orchestrator href "https://github.com/CodeBoarding/GeneratedOnBoardings/blob/main/OpenCopilot-PikoAi/AI_Agent_Orchestrator.md" "Details"
click LLM_Integration_Layer href "https://github.com/CodeBoarding/GeneratedOnBoardings/blob/main/OpenCopilot-PikoAi/LLM_Integration_Layer.md" "Details"
click Tooling_System href "https://github.com/CodeBoarding/GeneratedOnBoardings/blob/main/OpenCopilot-PikoAi/Tooling_System.md" "Details"
```
The PikoAi application is structured around a core CLI User Interface that serves as the primary entry point for user interaction. This interface is responsible for parsing user commands, processing prompts (including file inclusions), and initiating tasks. It interacts with the Configuration Manager to load and save application settings and user preferences.

At the heart of the AI-driven workflow is the AI Agent Orchestrator. This component receives tasks from the CLI User Interface and manages the AI's decision-making and task execution. It relies on the LLM Integration Layer to interact with various Large Language Models for inference and response generation, and it leverages the Tooling System to dynamically invoke external and internal tools, including a secure Python code executor, to accomplish complex tasks. All outputs and progress from the AI Agent Orchestrator are directed to the Terminal Renderer for formatted display to the user.

The LLM Integration Layer provides a standardized abstraction for communicating with different LLM providers, handling streaming responses and model-specific configurations. It retrieves the necessary LLM configurations from the Configuration Manager. The Tooling System is responsible for managing and executing a diverse set of tools, enabling the AI to interact with the environment, perform file operations, run web searches, and execute dynamic code. The PythonExecutor is an integral part of this system, providing a secure sandbox for Python script execution.

The Configuration Manager centralizes the handling of all application settings, including API keys and LLM model selections, ensuring consistent configuration across all components. Finally, the Terminal Renderer is dedicated to presenting rich, interactive output to the user, formatting AI responses, tool execution logs, and other information in a clear and readable manner. It receives output streams from both the AI Agent Orchestrator and the LLM Integration Layer.

This architecture promotes modularity, allowing for independent development and easier maintenance of each component, while ensuring a clear and efficient data flow for AI-powered task automation.
CLI User Interface [Expand]
The primary and direct interface for user interaction, handling command parsing, initial prompt processing, and overall application flow.
Related Classes/Methods:
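The prompt processing mentioned above includes file inclusions. A minimal sketch of how that might work, assuming a hypothetical `@path` syntax for referencing files in a prompt (the actual PikoAi syntax and function names may differ):

```python
import re
from pathlib import Path


def expand_file_inclusions(prompt: str, root: Path) -> str:
    """Replace hypothetical @path tokens in a prompt with the file's contents."""
    def replace(match: re.Match) -> str:
        path = root / match.group(1)
        if path.is_file():
            # Inline the file with a small header so the LLM can tell files apart
            return f"\n--- {match.group(1)} ---\n{path.read_text()}\n"
        return match.group(0)  # leave unresolvable references untouched

    return re.sub(r"@([\w./-]+)", replace, prompt)
```

The key design point is that unresolvable references pass through unchanged, so a stray `@` in normal prose does not break the prompt.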
AI Agent Orchestrator [Expand]
The central intelligence component responsible for managing AI workflows, executing tasks, and making decisions based on LLM inference and tool usage.
Related Classes/Methods:
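The orchestrator's core responsibility, deciding between tool calls and a final answer, can be sketched as a simple agent loop. This is an illustrative sketch, not PikoAi's actual implementation; the `llm` decision format (`{"tool": ...}` vs `{"answer": ...}`) is an assumption:

```python
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Orchestrator:
    """Minimal agent loop: ask the LLM, run any tool it requests, stop on a final answer."""
    llm: Callable[[str], dict]            # assumed to return {"tool", "args"} or {"answer"}
    tools: Dict[str, Callable[..., str]]
    max_steps: int = 5

    def run(self, task: str) -> str:
        context = task
        for _ in range(self.max_steps):
            decision = self.llm(context)
            if "answer" in decision:
                return decision["answer"]
            # Execute the requested tool and feed its result back into the context
            result = self.tools[decision["tool"]](**decision.get("args", {}))
            context += f"\n[tool:{decision['tool']}] {result}"
        return "step limit reached"
```

Bounding the loop with `max_steps` is a common safeguard against an LLM that keeps requesting tools indefinitely.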
LLM Integration Layer [Expand]
Provides a standardized interface for interacting with various Large Language Models (LLMs), abstracting away provider specifics and handling streaming responses.
Related Classes/Methods:
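A provider abstraction of this kind is typically an interface that each backend implements. A hedged sketch, with `EchoProvider` standing in for a real provider client (the class and method names here are illustrative, not PikoAi's):

```python
from abc import ABC, abstractmethod
from typing import Iterator


class LLMProvider(ABC):
    """Common interface so the orchestrator stays provider-agnostic."""

    @abstractmethod
    def stream(self, prompt: str) -> Iterator[str]:
        """Yield response chunks as they arrive."""


class EchoProvider(LLMProvider):
    """Stand-in provider that streams the prompt back word by word."""

    def stream(self, prompt: str) -> Iterator[str]:
        for word in prompt.split():
            yield word + " "


def complete(provider: LLMProvider, prompt: str) -> str:
    """Collect a streamed response into a single string."""
    return "".join(provider.stream(prompt)).strip()
```

Because callers only depend on `LLMProvider`, swapping one backend for another requires no changes in the orchestrator.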
Tooling System [Expand]
Manages the registration, discovery, and dynamic execution of various external and internal tools, including a secure Python code executor for dynamic script execution.
Related Classes/Methods:
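Tool registration plus sandboxed execution can be sketched as a registry decorator and a subprocess-based executor. This is an assumption about the shape of such a system, not PikoAi's code; note that a production sandbox would also restrict filesystem and network access, which a bare subprocess with a timeout does not:

```python
import subprocess
import sys
from typing import Callable, Dict

TOOL_REGISTRY: Dict[str, Callable[..., str]] = {}


def tool(name: str):
    """Decorator that registers a callable under a tool name."""
    def register(fn):
        TOOL_REGISTRY[name] = fn
        return fn
    return register


@tool("python_exec")
def python_exec(code: str, timeout: float = 5.0) -> str:
    """Run code in a separate interpreter process with a timeout.

    Illustrative only: a real sandbox needs stronger isolation than this.
    """
    proc = subprocess.run(
        [sys.executable, "-c", code],
        capture_output=True, text=True, timeout=timeout,
    )
    return proc.stdout if proc.returncode == 0 else proc.stderr
```

Running user-supplied code in a child process at least keeps crashes and infinite loops from taking down the main application.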
Configuration Manager [Expand]
Handles the loading, saving, and management of application configurations, environment variables (like API keys), and user preferences across different components.
Related Classes/Methods:
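A minimal sketch of such a manager, assuming a JSON config file for settings and environment variables for API keys (the file format and names are assumptions for illustration):

```python
import json
import os
from pathlib import Path
from typing import Optional


class ConfigManager:
    """Load/save a JSON config file; API keys are read from the environment."""

    def __init__(self, path: Path):
        self.path = path
        self.data = json.loads(path.read_text()) if path.exists() else {}

    def get(self, key: str, default=None):
        return self.data.get(key, default)

    def set(self, key: str, value) -> None:
        self.data[key] = value
        self.path.write_text(json.dumps(self.data, indent=2))

    def api_key(self, provider: str) -> Optional[str]:
        # e.g. OPENAI_API_KEY; secrets stay in the environment, never in the file
        return os.environ.get(f"{provider.upper()}_API_KEY")
```

Keeping secrets out of the persisted file and reading them from the environment is the usual way to make a config file safe to commit or share.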
Terminal Renderer [Expand]
Responsible for formatting and displaying rich, interactive output to the command-line terminal, including AI responses, tool execution details, and markdown rendering.
Related Classes/Methods:
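Tool execution logs like those mentioned above are often set apart visually from AI responses. A stdlib-only sketch of one such formatter (PikoAi may well use a richer terminal library; this function is purely illustrative):

```python
def render_tool_log(tool: str, result: str, width: int = 60) -> str:
    """Format a tool invocation as a boxed, fixed-width terminal block."""
    header = f"┌─ {tool} ".ljust(width - 1, "─") + "┐"
    body = [
        f"│ {line[:width - 4]:<{width - 4}} │"
        for line in (result.splitlines() or [""])
    ]
    footer = "└" + "─" * (width - 2) + "┘"
    return "\n".join([header, *body, footer])
```

Boxing tool output makes it easy for the user to distinguish execution logs from the surrounding AI response text.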
Unclassified [Expand]
Catch-all component for all unclassified files: utility functions, external libraries, and dependencies.
Related Classes/Methods: None