```mermaid
graph LR
    AlgorithmBase["AlgorithmBase"]
    ConsistencyLoss["ConsistencyLoss"]
    CrossEntropyLoss["CrossEntropyLoss"]
    AlgorithmBase -- "uses" --> ConsistencyLoss
    AlgorithmBase -- "uses" --> CrossEntropyLoss
```


## AlgorithmCore

The AlgorithmCore subsystem is the foundation for all semi-supervised and imbalanced learning algorithms in the project. It encapsulates the shared structure and common functionality required to implement and run an algorithm.

### AlgorithmBase

This abstract base class provides the foundational structure and orchestrates the training pipeline for all semi-supervised learning algorithms. It defines the overall flow, managing the model, optimizer, data loaders, and hooks, and declares abstract methods (e.g., train_step) that concrete algorithms must implement.
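The division of labor can be sketched as follows. This is a minimal, illustrative skeleton, not the library's actual API: the attribute and parameter names (`loader_lb`, `loader_ulb`, etc.) are assumptions chosen to mirror the description above.

```python
from abc import ABC, abstractmethod

class AlgorithmBase(ABC):
    """Sketch of the orchestrator role described above (hypothetical names)."""

    def __init__(self, model, optimizer, loader_lb, loader_ulb):
        self.model = model            # backbone network
        self.optimizer = optimizer    # e.g. SGD with momentum
        self.loader_lb = loader_lb    # labeled-batch iterable
        self.loader_ulb = loader_ulb  # unlabeled-batch iterable
        self.losses = []              # per-step losses, kept for inspection

    @abstractmethod
    def train_step(self, x_lb, y_lb, x_ulb):
        """Concrete algorithms compute and return the loss for one batch."""

    def train(self, epochs=1):
        # The base class owns the loop; subclasses only define train_step.
        for _ in range(epochs):
            for (x_lb, y_lb), x_ulb in zip(self.loader_lb, self.loader_ulb):
                self.losses.append(self.train_step(x_lb, y_lb, x_ulb))
                # a real trainer would run backward()/optimizer.step() here
```

A concrete algorithm then only needs to subclass and fill in `train_step`; the base class drives everything else.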

Related Classes/Methods:

### ConsistencyLoss

This component computes the consistency regularization loss, a fundamental criterion in many semi-supervised learning algorithms. Its purpose is to ensure that predictions for perturbed versions of the same input remain consistent, thereby enhancing model robustness.
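One common instantiation of this idea is a mean-squared-error penalty between the model's predictions for two views (e.g., a weakly and a strongly augmented version) of the same input. The framework-agnostic sketch below operates on plain lists of class probabilities; the real component works on tensors.

```python
def consistency_loss(p_weak, p_strong):
    """MSE between predictions for two perturbed views of the same input.

    p_weak, p_strong: equal-length lists of class probabilities
    (a stand-in for the tensor-based implementation).
    """
    assert len(p_weak) == len(p_strong)
    return sum((a - b) ** 2 for a, b in zip(p_weak, p_strong)) / len(p_weak)
```

The loss is zero exactly when the two predictions agree, so minimizing it pushes the model toward perturbation-invariant outputs.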

Related Classes/Methods:

### CrossEntropyLoss

This component provides the standard cross-entropy loss function. It is typically utilized for the supervised learning portion of semi-supervised algorithms or for general classification tasks, serving as a core criterion for measuring prediction error against true labels.
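For a single example, cross-entropy is the negative log-probability the model assigns to the true class, with probabilities obtained by a softmax over raw scores. A minimal, numerically stable sketch (illustrative only, not the component's actual code):

```python
import math

def cross_entropy(logits, target):
    """Softmax over raw scores, then negative log-probability of `target`."""
    m = max(logits)  # subtract the max for numerical stability
    log_sum_exp = math.log(sum(math.exp(z - m) for z in logits))
    return -((logits[target] - m) - log_sum_exp)
```

With uniform logits over two classes the loss is log 2; it shrinks toward zero as the score of the true class dominates.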

Related Classes/Methods: