v1.10.0
Added
- New optimizer module (`optimizers/`) using the Template Method Pattern with explicit hook methods (`_iterate_continuous_batch`, `_iterate_categorical_batch`, `_iterate_discrete_batch`)
- Extended search space dimension types: continuous `(min, max)` tuples, categorical `["a", "b"]` lists, and discrete numerical NumPy arrays
- `DimensionType` enum, `DimensionInfo` dataclass, and `DimensionMasks` for dimension-aware vectorized operations
- Automatic vectorization for search spaces with 1000+ dimensions via `DimensionIteratorMixin`
- `resolution` parameter for `GridSearchOptimizer` and `DirectAlgorithm` to handle continuous dimensions
- Mixed-type distance metric (Gower-like) for the DIRECT algorithm across heterogeneous dimensions
- Lazy search data construction in `ResultsManager` for reduced memory footprint during optimization
- State management via property setters with automatic history tracking in `CoreOptimizer`
- Extended search-space tests for all optimizers
- Examples for mixed and large search spaces
- Sphinx documentation site with landing page, logos, and navigation
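As an illustration of the three dimension types listed above, here is a minimal sketch of a mixed search space; the parameter names and the `classify` helper are hypothetical and not part of the library's API:

```python
import numpy as np

# Hypothetical mixed-type search space covering the three dimension kinds:
# continuous (min, max) tuples, categorical lists, discrete NumPy arrays.
search_space = {
    "learning_rate": (0.0001, 0.1),             # continuous: (min, max) tuple
    "activation": ["relu", "tanh", "sigmoid"],  # categorical: list of choices
    "n_layers": np.arange(1, 10),               # discrete: numerical NumPy array
}

def classify(dim):
    # Toy dimension-type analysis in the spirit of the changelog entry
    # (the real analysis lives in the library; this is an assumption).
    if isinstance(dim, tuple) and len(dim) == 2:
        return "continuous"
    if isinstance(dim, np.ndarray):
        return "discrete"
    return "categorical"

types = {name: classify(dim) for name, dim in search_space.items()}
print(types)
# {'learning_rate': 'continuous', 'activation': 'categorical', 'n_layers': 'discrete'}
```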
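The Gower-like mixed-type distance can be sketched as a range-normalized absolute difference for numeric dimensions plus a 0/1 mismatch term for categorical ones; the function and names below are an illustrative assumption, not the library's implementation:

```python
def gower_like_distance(a, b, dim_types, ranges):
    """Gower-like distance over heterogeneous dimensions (hypothetical sketch):
    range-normalized |a - b| for continuous/discrete dimensions,
    0/1 mismatch for categorical ones, averaged over all dimensions."""
    total = 0.0
    for key, kind in dim_types.items():
        if kind == "categorical":
            total += 0.0 if a[key] == b[key] else 1.0
        else:
            total += abs(a[key] - b[key]) / ranges[key]
    return total / len(dim_types)

dim_types = {"lr": "continuous", "act": "categorical", "layers": "discrete"}
ranges = {"lr": 0.1, "layers": 9}  # max - min per numeric dimension
pa = {"lr": 0.01, "act": "relu", "layers": 2}
pb = {"lr": 0.05, "act": "tanh", "layers": 5}
print(gower_like_distance(pa, pb, dim_types, ranges))
```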
Changed
- All optimizers reimplemented to comply with the new Template Method architecture
- Legacy optimizer implementations preserved in `optimizers_legacy/` (not part of the public API)
- SciPy restored as a core dependency
- Wall clipping algorithm reworked
- Optimizer initialization refactored (`finish_initialization`, `_generate_position` pattern)
- Converter enhanced with dimension type analysis (`_analyze_dimension_types`)
- Updated CI workflow configuration
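The Template Method architecture that the optimizers were reimplemented against can be sketched as a base class whose shared control flow dispatches to per-type hook methods; everything here beyond the hook names (`_iterate_continuous_batch`, `_iterate_categorical_batch`, `_iterate_discrete_batch`) is hypothetical:

```python
class CoreOptimizerSketch:
    """Toy base class: the template method owns the control flow and
    delegates per-dimension-type work to overridable hooks."""

    def iterate(self, batch, dim_type):
        # Template method: shared dispatch, concrete steps live in hooks.
        if dim_type == "continuous":
            return self._iterate_continuous_batch(batch)
        if dim_type == "categorical":
            return self._iterate_categorical_batch(batch)
        return self._iterate_discrete_batch(batch)

    # Hooks that concrete optimizers override:
    def _iterate_continuous_batch(self, batch):
        raise NotImplementedError

    def _iterate_categorical_batch(self, batch):
        raise NotImplementedError

    def _iterate_discrete_batch(self, batch):
        raise NotImplementedError


class RandomStepOptimizer(CoreOptimizerSketch):
    # Toy concrete optimizer filling in the three hooks.
    def _iterate_continuous_batch(self, batch):
        return [x + 0.1 for x in batch]   # toy continuous step

    def _iterate_categorical_batch(self, batch):
        return list(batch)                # toy: keep categories unchanged

    def _iterate_discrete_batch(self, batch):
        return [x + 1 for x in batch]     # toy discrete step


opt = RandomStepOptimizer()
print(opt.iterate([1.0, 2.0], "continuous"))  # [1.1, 2.1]
```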
Fixed
- `finish_initialization` in Downhill Simplex and other optimizers
- `_move_random` in sequential model-based optimizers
- Init position and `evaluate_init` override issues in optimizer subclasses
- Empty scores edge case in evaluation
Full Changelog: v1.9.0...v1.10.0