Changelog

All notable changes to Gradient-Free-Optimizers are documented in this file.

The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.

For detailed release notes, see GitHub Releases.

[Unreleased]

[1.12.0] - 2026-04-18

Added

  • Ask/tell interface in the new gradient_free_optimizers.ask_tell subpackage, exposing all 23 single-objective optimizer algorithms with a batch-capable ask(n=...) / tell(scores) loop and an initial_evaluations constructor parameter for seeded restarts
  • (private) Distributed objective-function evaluation in the _distributed subpackage, with backends Joblib, Ray, Dask, and native Multiprocessing. SMBO optimizers (Bayesian, TPE, Forest) gained _select_diverse_batch to avoid duplicate proposals when asked for a batch of positions
  • (private) Pluggable evaluation storage (_storage) with MemoryStorage (default, in-process dict) and SQLiteStorage (on-disk) backends, allowing the memoization cache to persist across processes and runs
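
The ask/tell pattern above decouples proposing candidates from evaluating them. A minimal pure-Python sketch of the loop shape, using a random-search stand-in — the class name `RandomAskTell` and its internals are illustrative, not the library's actual API:

```python
import random

class RandomAskTell:
    """Illustrative ask/tell-style optimizer (random-search stand-in).

    Shows the loop pattern only; the real ask_tell subpackage exposes
    the library's 23 single-objective optimizers behind this interface.
    """

    def __init__(self, search_space, initial_evaluations=None):
        self.search_space = search_space                 # {name: candidate values}
        self.history = list(initial_evaluations or [])   # seeded (params, score) pairs

    def ask(self, n=1):
        # Propose a batch of n candidate parameter dicts.
        return [{k: random.choice(v) for k, v in self.search_space.items()}
                for _ in range(n)]

    def tell(self, params_batch, scores):
        # Report evaluated scores back to the optimizer.
        self.history.extend(zip(params_batch, scores))

    @property
    def best(self):
        return max(self.history, key=lambda ps: ps[1])

# usage: the caller owns the evaluation, enabling batching/parallelism
random.seed(0)
opt = RandomAskTell({"x": list(range(-5, 6))})
for _ in range(20):
    batch = opt.ask(n=4)
    scores = [-(p["x"] ** 2) for p in batch]   # objective: maximize -x^2
    opt.tell(batch, scores)
best_params, best_score = opt.best
```

Because evaluation happens outside the optimizer, a batch returned by `ask(n=...)` can be scored by any executor before `tell` is called.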

Changed

  • wrap_with_catch rewritten from a closure into a callable class so wrapped objectives remain picklable under the new parallel backends
  • Objective-function result unpacking (score and optional metadata) centralized in a single path
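
The picklability issue behind the `wrap_with_catch` rewrite can be sketched as follows; the names and the `{ExceptionType: fallback_score}` shape here are illustrative, but the closure-vs-class distinction is standard Python behavior:

```python
import pickle

def wrap_with_catch_closure(objective, catch):
    # Closure form: the inner function lives at function-local scope,
    # so the stdlib pickle module cannot serialize it.
    def wrapped(params):
        try:
            return objective(params)
        except tuple(catch) as exc:
            return catch[type(exc)]
    return wrapped

class WrapWithCatch:
    # Callable-class form: instances pickle normally as long as the
    # class and the wrapped objective are importable (module-level).
    def __init__(self, objective, catch):
        self.objective = objective
        self.catch = catch                    # {ExceptionType: fallback_score}

    def __call__(self, params):
        try:
            return self.objective(params)
        except tuple(self.catch) as exc:
            return self.catch[type(exc)]

def objective(params):
    return 1.0 / params["x"]                  # raises ZeroDivisionError at x == 0

wrapped = WrapWithCatch(objective, {ZeroDivisionError: float("-inf")})

try:
    pickle.dumps(wrap_with_catch_closure(objective, {ZeroDivisionError: 0.0}))
    closure_picklable = True
except Exception:
    closure_picklable = False                 # closures can't be pickled
```

The parallel backends need to ship the wrapped objective to worker processes, which is exactly where the closure form fails.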

Fixed

  • Error when the objective function returns a metadata dict alongside the score (affecting search(), the SQLite storage backend, and the distributed base class)
  • Multiprocessing distribution backend now supports spawn start method (Windows and macOS 3.14+) by pickling the objective alongside each parameter dict via starmap; previously it hard-coded fork and broke on platforms where fork is unavailable
  • Async batch refill loop: cache hits consumed an iteration without submitting a future, stalling the worker pool. The refill now retries submission until a future is queued or there is nothing left to dispatch
  • The search_data.rst user guide referenced opt.data and its sub-accessors as public API, but since v1.11.0 the code exposed them only as _data (private). Docs now reflect the actual public surface: opt.search_data, opt.best_score, opt.best_para
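
The cache-hit refill fix can be sketched with a generic worker-pool loop; the function name and data shapes here are hypothetical, not the library's internals:

```python
from concurrent.futures import ThreadPoolExecutor

def refill(executor, pending, candidates, cache, objective, target):
    """Top up `pending` futures to `target`, skipping cache hits.

    The fixed behavior: a cache hit must not consume a refill slot;
    keep pulling candidates until a future is actually submitted or
    the candidate queue is exhausted.
    """
    hits = []
    while len(pending) < target and candidates:
        params = candidates.pop(0)
        key = tuple(sorted(params.items()))
        if key in cache:                      # cache hit: record it, keep looping
            hits.append((params, cache[key]))
            continue
        pending.append(executor.submit(objective, params))  # slot filled
    return hits

# usage: two of the three candidates are already cached, yet the pool
# still receives one real submission
cache = {(("x", 1),): 1.0, (("x", 2),): 4.0}
candidates = [{"x": 1}, {"x": 2}, {"x": 3}]
pending = []
with ThreadPoolExecutor(max_workers=2) as pool:
    hits = refill(pool, pending, candidates, cache, lambda p: p["x"] ** 2, target=1)
    results = hits + [({"x": 3}, pending[0].result())]
```

Before the fix, each cache hit would have returned without submitting, leaving `pending` short and the workers idle.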

Tests

  • Added coverage for the ask/tell interface (including batch semantics), distributed backends, and persistent storage

[1.11.1] - 2026-03-15

Fixed

  • Constraint retry loop now falls back to random positions when the optimizer's own _generate_position repeatedly violates constraints, preventing the search from silently proceeding with invalid positions
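
A sketch of the retry-then-fallback behavior described above, with illustrative helper names:

```python
import random

def generate_valid_position(generate_position, random_position, constraints,
                            max_retries=25):
    """Try the optimizer's own proposal first, then fall back to
    uniformly random positions instead of silently accepting a
    constraint-violating one. Names and signature are illustrative."""
    for proposer in (generate_position, random_position):
        for _ in range(max_retries):
            pos = proposer()
            if all(c(pos) for c in constraints):
                return pos
    raise RuntimeError("no feasible position found")

# usage: the optimizer keeps proposing an infeasible point; the random
# fallback eventually satisfies the constraint x > 0
random.seed(1)
stuck = lambda: {"x": -1}                      # always violates the constraint
rand = lambda: {"x": random.randint(-5, 5)}
pos = generate_valid_position(stuck, rand, [lambda p: p["x"] > 0])
```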

Tests

  • Expanded constraint tests from a representative subset to all optimizers
  • Added constraint tests for categorical dimensions, mixed search spaces (continuous + discrete + categorical), cross-parameter constraints, and adversarial scenarios where the optimum lies inside the excluded region

[1.11.0] - 2026-03-14

Added

  • CMAESOptimizer implementing the CMA-ES (Covariance Matrix Adaptation Evolution Strategy) algorithm with population, mu, sigma, and ipop_restart parameters
  • callbacks parameter in search() accepting a list of functions that receive a frozen CallbackInfo dataclass per iteration; returning False stops the search early
  • catch parameter in search() mapping exception types to fallback scores for graceful error handling during objective function evaluation
  • (private) Data accessor (opt._data) providing 34+ computed metrics including timing breakdowns, convergence data, search statistics, and score distributions
  • Search summary printing via new verbosity options: "print_results", "print_search_stats", "print_statistics", "print_times"
  • gfo-help CLI command showing available metrics and API accessor paths
  • py.typed marker file for PEP 561 type checker support
  • Type annotations across the package
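
How the `callbacks` and `catch` parameters compose can be sketched with a simplified search loop; the `CallbackInfo` field names and the loop itself are illustrative, not the library's implementation:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CallbackInfo:
    # Field names are illustrative, not the library's actual dataclass.
    iteration: int
    score: float
    best_score: float

def run_search(objective, candidates, callbacks=(), catch=None):
    # `catch` maps exception types to fallback scores; any callback
    # returning False stops the search early.
    catch = catch or {}
    best = float("-inf")
    for i, params in enumerate(candidates):
        try:
            score = objective(params)
        except tuple(catch) as exc:
            score = catch[type(exc)]
        best = max(best, score)
        info = CallbackInfo(iteration=i, score=score, best_score=best)
        if any(cb(info) is False for cb in callbacks):
            break
    return best

# Early stopping: the callback returns False once the best score hits 0.
best = run_search(lambda p: -abs(p["x"]),
                  [{"x": x} for x in (3, -2, 0, 5)],
                  callbacks=[lambda info: info.best_score < 0])

# Graceful error handling: ZeroDivisionError maps to a fallback score.
best2 = run_search(lambda p: 1.0 / p["x"],
                   [{"x": 0}, {"x": 4}],
                   catch={ZeroDivisionError: float("-inf")})
```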

Changed

  • Docs: parameter entries added to right-side page TOC, landing page and styling updated

Fixed

  • Unicode encoding in search summary on Windows terminals (falls back to ASCII box-drawing)

[1.10.1] - 2026-02-19

Fixed

  • optimum="minimum" in search() had no effect on the actual optimization: the objective adapter received the raw function instead of the negated one, so the optimizer maximized regardless of the setting. The negation was applied only to the progress-bar display
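
The fix amounts to negating the objective on the optimizer-facing path, not just in the display. A minimal sketch (the adapter name is illustrative):

```python
def make_adapter(objective, optimum="maximum"):
    # The optimizer-facing objective must be negated for
    # optimum="minimum"; the bug applied this negation only to the
    # progress-bar score, not to the optimization path.
    if optimum == "minimum":
        return lambda params: -objective(params)
    return objective

# An internal maximizer driven through the adapter now minimizes
# f(x) = (x - 3)^2 correctly.
f = lambda p: (p["x"] - 3) ** 2
adapted = make_adapter(f, optimum="minimum")
candidates = [{"x": x} for x in range(-5, 6)]
best = max(candidates, key=adapted)           # maximizer sees -f
```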

[1.10.0] - 2026-02-14

Major release introducing a new optimizer architecture based on the Template Method Pattern, extended search space support with continuous, categorical, and discrete dimension types, and vectorized operations for high-dimensional optimization.

Added

  • New optimizer module (optimizers/) using the Template Method Pattern with explicit hook methods (_iterate_continuous_batch, _iterate_categorical_batch, _iterate_discrete_batch)
  • Extended search space dimension types: continuous (min, max) tuples, categorical ["a", "b"] lists, and discrete numerical NumPy arrays
  • DimensionType enum, DimensionInfo dataclass, and DimensionMasks for dimension-aware vectorized operations
  • Automatic vectorization for search spaces with 1000+ dimensions via DimensionIteratorMixin
  • resolution parameter for GridSearchOptimizer and DirectAlgorithm to handle continuous dimensions
  • Mixed-type distance metric (Gower-like) for the DIRECT algorithm across heterogeneous dimensions
  • Lazy search data construction in ResultsManager for reduced memory footprint during optimization
  • State management via property setters with automatic history tracking in CoreOptimizer
  • Extended search-space tests for all optimizers
  • Examples for mixed and large search spaces
  • Sphinx documentation site with landing page, logos, and navigation
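
A Gower-like distance, as referenced above for the DIRECT algorithm, mixes range-normalized numeric differences with 0/1 categorical mismatches. An illustrative sketch — the `dims` encoding here is an assumption, not the library's `DimensionInfo`:

```python
def gower_distance(a, b, dims):
    """Gower-like distance over a heterogeneous search space.

    `dims` maps each name to ("continuous", lo, hi) or ("categorical",):
    numeric dimensions contribute |a - b| normalized by the range,
    categorical dimensions contribute 0/1 on mismatch, and the result
    is the mean contribution, always in [0, 1].
    """
    total = 0.0
    for name, spec in dims.items():
        if spec[0] == "continuous":
            lo, hi = spec[1], spec[2]
            total += abs(a[name] - b[name]) / (hi - lo)
        else:                                 # categorical dimension
            total += 0.0 if a[name] == b[name] else 1.0
    return total / len(dims)

dims = {"lr": ("continuous", 0.0, 1.0), "kernel": ("categorical",)}
d = gower_distance({"lr": 0.2, "kernel": "rbf"},
                   {"lr": 0.7, "kernel": "linear"}, dims)
```

Keeping every per-dimension contribution in [0, 1] is what lets a single metric compare continuous, discrete, and categorical dimensions on equal footing.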

Changed

  • All optimizers reimplemented to comply with the new Template Method architecture
  • Legacy optimizer implementations preserved in optimizers_legacy/ (not part of public API)
  • SciPy restored as a core dependency
  • Wall clipping algorithm reworked
  • Optimizer initialization refactored (finish_initialization, _generate_position pattern)
  • Converter enhanced with dimension type analysis (_analyze_dimension_types)
  • Updated CI workflow configuration

Fixed

  • finish_initialization in Downhill Simplex and other optimizers
  • _move_random in sequential model-based optimizers
  • Init position and evaluate_init override issues in optimizer subclasses
  • Empty scores edge case in evaluation

[1.9.0] - 2026-01-15

Major release focusing on dependency reduction. scikit-learn and SciPy are now optional dependencies, with native Python implementations available for all core functionality.

Added

  • Private array backend (_array_backend) for pure Python array operations without NumPy
  • Private math backend (_math_backend) for mathematical operations without SciPy
  • Native DecisionTreeRegressor implementation
  • Native ExtraTreesRegressor implementation
  • Native RandomForestRegressor implementation
  • Native GradientBoostingRegressor implementation
  • SimpleProgressBar class as fallback when tqdm is unavailable
  • Sigma self-adaptation for EvolutionStrategyOptimizer
  • convergence_threshold parameter for Powell's Method
  • Type hints to all optimizer classes and Search class
  • Comprehensive docstrings for all optimizer classes
  • Sphinx documentation with ReadTheDocs integration
  • API tests for all optimizer categories
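
Sigma self-adaptation in evolution strategies typically follows the log-normal rule σ' = σ · exp(τ · N(0, 1)), so each offspring mutates its own step size before mutating its position. A sketch of that standard rule — not necessarily the library's exact implementation:

```python
import math
import random

def self_adapt_sigma(sigma, n_dims, rng=random):
    # Log-normal self-adaptation: the learning rate tau shrinks with
    # dimensionality, keeping step-size changes gradual.
    tau = 1.0 / math.sqrt(2.0 * n_dims)
    return sigma * math.exp(tau * rng.gauss(0.0, 1.0))

def mutate(parent, sigma, rng=random):
    # Mutate the step size first, then use it to perturb the position,
    # so selection acts on (position, sigma) pairs jointly.
    child_sigma = self_adapt_sigma(sigma, len(parent), rng)
    child = [x + child_sigma * rng.gauss(0.0, 1.0) for x in parent]
    return child, child_sigma

random.seed(42)
child, child_sigma = mutate([0.0, 0.0, 0.0], sigma=1.0)
```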

Changed

  • scikit-learn is now an optional dependency (native estimators used by default)
  • SciPy is now an optional dependency
  • tqdm is now an optional dependency
  • Complete reimplementation of Powell's Method with improved line search algorithms
  • Reworked README with new 3D optimization animation
  • Consolidated CI workflows into single ci.yml
  • Restructured test directory (tests/test_main/, tests/test_internal/, etc.)
  • Improved error messages with actionable suggestions

Removed

  • BayesianRidge estimator
  • Linear GP option from Gaussian Process regressor

Fixed

  • Golden section search algorithm in Powell's Method
  • Mutable default argument anti-pattern (constraints=[] changed to constraints=None)
  • Missing @functools.wraps on internal decorators
  • Division-by-zero edge case in print_times
  • Bug in evaluate method
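
The mutable-default fix above is the standard Python idiom; a minimal reproduction of why `constraints=[]` was a bug:

```python
def search_buggy(constraints=[]):
    # Anti-pattern: the SAME list object is reused across calls, so
    # anything appended here leaks into every subsequent call.
    constraints.append(lambda p: True)
    return len(constraints)

def search_fixed(constraints=None):
    # Fix: default to None and build a fresh list on each call.
    constraints = [] if constraints is None else list(constraints)
    constraints.append(lambda p: True)
    return len(constraints)

buggy = (search_buggy(), search_buggy())   # the default list grows
fixed = (search_fixed(), search_fixed())   # each call starts clean
```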

[1.8.1] - 2025-12-29

Re-release of v1.8.0 with updated package metadata.

[1.8.0] - 2025-12-29

Added

  • Python 3.14 support
  • Package keywords and classifiers for improved discoverability

Changed

  • Dropped Python 3.9 support
  • Test performance improvements

Fixed

  • Sporadic test failures in CI

[1.7.2] - 2025-09-21

Added

  • Native GaussianProcessRegressor implementation with RBF kernel
  • Native KernelDensityEstimator implementation
  • Result dataclass for structured evaluation results
  • ObjectiveAdapter class for cleaner objective function handling
  • Toy test functions (Sphere, Ackley) for benchmarking
  • Python 3.13 support

Changed

  • Refactored memory/caching system using CachedObjectiveAdapter
  • Refactored ResultsManager for improved result collection
  • New optimization stopping implementation
  • Performance improvements to normalize function (up to 90% faster)
  • Performance improvements to LipschitzFunction.find_best_slope (81% faster)

Fixed

  • Issue with maximize/minimize objective function handling

[1.7.1] - 2024-12-07

Added

  • Comprehensive docstrings for all optimizer classes

Changed

  • Dropped Python 3.8 support
  • Improved type hints for constraints, sampling, and initialize parameters
  • Refactored move_climb method to CoreOptimizer
  • Cleaned up class inheritance and removed unused arguments

[1.6.0] - 2024-08-14

Added

  • Python 3.12 support
  • NumPy v2 and Pandas v2 compatibility
  • PyTorch optimizer integration example

Changed

  • Migrated from setup.py to pyproject.toml
  • Moved source code into src/ directory structure

[1.5.0] - 2024-07-22

Added

  • GeneticAlgorithmOptimizer for evolutionary optimization
  • DifferentialEvolutionOptimizer for population-based optimization
  • mutation_rate and crossover_rate parameters for evolutionary algorithms
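
The `mutation_rate` and `crossover_rate` parameters gate the two classic GA operators. A sketch using uniform crossover and per-gene resampling — only the parameter names come from the release notes; the specific operators shown are assumptions:

```python
import random

def crossover(p1, p2, crossover_rate, rng=random):
    # Uniform crossover: with probability crossover_rate, each gene
    # comes from either parent; otherwise parent 1 is cloned.
    if rng.random() > crossover_rate:
        return list(p1)
    return [a if rng.random() < 0.5 else b for a, b in zip(p1, p2)]

def mutate(genome, mutation_rate, values, rng=random):
    # Per-gene mutation: resample each gene with probability
    # mutation_rate from the dimension's candidate values.
    return [rng.choice(values) if rng.random() < mutation_rate else g
            for g in genome]

random.seed(3)
child = mutate(crossover([0, 0, 0, 0], [1, 1, 1, 1], crossover_rate=0.9),
               mutation_rate=0.25, values=[0, 1, 2])
```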

Changed

  • Refactored stochastic hill climbing transition logic
  • Moved discrete recombination method into base class

Fixed

  • Bug in constrained optimization

[1.4.0] - 2024-05-11

Added

  • OrthogonalGridSearchOptimizer for systematic grid search
  • direction parameter for grid search optimizer
  • Search-space value validation

Changed

  • Dropped Python 3.5, 3.6, and 3.7 support
  • Replaced nth_iter with nth_trial for clearer semantics
  • Pandas v2 compatibility improvements

Fixed

  • Probability calculation bug in stochastic hill climbing
  • Bugs in stochastic hill climbing and simulated annealing

[1.3.0] - 2023-04-11

Added

  • Constrained optimization support for most optimizers
  • Constrained optimization examples and documentation

Changed

  • Refactored optimizer and search classes into separate APIs

Fixed

  • Evaluation call from parent class

[1.2.0] - 2022-10-20

Added

  • SpiralOptimization algorithm
  • LipschitzOptimizer algorithm
  • DirectAlgorithm (DIRECT) for global optimization
  • Backend API for low-level optimizer control
  • Python 3.10 and 3.11 support

Changed

  • Major refactoring for more consistent optimizer behavior
  • Improved low-level API
  • Refactored SMBO optimizers into unified pattern
  • Refactored expected improvement into separate module
  • Core optimizer moved into separate module

Fixed

  • Rotation matrix calculation
  • Various fixes for DIRECT algorithm
  • Grid search GCD calculation

[1.0.1] - 2021-12-01

Fixed

  • Bug in grid search
  • Random move and random state handling
  • Various stability improvements over the v1.0.0 release