All notable changes to Gradient-Free-Optimizers are documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.
For detailed release notes, see GitHub Releases.
- Ask/tell interface in the new `gradient_free_optimizers.ask_tell` subpackage, exposing all 23 single-objective optimizer algorithms with a batch-capable `ask(n=...)`/`tell(scores)` loop and an `initial_evaluations` constructor parameter for seeded restarts (see the sketch after this list)
- (private) Distributed objective-function evaluation in the `_distributed` subpackage, with `Joblib`, `Ray`, `Dask`, and native `Multiprocessing` backends. SMBO optimizers (Bayesian, TPE, Forest) gained `_select_diverse_batch` to avoid duplicate proposals when asked for a batch of positions
- (private) Pluggable evaluation storage (`_storage`) with `MemoryStorage` (default, in-process dict) and `SQLiteStorage` (on-disk) backends, allowing the memoization cache to persist across processes and runs
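A minimal sketch of the new loop; the concrete optimizer class name and import path below are assumptions, only `ask(n=...)`, `tell(scores)`, and `initial_evaluations` are named in this entry:

```python
# Hypothetical usage; the class name inside ask_tell is an assumption.
from gradient_free_optimizers.ask_tell import HillClimbingOptimizer

def sphere(para):
    return -(para["x"] ** 2 + para["y"] ** 2)

search_space = {"x": list(range(-10, 11)), "y": list(range(-10, 11))}
opt = HillClimbingOptimizer(search_space)

for _ in range(25):
    candidates = opt.ask(n=4)                     # batch of 4 parameter dicts
    scores = [sphere(para) for para in candidates]
    opt.tell(scores)                              # report scores back, in order
```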
- `wrap_with_catch` rewritten from a closure into a callable class so wrapped objectives remain picklable under the new parallel backends (illustrated below)
- Objective-function result unpacking (score and optional metadata) centralized in a single path
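The motivation in one illustrative snippet (generic names, not the library's actual code): a closure cannot be pickled for process-based workers, while a callable class instance can.

```python
import pickle

def make_wrapper(objective):
    def wrapped(para):            # closure: not picklable by the stdlib
        return objective(para)
    return wrapped

class CatchingWrapper:
    """Callable class: instances pickle cleanly for parallel backends."""
    def __init__(self, objective):
        self.objective = objective
    def __call__(self, para):
        return self.objective(para)

def sphere(para):
    return para["x"] ** 2

pickle.dumps(CatchingWrapper(sphere))   # works
# pickle.dumps(make_wrapper(sphere))    # fails: cannot pickle a local object
```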
- Error when the objective function returns a metadata dict alongside the score (affecting `search()`, the SQLite storage backend, and the distributed base class)
- `Multiprocessing` distribution backend now supports the `spawn` start method (Windows and macOS 3.14+) by pickling the objective alongside each parameter dict via `starmap`; previously it hard-coded `fork` and broke on platforms where `fork` is unavailable (see the sketch after this list)
- Async batch refill loop: cache hits consumed an iteration without submitting a future, stalling the worker pool. The refill now retries submission until a future is queued or there is nothing left to dispatch
- `search_data.rst` user guide referenced `opt.data` and its sub-accessors as public API, but the code has only exposed them as `_data` (private) since v1.11.0. Docs now reflect the actual public surface: `opt.search_data`, `opt.best_score`, `opt.best_para`
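The shape of the `spawn`-safe dispatch described above, as a generic sketch (helper names are illustrative):

```python
import multiprocessing as mp

def sphere(para):
    return -sum(v ** 2 for v in para.values())

def _evaluate(objective, para):   # the objective travels with each para dict,
    return objective(para)        # so spawn workers need no inherited state

if __name__ == "__main__":
    params = [{"x": i} for i in range(8)]
    ctx = mp.get_context("spawn")             # works where fork is unavailable
    with ctx.Pool(processes=4) as pool:
        scores = pool.starmap(_evaluate, [(sphere, p) for p in params])
    print(scores)
```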
- Added coverage for the ask/tell interface (including batch semantics), distributed backends, and persistent storage
- Constraint retry loop now falls back to random positions when the optimizer's own `_generate_position` repeatedly violates constraints, preventing the search from silently proceeding with invalid positions
- Expanded constraint tests from a representative subset to all optimizers
- Added constraint tests for categorical dimensions, mixed search spaces (continuous + discrete + categorical), cross-parameter constraints, and adversarial scenarios where the optimum lies inside the excluded region; the constraint API these tests exercise is sketched below
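For context, a minimal sketch of the constraint API these tests exercise, following the project's documented pattern of passing predicate functions to the optimizer (treat the details as illustrative):

```python
import numpy as np
from gradient_free_optimizers import RandomSearchOptimizer

search_space = {
    "x": np.arange(-10, 10, 0.1),
    "y": np.arange(-10, 10, 0.1),
}

def constraint(para):
    # cross-parameter constraint: only accept positions on one side of a line
    return para["x"] + para["y"] < 5

def objective(para):
    return -(para["x"] ** 2 + para["y"] ** 2)

opt = RandomSearchOptimizer(search_space, constraints=[constraint])
opt.search(objective, n_iter=100)
```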
- `CMAESOptimizer` implementing the CMA-ES (Covariance Matrix Adaptation Evolution Strategy) algorithm with `population`, `mu`, `sigma`, and `ipop_restart` parameters
- `callbacks` parameter in `search()` accepting a list of functions that receive a frozen `CallbackInfo` dataclass per iteration; returning `False` stops the search early (see the sketch after this list)
- `catch` parameter in `search()` mapping exception types to fallback scores for graceful error handling during objective-function evaluation
- (private) Data accessor (`opt._data`) providing 34+ computed metrics, including timing breakdowns, convergence data, search statistics, and score distributions
- Search summary printing via new verbosity options: `"print_results"`, `"print_search_stats"`, `"print_statistics"`, `"print_times"`
- `gfo-help` CLI command showing available metrics and API accessor paths
- `py.typed` marker file for PEP 561 type-checker support
- Type annotations across the package
- Docs: parameter entries added to right-side page TOC, landing page and styling updated
- Unicode encoding in search summary on Windows terminals (falls back to ASCII box-drawing)
optimum="minimum"parameter insearch()had no effect on the actual optimization. The objective adapter received the raw function instead of the negated one, causing the optimizer to maximize regardless of theoptimumsetting. The negation was only applied to the progress bar display.
Major release introducing a new optimizer architecture based on the Template Method Pattern, extended search space support with continuous, categorical, and discrete dimension types, and vectorized operations for high-dimensional optimization.
- New optimizer module (`optimizers/`) using the Template Method Pattern with explicit hook methods (`_iterate_continuous_batch`, `_iterate_categorical_batch`, `_iterate_discrete_batch`)
- Extended search-space dimension types: continuous `(min, max)` tuples, categorical `["a", "b"]` lists, and discrete numerical NumPy arrays (see the sketch after this list)
- `DimensionType` enum, `DimensionInfo` dataclass, and `DimensionMasks` for dimension-aware vectorized operations
- Automatic vectorization for search spaces with 1000+ dimensions via `DimensionIteratorMixin`
- `resolution` parameter for `GridSearchOptimizer` and `DirectAlgorithm` to handle continuous dimensions
- Mixed-type distance metric (Gower-like) for the DIRECT algorithm across heterogeneous dimensions
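The three dimension types side by side (parameter names are illustrative):

```python
import numpy as np

search_space = {
    "learning_rate": (1e-4, 1e-1),           # continuous: (min, max) tuple
    "kernel": ["linear", "rbf", "poly"],     # categorical: list of values
    "n_layers": np.arange(1, 9),             # discrete: numerical NumPy array
}
```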
- Lazy search-data construction in `ResultsManager` for a reduced memory footprint during optimization
- State management via property setters with automatic history tracking in `CoreOptimizer`
CoreOptimizer - Extended search-space tests for all optimizers
- Examples for mixed and large search spaces
- Sphinx documentation site with landing page, logos, and navigation
- All optimizers reimplemented to comply with the new Template Method architecture
- Legacy optimizer implementations preserved in `optimizers_legacy/` (not part of the public API)
- SciPy restored as a core dependency
- Wall clipping algorithm reworked
- Optimizer initialization refactored (`finish_initialization`, `_generate_position` pattern)
- Converter enhanced with dimension-type analysis (`_analyze_dimension_types`)
- Updated CI workflow configuration
- `finish_initialization` in Downhill Simplex and other optimizers
- `_move_random` in sequential model-based optimizers
- Init position and `evaluate_init` override issues in optimizer subclasses
- Empty scores edge case in evaluation
Major release focusing on dependency reduction. scikit-learn and SciPy are now optional dependencies, with native Python implementations available for all core functionality.
- Private array backend (`_array_backend`) for pure-Python array operations without NumPy
- Private math backend (`_math_backend`) for mathematical operations without SciPy
- Native `DecisionTreeRegressor` implementation
- Native `ExtraTreesRegressor` implementation
- Native `RandomForestRegressor` implementation
- Native `GradientBoostingRegressor` implementation
- `SimpleProgressBar` class as a fallback when tqdm is unavailable
- Sigma self-adaptation for `EvolutionStrategyOptimizer` (see the sketch after this list)
- `convergence_threshold` parameter for Powell's Method
- Type hints for all optimizer classes and the `Search` class
- Comprehensive docstrings for all optimizer classes
- Sphinx documentation with ReadTheDocs integration
- API tests for all optimizer categories
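Sigma self-adaptation in evolution strategies follows a log-normal step-size update; a textbook sketch, not the package's exact rule:

```python
import math
import random

def self_adaptive_mutation(x, sigma):
    # log-normal step-size update, then mutate with the new sigma
    tau = 1.0 / math.sqrt(len(x))
    sigma_new = sigma * math.exp(tau * random.gauss(0.0, 1.0))
    x_new = [xi + sigma_new * random.gauss(0.0, 1.0) for xi in x]
    return x_new, sigma_new
```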
- scikit-learn is now an optional dependency (native estimators used by default)
- SciPy is now an optional dependency
- tqdm is now an optional dependency
- Complete reimplementation of Powell's Method with improved line search algorithms
- Reworked README with new 3D optimization animation
- Consolidated CI workflows into a single `ci.yml`
- Restructured test directory (`tests/test_main/`, `tests/test_internal/`, etc.)
- Improved error messages with actionable suggestions
- `BayesianRidge` estimator
- Linear GP option from the Gaussian Process regressor
- Golden section search algorithm in Powell's Method
- Mutable default argument anti-pattern (`constraints=[]` changed to `constraints=None`; see the sketch after this list)
- Missing `@functools.wraps` on internal decorators
- Division-by-zero edge case in print-times
- Bug in evaluate method
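Both fixes in one illustrative snippet (generic code, not the library's):

```python
import functools

def search(objective, n_iter, constraints=None):
    # a fresh list per call avoids state leaking between searches
    constraints = [] if constraints is None else constraints

def with_catch(func):
    @functools.wraps(func)   # keeps __name__/__doc__ of the wrapped objective
    def wrapper(*args, **kwargs):
        return func(*args, **kwargs)
    return wrapper
```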
Re-release of v1.8.0 with updated package metadata.
- Python 3.14 support
- Package keywords and classifiers for improved discoverability
- Dropped Python 3.9 support
- Test performance improvements
- Sporadic test failures in CI
- Native `GaussianProcessRegressor` implementation with RBF kernel (kernel sketched after this list)
- Native `KernelDensityEstimator` implementation
- `Result` dataclass for structured evaluation results
- `ObjectiveAdapter` class for cleaner objective-function handling
- Toy test functions (Sphere, Ackley) for benchmarking
- Python 3.13 support
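The RBF (squared-exponential) kernel such a Gaussian process regressor is built on, as a generic sketch rather than the package's actual implementation:

```python
import math

def rbf_kernel(x1, x2, length_scale=1.0):
    sq_dist = sum((a - b) ** 2 for a, b in zip(x1, x2))
    return math.exp(-sq_dist / (2.0 * length_scale ** 2))
```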
- Refactored memory/caching system using `CachedObjectiveAdapter`
- Refactored `ResultsManager` for improved result collection
- New optimization stopping implementation
- Performance improvements to the `normalize` function (up to 90% faster)
- Performance improvements to `LipschitzFunction.find_best_slope` (81% faster)
- Issue with maximize/minimize objective function handling
- Comprehensive docstrings for all optimizer classes
- Dropped Python 3.8 support
- Improved type hints for the `constraints`, `sampling`, and `initialize` parameters
- Refactored `move_climb` method to `CoreOptimizer`
- Cleaned up class inheritance and removed unused arguments
- Python 3.12 support
- NumPy v2 and Pandas v2 compatibility
- PyTorch optimizer integration example
- Migrated from `setup.py` to `pyproject.toml`
- Moved source code into the `src/` directory structure
- `GeneticAlgorithmOptimizer` for evolutionary optimization
- `DifferentialEvolutionOptimizer` for population-based optimization
- `mutation_rate` and `crossover_rate` parameters for evolutionary algorithms (see the sketch after this list)
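Hypothetical usage of the new parameters; the exact constructor signature is an assumption:

```python
from gradient_free_optimizers import GeneticAlgorithmOptimizer

search_space = {"x": list(range(-10, 11))}
opt = GeneticAlgorithmOptimizer(
    search_space,
    mutation_rate=0.3,    # probability of mutating each offspring
    crossover_rate=0.7,   # probability of recombining two parents
)
opt.search(lambda para: -para["x"] ** 2, n_iter=200)
```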
- Refactored stochastic hill climbing transition logic
- Moved discrete recombination method into base class
- Bug in constrained optimization
- `OrthogonalGridSearchOptimizer` for systematic grid search
- `direction` parameter for the grid search optimizer
- Search-space value validation
- Dropped Python 3.5, 3.6, and 3.7 support
- Replaced `nth_iter` with `nth_trial` for clearer semantics
- Pandas v2 compatibility improvements
- Probability calculation bug in stochastic hill climbing
- Bugs in stochastic hill climbing and simulated annealing
- Constrained optimization support for most optimizers
- Constrained optimization examples and documentation
- Refactored optimizer and search classes into separate APIs
- Evaluation call from parent class
- `SpiralOptimization` algorithm
- `LipschitzOptimizer` algorithm
- `DirectAlgorithm` (DIRECT) for global optimization
- Backend API for low-level optimizer control
- Python 3.10 and 3.11 support
- Major refactoring for more consistent optimizer behavior
- Improved low-level API
- Refactored SMBO optimizers into unified pattern
- Refactored expected improvement into separate module
- Core optimizer moved into separate module
- Rotation matrix calculation
- Various fixes for DIRECT algorithm
- Grid search GCD calculation
- Bug in grid search
- Random move and random state handling
- Various stability improvements from v1.0.0 release