.. _api_reference_ask_tell:

===================
Ask/Tell Optimizers
===================

The optimizers in the ``gradient_free_optimizers.ask_tell`` subpackage
provide the same algorithms as the main package, but with a batch-capable
``ask(n=...)`` / ``tell(scores)`` interface in place of the managed
``search()`` loop. They are the right choice when evaluation must stay
under your control: external worker pools, asynchronous job queues,
distributed clusters, or integration into a larger framework.

For an introduction and a discussion of the trade-offs, see
:doc:`/user_guide/ask_tell`. The constructor signatures match those of the
corresponding main-package optimizers, with two differences:
``initialize`` is replaced by ``initial_evaluations``, and ``nth_process``
is not exposed.


Optimizers
----------

.. autosummary::
   :toctree: generated/
   :template: class.rst

   gradient_free_optimizers.ask_tell.HillClimbingOptimizer
   gradient_free_optimizers.ask_tell.StochasticHillClimbingOptimizer
   gradient_free_optimizers.ask_tell.RepulsingHillClimbingOptimizer
   gradient_free_optimizers.ask_tell.SimulatedAnnealingOptimizer
   gradient_free_optimizers.ask_tell.DownhillSimplexOptimizer
   gradient_free_optimizers.ask_tell.RandomSearchOptimizer
   gradient_free_optimizers.ask_tell.GridSearchOptimizer
   gradient_free_optimizers.ask_tell.RandomRestartHillClimbingOptimizer
   gradient_free_optimizers.ask_tell.RandomAnnealingOptimizer
   gradient_free_optimizers.ask_tell.PatternSearch
   gradient_free_optimizers.ask_tell.PowellsMethod
   gradient_free_optimizers.ask_tell.LipschitzOptimizer
   gradient_free_optimizers.ask_tell.DirectAlgorithm
   gradient_free_optimizers.ask_tell.ParticleSwarmOptimizer
   gradient_free_optimizers.ask_tell.SpiralOptimization
   gradient_free_optimizers.ask_tell.ParallelTemperingOptimizer
   gradient_free_optimizers.ask_tell.GeneticAlgorithmOptimizer
   gradient_free_optimizers.ask_tell.EvolutionStrategyOptimizer
   gradient_free_optimizers.ask_tell.DifferentialEvolutionOptimizer
   gradient_free_optimizers.ask_tell.CMAESOptimizer
   gradient_free_optimizers.ask_tell.TreeStructuredParzenEstimators
   gradient_free_optimizers.ask_tell.ForestOptimizer
   gradient_free_optimizers.ask_tell.BayesianOptimizer


Common Interface
----------------

All ask/tell optimizers expose the same methods and attributes:

.. code-block:: python

    optimizer = OptimizerClass(
        search_space,                # dict: parameter name -> numpy array
        initial_evaluations=[        # list[tuple[dict, float]]
            (params_dict, score),
            ...,
        ],
        constraints=[],              # optional list of constraint callables
        random_state=None,           # optional int seed
        # plus algorithm-specific parameters (epsilon, population, ...)
    )

    params_list = optimizer.ask(n=4)  # list[dict] of length n
    optimizer.tell(scores)            # list[float] of length n

    optimizer.best_score  # float, -inf before any tell()
    optimizer.best_para   # dict
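The loop below sketches how such an optimizer is typically driven: the caller asks for a batch, evaluates it however it likes, and tells the scores back. To keep the sketch self-contained it uses ``RandomAskTell``, a hypothetical stand-in class that implements the same ``ask``/``tell`` protocol with plain random sampling; it is not part of the library, and the real optimizers are constructed as shown above.

```python
import math
import random


class RandomAskTell:
    """Hypothetical stand-in illustrating the ask/tell protocol."""

    def __init__(self, search_space, random_state=None):
        self.search_space = search_space   # dict: name -> sequence of values
        self.rng = random.Random(random_state)
        self.best_score = -math.inf        # -inf before any tell()
        self.best_para = None
        self._pending = []

    def ask(self, n=1):
        # Propose n parameter dicts to evaluate.
        self._pending = [
            {name: self.rng.choice(values)
             for name, values in self.search_space.items()}
            for _ in range(n)
        ]
        return self._pending

    def tell(self, scores):
        # Record one score per dict returned by the last ask().
        for params, score in zip(self._pending, scores):
            if score > self.best_score:
                self.best_score, self.best_para = score, params


search_space = {"x": [v / 10 for v in range(-50, 51)]}
optimizer = RandomAskTell(search_space, random_state=0)

for _ in range(25):
    params_list = optimizer.ask(n=4)
    # Evaluation happens on the caller's side (here: a toy objective,
    # but it could be a worker pool or an async job queue).
    scores = [-(p["x"] ** 2) for p in params_list]
    optimizer.tell(scores)

# optimizer.best_score / optimizer.best_para now hold the incumbent.
```

The driving loop is identical for the library's optimizers; only the construction of ``optimizer`` changes.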