**Motivation**

If you have a real-valued function of any kind, classical derivative-based approaches may not be applicable when searching for its global minima (or maxima). For this kind of problem, different solutions and implementations exist.

Unlike random automated parameter-tuning approaches, **Bayesian Optimisation** methods choose the next hyperparameter values based on the results of past evaluations.
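For contrast, a naive random-search baseline simply samples points independently and keeps the best one seen, without using any information from past evaluations. A minimal pure-Python sketch (the toy objective and bounds here are illustrative, not from the example below):

```python
import random

def objective(x):
    # toy black-box function: no derivative information is used
    return (x - 2.0) ** 2

random.seed(0)  # fixed seed for reproducibility
best_x, best_val = None, float('inf')
for _ in range(500):
    x = random.uniform(-10.0, 10.0)  # sample uniformly, ignoring past results
    val = objective(x)
    if val < best_val:
        best_x, best_val = x, val

print(best_x, best_val)  # close to x = 2, value near 0
```

Bayesian optimization improves on this by building a model of the objective from past samples and using it to pick promising points next.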

Bayesian optimization is a sequential design strategy for global optimization of black-box functions that doesn’t require derivatives. (Wikipedia)

Hyperopt is a Python library implementing the Bayesian optimization approach.

**Examples with single and multivariable functions**

A more readable and complete explanation (with plots!) of the Python code is available in this html page, which can also be found in my Github repository as a Jupyter Notebook and a PDF.

A minimal Python snippet is shown below, but I suggest checking the above-mentioned html page first.

```python
import numpy as np
from hyperopt import fmin, hp, tpe

def my_fcn(x):
    return np.sin(x[0]*(x[1]**2 - x[2])/x[3]) * np.cos(x[0])

x_mins_dict = fmin(
    fn=my_fcn,
    space=[hp.uniform('x_1', -100, 100),  # search range for x[0] from -100 to 100
           hp.uniform('x_2', -200, 100),  # search range for x[1] from -200 to 100
           hp.uniform('x_3', 0, 50),      # search range for x[2] from 0 to 50
           hp.uniform('x_4', -100, -20)   # search range for x[3] from -100 to -20
    ],
    algo=tpe.suggest,
    max_evals=500  # stop searching after 500 iterations
)
```
