For an arbitrary real-valued function, classical derivative-based approaches may not be applicable when searching for global minima (or maxima): the function may be non-differentiable, or its gradient unavailable. For this kind of problem, different solutions and implementations exist.
Unlike random or grid-based parameter tuning, which samples hyperparameter values blindly, Bayesian optimization methods choose the next hyperparameter values based on the results of previously evaluated configurations.
Bayesian optimization is a sequential design strategy for global optimization of black-box functions that doesn’t require derivatives. (Wikipedia)
Hyperopt is a Python library implementing this approach, by default via the Tree-structured Parzen Estimator (TPE) algorithm.
Examples with single- and multivariable functions
A minimal Python example is shown in the following snippet, but I suggest you check the above-mentioned page first.
import numpy as np
from hyperopt import fmin, hp, tpe

def objective(args):
    # example objective to minimise; replace with your own function
    return float(np.sum(np.square(args)))

x_mins_dict = fmin(
    fn=objective,                          # function to minimise
    space=[hp.uniform('x_1', -100, 100),   # search range for x_1 from -100 to 100
           hp.uniform('x_2', -200, 100),   # search range for x_2 from -200 to 100
           hp.uniform('x_3', 0, 50),       # search range for x_3 from 0 to 50
           hp.uniform('x_4', -100, -20)],  # search range for x_4 from -100 to -20
    algo=tpe.suggest,                      # Tree-structured Parzen Estimator
    max_evals=500)                         # stop searching after 500 evaluations