`CMAEvolutionStrategy.NoiseHandling`

— Type

```
NoiseHandling(ασ = 1.1, callback = s -> s > 0,
              r = .3, ϵ = 0., θ = .5, c = .3)
```

The standard settings may work well for noisy objective functions. To avoid premature convergence due to a too fast decrease of sigma, there is the option `noise_handling = CMAEvolutionStrategy.NoiseHandling(ασ = 1.1, callback = s -> s > 0)`. Choose `ασ` such that sigma decreases slowly (and does not diverge). The callback function can be used to change the objective function, e.g. to increase the measurement duration, if this leads to smaller noise. The variable `s` indicates whether CMA-ES can handle the current level of noise: `s > 0` indicates that the noise level is too high. Whenever the callback returns `true`, sigma is multiplied by `ασ`; with the default callback, this is the case when `s > 0`.

For details on noise handling see Hansen 2009.
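For illustration, a minimal sketch of enabling noise handling (the noisy objective `noisy_sphere` and the budget settings are assumptions for this example, not part of the package):

```
using CMAEvolutionStrategy

# Illustrative noisy objective: a sphere function with additive Gaussian noise.
noisy_sphere(x) = sum(abs2, x) + 0.01 * randn()

result = minimize(noisy_sphere, ones(4), 1.,
                  noise_handling = CMAEvolutionStrategy.NoiseHandling(ασ = 1.1,
                                                                      callback = s -> s > 0),
                  verbosity = 0,
                  maxfevals = 5_000)
fbest(result) # best (noisy) function value seen so far
```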

`CMAEvolutionStrategy.NoiseHandling`

— Method`NoiseHandling(n; kwargs...) = NoiseHandling(; ασ = 1 + 2/(n + 10), kwargs...)`

`CMAEvolutionStrategy.fbest`

— Method`fbest(result)`

Extract lowest function value ever evaluated.

`CMAEvolutionStrategy.minimize`

— Method

```
minimize(f, x0, s0;
         lower = nothing,
         upper = nothing,
         constraints = _constraints(lower, upper),
         noise_handling = nothing,
         popsize = 4 + floor(Int, 3*log(length(x0))),
         callback = (o, y, fvals, perm) -> nothing,
         parallel_evaluation = false,
         multi_threading = false,
         verbosity = 1,
         seed = rand(UInt),
         logger = BasicLogger(x0, verbosity = verbosity, callback = callback),
         maxtime = nothing,
         maxiter = nothing,
         maxfevals = nothing,
         stagnation = 100 + 100 * length(x0)^1.5/popsize,
         ftarget = nothing,
         xtol = nothing,
         ftol = 1e-11)
```

Minimize function `f` starting around `x0` with initial covariance `s0 * I` under box constraints `lower .<= x0 .<= upper`, where `x0`, `lower` and `upper` are vectors of the same length, or `nothing`.

The result is an `Optimizer` object from which e.g. `xbest`, `fbest` or `population_mean` can be extracted.

If `parallel_evaluation = true`, the objective function `f` receives matrices of `n` rows (`n = length(x0)`) and `popsize` columns and should return a vector of length `popsize`. To use multi-threaded parallel evaluation of the objective function, set `multi_threading = true` and start julia with multiple threads (cf. the julia manual for the multi-threading setup).

**Example 1**

```
using CMAEvolutionStrategy
function rosenbrock(x)
    n = length(x)
    sum(100 * (x[2i-1]^2 - x[2i])^2 + (x[2i-1] - 1)^2 for i in 1:div(n, 2))
end
result = minimize(rosenbrock, zeros(6), 1.)
xbest(result)           # show best input x
fbest(result)           # show best function value
population_mean(result) # show mean of final population
```

**Example 2**

```
# continuation of Example 1 with parallel evaluation
rosenbrock_parallel(x) = [rosenbrock(xi) for xi in eachcol(x)]
result = minimize(rosenbrock_parallel, zeros(6), 1., parallel_evaluation = true)
```
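The box-constraint keywords can be added to the same call; a minimal sketch (reusing the `rosenbrock` function from Example 1; the chosen box is an assumption for this example):

```
using CMAEvolutionStrategy
function rosenbrock(x)
    n = length(x)
    sum(100 * (x[2i-1]^2 - x[2i])^2 + (x[2i-1] - 1)^2 for i in 1:div(n, 2))
end

# Restrict the search to the box [-0.5, 0.5]^6; the unconstrained optimum
# at x = 1 lies outside this box, so the best point ends up near the boundary.
result = minimize(rosenbrock, zeros(6), 1.,
                  lower = fill(-0.5, 6),
                  upper = fill(0.5, 6),
                  verbosity = 0)
all(-0.5 .<= xbest(result) .<= 0.5)
```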

`CMAEvolutionStrategy.population_mean`

— Method`population_mean(result)`

Extract the current mean of the optimizer.

`CMAEvolutionStrategy.xbest`

— Method`xbest(result)`

Extract input `x` that resulted in the lowest function value ever.