Examples
This package provides several tools for optimization. This section presents examples of using the implemented metaheuristics.
Single-Objective Optimization
julia> using Metaheuristics
julia> f(x) = 10length(x) + sum( x.^2 - 10cos.(2π*x) ) # objective function
f (generic function with 1 method)
julia> bounds = [-5ones(10) 5ones(10)]' # limits/bounds
2×10 adjoint(::Matrix{Float64}) with eltype Float64:
-5.0 -5.0 -5.0 -5.0 -5.0 -5.0 -5.0 -5.0 -5.0 -5.0
5.0 5.0 5.0 5.0 5.0 5.0 5.0 5.0 5.0 5.0
julia> information = Information(f_optimum = 0.0); # information on the minimization problem
julia> options = Options(f_calls_limit = 9000*10, f_tol = 1e-5); # generic settings
julia> algorithm = ECA(information = information, options = options) # metaheuristic used to optimize
ECA(2.0, 7, 0, 0, 0.95, 0.02, Float64[], 0.0, false, false)
julia> result = optimize(f, bounds, algorithm) # start the minimization process
+=========== RESULT ==========+
iteration: 1286
minimum: 1.98992
minimizer: [0.994958638836778, 0.9949586367263603, 7.73757593099707e-10, -1.1493248852473246e-9, -5.787822593506674e-10, 3.6854947254324133e-9, -4.287652076653421e-9, -4.011434949358244e-9, 1.6604468866068952e-9, 2.909879085075321e-9]
f calls: 89951
total time: 1.6777 s
+============================+
julia> minimum(result)
1.9899181141865796
julia> minimizer(result)
10-element Vector{Float64}:
0.994958638836778
0.9949586367263603
7.73757593099707e-10
-1.1493248852473246e-9
-5.787822593506674e-10
3.6854947254324133e-9
-4.287652076653421e-9
-4.011434949358244e-9
1.6604468866068952e-9
2.909879085075321e-9
julia> result = optimize(f, bounds, algorithm) # note that second run is faster
+=========== RESULT ==========+
iteration: 1286
minimum: 1.98992
minimizer: [0.994958638836778, 0.9949586367263603, 7.73757593099707e-10, -1.1493248852473246e-9, -5.787822593506674e-10, 3.6854947254324133e-9, -4.287652076653421e-9, -4.011434949358244e-9, 1.6604468866068952e-9, 2.909879085075321e-9]
f calls: 89881
total time: 0.6977 s
+============================+
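As a quick sanity check on the objective above (a Rastrigin-type function), its global minimum is at the origin, where it evaluates to exactly zero, consistent with `Information(f_optimum = 0.0)`:

```julia
f(x) = 10length(x) + sum( x.^2 - 10cos.(2π*x) )  # same objective as above

f(zeros(10))  # == 0.0, the known global optimum
# the reported minimum ≈ 1.99 indicates that two coordinates converged to a
# local optimum near ±1 instead of 0
```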
Providing Initial Solutions
Sometimes you may want to provide initial solutions before the optimization process begins. This example illustrates how to do it.
julia> using Metaheuristics
julia> f, bounds, optimums = Metaheuristics.TestProblems.get_problem(:sphere);
julia> D = size(bounds,2);
julia> x_known = 0.6ones(D) # known solution
10-element Vector{Float64}:
0.6
0.6
0.6
0.6
0.6
0.6
0.6
0.6
0.6
0.6
julia> X = [ bounds[1,:] + rand(D).* ( bounds[2,:] - bounds[1,:]) for i in 1:19 ]; # random solutions (uniform distribution)
julia> push!(X, x_known); # include the known solution of interest
julia> population = [ Metaheuristics.create_child(x, f(x)) for x in X ]; # generate the population with 19+1 solutions
julia> prev_status = State(Metaheuristics.get_best(population), population); # prior state
julia> method = ECA(N = length(population))
ECA(2.0, 7, 20, 20, 0.95, 0.02, Float64[], 0.0, false, false)
julia> method.status = prev_status; # tell ECA that you have generated a population
julia> optimize(f, bounds, method) # optimize
+=========== RESULT ==========+
iteration: 5001
minimum: 1.33687e-121
minimizer: [1.2055091160305532e-61, 2.8600913489032895e-62, 2.1124295769306863e-61, -1.9476835167624728e-61, 6.23431259281838e-63, -6.462189047382966e-62, -5.7770198524678515e-62, 3.660251615747485e-62, 1.604328640176536e-61, 3.387585280937405e-62]
f calls: 99981
total time: 0.6263 s
+============================+
Constrained Optimization
It is common for optimization models to include constraints that must be satisfied. For example: the Rosenbrock function constrained to a disk
Minimize:
\[{\displaystyle f(x,y)=(1-x)^{2}+100(y-x^{2})^{2}}\]
subject to:
\[{\displaystyle x^{2}+y^{2}\leq 2}\]
where $-2 \leq x,y \leq 2$.
In Metaheuristics.jl, a feasible solution is one that satisfies $g(x) \leq 0$ and $h(x) \approx 0$. Hence, in this example the constraint is given by $g(x) = x^2 + y^2 - 2 \leq 0$. Moreover, the equality and inequality constraints must be stored in Arrays.
In this package, if the algorithm was not designed for constrained optimization, then solutions with lower constraint violation sum will be preferred.
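The violation sum itself is simple to compute. The helper below is a sketch of the usual definition (positive parts of the inequality constraints plus absolute values of the equality constraints); `violation` is an illustrative name, and the package's internal rule may differ in details such as tolerances:

```julia
# sketch of a constraint-violation sum; `violation` is an illustrative helper,
# not part of the package API
violation(gx, hx) = sum(max.(gx, 0.0)) + sum(abs.(hx))

violation([-1.0], [0.0])  # 0.0  → feasible, no violation
violation([0.5], [0.25])  # 0.75 → infeasible, preferred less than smaller sums
```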
julia> using Metaheuristics
julia> function f(x)
x,y = x[1], x[2]
fx = (1-x)^2+100(y-x^2)^2
gx = [x^2 + y^2 - 2] # inequality constraints
hx = [0.0] # equality constraints
# order is important
return fx, gx, hx
end
f (generic function with 1 method)
julia> bounds = [-2.0 -2; 2 2]
2×2 Matrix{Float64}:
-2.0 -2.0
2.0 2.0
julia> optimize(f, bounds, ECA(N=30, K=3))
+=========== RESULT ==========+
iteration: 471
minimum: 0.00191431
minimizer: [0.956247145871321, 0.9144086039621672]
f calls: 14107
feasibles: 30 / 30 in final population
total time: 0.9123 s
+============================+
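You can verify the feasibility of the reported minimizer by evaluating the constraint directly:

```julia
x, y = 0.956247145871321, 0.9144086039621672  # minimizer reported above
g = x^2 + y^2 - 2                             # inequality constraint value
@assert g <= 0                                # feasible: lies inside the disk
```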
Multiobjective Optimization
To define and solve a multiobjective optimization problem, you can proceed as usual. Here, you need to provide constraints if they exist; otherwise, set gx = [0.0]; hx = [0.0];
to indicate an unconstrained multiobjective problem.
julia> using Metaheuristics
julia> function f(x)
# objective functions
v = 1.0 + sum(x .^ 2)
fx1 = x[1] * v
fx2 = (1 - sqrt(x[1])) * v
fx = [fx1, fx2]
# constraints
gx = [0.0] # inequality constraints
hx = [0.0] # equality constraints
# order is important
return fx, gx, hx
end
f (generic function with 1 method)
julia> bounds = [zeros(30) ones(30)]';
julia> optimize(f, bounds, NSGA2())
+=========== RESULT ==========+
iteration: 500
population: F space
┌────────────────────────────────────────┐
2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
f_2 │⡆⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⣃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠘⡄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠈⠢⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠈⠒⠤⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠉⠓⠢⠤⣀⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠉⠑⠒⠦⠤⣀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠉⠑⠒⠤⢤⣀⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
└────────────────────────────────────────┘
0 3
f_1
non-dominated solution(s):
F space
┌────────────────────────────────────────┐
2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
f_2 │⡆⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⣃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠘⡄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠈⠢⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠈⠒⠤⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠉⠓⠢⠤⣀⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠉⠑⠒⠦⠤⣀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠉⠑⠒⠤⢤⣀⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
└────────────────────────────────────────┘
0 3
f_1
f calls: 50000
feasibles: 100 / 100 in final population
total time: 5.0879 s
+============================+
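The non-dominated front summarized above can also be extracted programmatically. The sketch below re-runs the same problem and, assuming the `pareto_front` accessor is available (it is exported in recent versions of Metaheuristics.jl), collects the objective values as a matrix:

```julia
using Metaheuristics

function f(x)
    v = 1.0 + sum(x .^ 2)
    return [x[1] * v, (1 - sqrt(x[1])) * v], [0.0], [0.0]
end

bounds = [zeros(30) ones(30)]'
result = optimize(f, bounds, NSGA2())

# assumed accessor: matrix of non-dominated objective values,
# one row per solution, one column per objective
A = pareto_front(result)
size(A, 2)  # == 2 for this bi-objective problem
```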
Modifying an Existing Metaheuristic
You may need to modify one of the implemented metaheuristics to improve the algorithm performance or to test new mechanisms. This example illustrates how to do it.
It is recommended to put the new methods in modules rather than in the global scope in order to avoid errors.
Let us assume that we want to modify the stopping criterion for ECA. See Contributing for more details.
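Following that recommendation, an override like the one developed next can be wrapped in a small module. The sketch below uses an illustrative module name and a simple objective-value criterion (rather than the diversity-based one shown next); only `Metaheuristics.stop_criteria!` and the `ECA` dispatch come from the package:

```julia
# a minimal sketch: the override lives in a module instead of the global scope.
# `MyECAStop` and the 1e-8 threshold are illustrative choices.
module MyECAStop

using Metaheuristics

function Metaheuristics.stop_criteria!(
        status,
        parameters::ECA,  # dispatch on the metaheuristic being modified
        problem,
        information,
        options,
        args...;
        kargs...
    )
    status.stop && return
    # illustrative criterion: stop once the best objective value is small enough
    status.stop = minimum(status) <= 1e-8
    return
end

end # module
```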
julia> using Metaheuristics
julia> import LinearAlgebra: norm
julia> # overwrite method
function Metaheuristics.stop_criteria!(
status,
parameters::ECA, # It is important indicate the modified Metaheuristic
problem,
information,
options,
args...;
kargs...
)
if status.stop
# nothing to do
return
end
# diversity-based stopping criterion
x_mean = zeros(length(status.population[1].x))
for sol in status.population
x_mean += sol.x
end
x_mean /= length(status.population)
distances_mean = sum(sol -> norm( x_mean - sol.x ), status.population)
distances_mean /= length(status.population)
# stop when solutions are close enough to the geometrical center
new_stop_condition = distances_mean <= 1e-3
status.stop = new_stop_condition
# (optional and not recommended) print when this criterion is met
if status.stop
@info "Diversity-based stop criteria"
@show distances_mean
end
return
end
julia> f, bounds, opt = Metaheuristics.TestProblems.get_problem(:sphere);
julia> optimize(f, bounds, ECA())
[ Info: Diversity-based stop criteria
distances_mean = 0.0009925894154983498
+=========== RESULT ==========+
iteration: 167
minimum: 4.18635e-07
minimizer: [0.0001842529857505795, 0.00036731360054813436, 0.0002031311324773614, -4.090993871154635e-5, 0.0002533080232984601, 4.852555260834754e-5, -0.00017665222361996336, 0.00018799339911827918, -0.0002715370124426023, -5.580304112533588e-6]
f calls: 11674
total time: 1.4279 s
+============================+