Examples
This package provides different tools for optimization. Accordingly, this section presents examples of using the implemented metaheuristics.
Single-Objective Optimization
julia> using Metaheuristics
julia> f(x) = 10length(x) + sum( x.^2 - 10cos.(2π*x) ) # objective function
f (generic function with 1 method)
julia> bounds = [-5ones(10) 5ones(10)]' # limits/bounds
2×10 adjoint(::Matrix{Float64}) with eltype Float64:
 -5.0  -5.0  -5.0  -5.0  -5.0  -5.0  -5.0  -5.0  -5.0  -5.0
  5.0   5.0   5.0   5.0   5.0   5.0   5.0   5.0   5.0   5.0
julia> information = Information(f_optimum = 0.0); # information on the minimization problem
julia> options = Options(f_calls_limit = 9000*10, f_tol = 1e-5); # generic settings
julia> algorithm = ECA(information = information, options = options) # metaheuristic used to optimize
ECA(η_max=2.0, K=7, N=0, N_init=0, p_exploit=0.95, p_bin=0.02, p_cr=Float64[], ε=0.0, adaptive=false, resize_population=false)
julia> result = optimize(f, bounds, algorithm) # start the minimization process
+=========== RESULT ==========+
  iteration: 1286
    minimum: 2.98488
  minimizer: [-0.9949586378788006, 0.9949586372822485, 1.2444946331345849e-9, 0.9949586371169581, 1.986023850228761e-9, -1.7623014793688543e-9, -1.2512883021077943e-9, -2.5225876413813435e-9, 3.516651703738476e-9, 4.340119223240603e-9]
    f calls: 90020
 total time: 1.2488 s
stop reason: Maximum objective function calls exceeded.
+============================+
julia> minimum(result)
2.9848771712798623
julia> minimizer(result)
10-element Vector{Float64}:
 -0.9949586378788006
  0.9949586372822485
  1.2444946331345849e-9
  0.9949586371169581
  1.986023850228761e-9
 -1.7623014793688543e-9
 -1.2512883021077943e-9
 -2.5225876413813435e-9
  3.516651703738476e-9
  4.340119223240603e-9
julia> result = optimize(f, bounds, algorithm) # note that second run is faster
+=========== RESULT ==========+
  iteration: 1286
    minimum: 2.98488
  minimizer: [-0.9949586379362376, 0.9949586372822485, 2.693944125970655e-9, 0.994958638133699, -1.3891553982696725e-9, -1.8152103145835806e-9, -2.614085357419406e-11, 5.648683784449374e-10, 2.422502892822722e-9, 4.720844739123722e-10]
    f calls: 89950
 total time: 0.4600 s
stop reason: Maximum number of iterations exceeded.
+============================+
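As a quick check on the `Information(f_optimum = 0.0)` setting used above, the objective defined in this example (a Rastrigin-type function) attains its known minimum of 0 at the origin. A minimal, package-free sketch:

```julia
# Rastrigin-type objective from the example above
f(x) = 10length(x) + sum(x.^2 - 10cos.(2π*x))

f(zeros(10)) # == 0.0, the value passed as f_optimum
```

The reported minimum of 2.98488 above corresponds to a run trapped with three coordinates near ±1, a well-known local optimum of the Rastrigin function.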
Constrained Optimization
Optimization models commonly include constraints that must be satisfied. For example: the Rosenbrock function constrained to a disk.
Minimize:
\[f(x,y) = (1-x)^2 + 100(y - x^2)^2\]
subject to:
\[x^2 + y^2 \leq 2\]
where $-2 \leq x,y \leq 2$.
In Metaheuristics.jl, a feasible solution is one such that $g(x) \leq 0$ and $h(x) \approx 0$. Hence, in this example the constraint is given by $g(x) = x^2 + y^2 - 2 \leq 0$. Moreover, the equality and inequality constraints must be stored in Arrays.
In this package, even if an algorithm was not designed for constrained optimization, solutions with a lower constraint-violation sum will be preferred.
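For instance, the disk constraint above, rewritten in the required $g(x) \leq 0$ form, can be checked in isolation (a minimal sketch; the sample points are only illustrative):

```julia
# x² + y² ≤ 2 rewritten as g(x) ≤ 0
g(x) = x[1]^2 + x[2]^2 - 2

g([1.0, 0.5]) # -0.75 ≤ 0, feasible
g([2.0, 2.0]) #  6.0 > 0, infeasible
```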
julia> using Metaheuristics
julia> function f(x)
           x, y = x[1], x[2]
           fx = (1 - x)^2 + 100(y - x^2)^2
           gx = [x^2 + y^2 - 2] # inequality constraints
           hx = [0.0]           # equality constraints
           # order is important
           return fx, gx, hx
       end
f (generic function with 1 method)
julia> bounds = [-2.0 -2; 2 2]
2×2 Matrix{Float64}:
 -2.0  -2.0
  2.0   2.0
julia> optimize(f, bounds, ECA(N=30, K=3))
+=========== RESULT ==========+
  iteration: 667
    minimum: 0.0135034
  minimizer: [0.8839812856687168, 0.7807666849972211]
    f calls: 20010
  feasibles: 30 / 30 in final population
 total time: 0.8476 s
stop reason: Maximum objective function calls exceeded.
+============================+
Multiobjective Optimization
To implement and solve a multiobjective optimization problem, you can proceed as usual. Here, you need to provide constraints if they exist; otherwise, set gx = [0.0]; hx = [0.0]; to indicate an unconstrained multiobjective problem.
julia> using Metaheuristics
julia> function f(x)
           # objective functions
           v = 1.0 + sum(x .^ 2)
           fx1 = x[1] * v
           fx2 = (1 - sqrt(x[1])) * v
           fx = [fx1, fx2]

           # constraints
           gx = [0.0] # inequality constraints
           hx = [0.0] # equality constraints
           # order is important
           return fx, gx, hx
       end
f (generic function with 1 method)
julia> bounds = [zeros(30) ones(30)]';
julia> optimize(f, bounds, NSGA2())
+=========== RESULT ==========+
  iteration: 251
 population:
100-element Vector{Metaheuristics.xFgh_solution{Vector{Float64}}}:
 (f = [3.427195437695609e-9, 1.0004182121059373], g = [0.0], h = [0.0], x = [3.426e-09, 1.214e-03, …, 3.341e-04])
 (f = [2.000298972307362, 0.0], g = [0.0], h = [0.0], x = [1.000e+00, 7.094e-03, …, 6.609e-06])
 (f = [0.08535964009437963, 0.7151741410073176], g = [0.0], h = [0.0], x = [8.463e-02, 5.097e-03, …, 3.226e-03])
 (f = [1.8187511580932663, 0.04561212728429282], g = [0.0], h = [0.0], x = [9.528e-01, 1.882e-03, …, 2.210e-04])
 (f = [1.792539469766235, 0.05400674231704061], g = [0.0], h = [0.0], x = [9.439e-01, 5.220e-03, …, 8.269e-05])
 (f = [1.7306436923607453, 0.06753614824113259], g = [0.0], h = [0.0], x = [9.288e-01, 7.113e-03, …, 2.760e-03])
 (f = [0.0037431259848076017, 0.9432964986148105], g = [0.0], h = [0.0], x = [3.726e-03, 9.654e-04, …, 5.089e-04])
 (f = [1.6803706792012192, 0.0821049853051617], g = [0.0], h = [0.0], x = [9.128e-01, 5.257e-03, …, 2.405e-04])
 (f = [1.6549863505187734, 0.08647306933431893], g = [0.0], h = [0.0], x = [9.074e-01, 1.314e-03, …, 6.091e-03])
 (f = [1.6031303016601859, 0.09983591368153921], g = [0.0], h = [0.0], x = [8.920e-01, 6.298e-03, …, 1.074e-03])
 ⋮
 (f = [0.6028759410004177, 0.3741757217947816], g = [0.0], h = [0.0], x = [4.869e-01, 7.191e-04, …, 2.334e-04])
 (f = [1.2737751889826532, 0.18325650196513021], g = [0.0], h = [0.0], x = [7.865e-01, 4.342e-03, …, 1.900e-02])
 (f = [0.7281024652065512, 0.3339657972555343], g = [0.0], h = [0.0], x = [5.554e-01, 8.182e-04, …, 7.323e-04])
 (f = [0.325633886615485, 0.4958265688357909], g = [0.0], h = [0.0], x = [2.981e-01, 8.267e-04, …, 1.407e-06])
 (f = [0.025021192587696323, 0.8428039810741859], g = [0.0], h = [0.0], x = [2.499e-02, 4.399e-03, …, 4.431e-05])
 (f = [0.4103352614455319, 0.4508192171109879], g = [0.0], h = [0.0], x = [3.623e-01, 3.176e-03, …, 4.513e-04])
 (f = [0.011600498035139092, 0.8946955987328478], g = [0.0], h = [0.0], x = [1.157e-02, 6.824e-03, …, 3.804e-03])
 (f = [0.9019444466340873, 0.28275343236290473], g = [0.0], h = [0.0], x = [6.393e-01, 1.361e-02, …, 3.119e-03])
 (f = [1.859648891443321, 0.035420698314874696], g = [0.0], h = [0.0], x = [9.636e-01, 4.084e-04, …, 1.908e-02])
non-dominated solution(s):
100-element Vector{Metaheuristics.xFgh_solution{Vector{Float64}}}:
 (f = [3.427195437695609e-9, 1.0004182121059373], g = [0.0], h = [0.0], x = [3.426e-09, 1.214e-03, …, 3.341e-04])
 (f = [2.000298972307362, 0.0], g = [0.0], h = [0.0], x = [1.000e+00, 7.094e-03, …, 6.609e-06])
 (f = [0.08535964009437963, 0.7151741410073176], g = [0.0], h = [0.0], x = [8.463e-02, 5.097e-03, …, 3.226e-03])
 (f = [1.8187511580932663, 0.04561212728429282], g = [0.0], h = [0.0], x = [9.528e-01, 1.882e-03, …, 2.210e-04])
 (f = [1.792539469766235, 0.05400674231704061], g = [0.0], h = [0.0], x = [9.439e-01, 5.220e-03, …, 8.269e-05])
 (f = [1.7306436923607453, 0.06753614824113259], g = [0.0], h = [0.0], x = [9.288e-01, 7.113e-03, …, 2.760e-03])
 (f = [0.0037431259848076017, 0.9432964986148105], g = [0.0], h = [0.0], x = [3.726e-03, 9.654e-04, …, 5.089e-04])
 (f = [1.6803706792012192, 0.0821049853051617], g = [0.0], h = [0.0], x = [9.128e-01, 5.257e-03, …, 2.405e-04])
 (f = [1.6549863505187734, 0.08647306933431893], g = [0.0], h = [0.0], x = [9.074e-01, 1.314e-03, …, 6.091e-03])
 (f = [1.6031303016601859, 0.09983591368153921], g = [0.0], h = [0.0], x = [8.920e-01, 6.298e-03, …, 1.074e-03])
 ⋮
 (f = [0.6028759410004177, 0.3741757217947816], g = [0.0], h = [0.0], x = [4.869e-01, 7.191e-04, …, 2.334e-04])
 (f = [1.2737751889826532, 0.18325650196513021], g = [0.0], h = [0.0], x = [7.865e-01, 4.342e-03, …, 1.900e-02])
 (f = [0.7281024652065512, 0.3339657972555343], g = [0.0], h = [0.0], x = [5.554e-01, 8.182e-04, …, 7.323e-04])
 (f = [0.325633886615485, 0.4958265688357909], g = [0.0], h = [0.0], x = [2.981e-01, 8.267e-04, …, 1.407e-06])
 (f = [0.025021192587696323, 0.8428039810741859], g = [0.0], h = [0.0], x = [2.499e-02, 4.399e-03, …, 4.431e-05])
 (f = [0.4103352614455319, 0.4508192171109879], g = [0.0], h = [0.0], x = [3.623e-01, 3.176e-03, …, 4.513e-04])
 (f = [0.011600498035139092, 0.8946955987328478], g = [0.0], h = [0.0], x = [1.157e-02, 6.824e-03, …, 3.804e-03])
 (f = [0.9019444466340873, 0.28275343236290473], g = [0.0], h = [0.0], x = [6.393e-01, 1.361e-02, …, 3.119e-03])
 (f = [1.859648891443321, 0.035420698314874696], g = [0.0], h = [0.0], x = [9.636e-01, 4.084e-04, …, 1.908e-02])
    f calls: 50100
  feasibles: 100 / 100 in final population
 total time: 2.7815 s
stop reason: Maximum objective function calls exceeded.
+============================+
Bilevel Optimization
Bilevel optimization problems can be solved with the package BilevelHeuristics.jl, which extends Metaheuristics.jl to handle such hierarchical problems.
Define the objective functions corresponding to the bilevel optimization (BO) problem.
Upper level (leader problem):
using BilevelHeuristics
F(x, y) = sum(x.^2) + sum(y.^2)
bounds_ul = [-ones(5) ones(5)]
Lower level (follower problem):
f(x, y) = sum((x - y).^2) + y[1]^2
bounds_ll = [-ones(5) ones(5)];
Approximate solution:
res = optimize(F, f, bounds_ul, bounds_ll, BCA())
Output:
+=========== RESULT ==========+
iteration: 108
minimum:
F: 4.03387e-10
f: 2.94824e-10
minimizer:
x: [-1.1460768817533927e-5, 7.231706879604178e-6, 3.818596951258517e-6, 2.294324313691869e-6, 1.8770952450067828e-6]
y: [1.998748659975197e-6, 9.479307908087866e-6, 6.180041276047425e-6, -7.642051857319683e-6, 2.434166021682429e-6]
F calls: 2503
f calls: 5062617
Message: Stopped due UL function evaluations limitations.
total time: 26.8142 s
+============================+
See BilevelHeuristics documentation for more information.
Decision-Making
Although Metaheuristics is focused on the optimization part, some decision-making algorithms are available in this package (see Multi-Criteria Decision-Making).
The following example shows how to perform a posteriori decision-making.
julia> # load the problem
julia> f, bounds, pf = Metaheuristics.TestProblems.ZDT1();
julia> # perform multi-objective optimization
julia> res = optimize(f, bounds, NSGA2());
julia> # user preferences
julia> w = [0.5, 0.5];
julia> # set the decision-making algorithm
julia> dm_method = CompromiseProgramming(Tchebysheff())
julia> # find the best decision
julia> sol = best_alternative(res, w, dm_method)
(f = [0.38493217206706115, 0.38037042164979956], g = [0.0], h = [0.0], x = [3.849e-01, 7.731e-06, …, 2.362e-07])
Providing Initial Solutions
Sometimes you may need to provide initial solutions before the optimization process begins. This example illustrates how to do it.
julia> using Metaheuristics
julia> f, bounds, optimums = Metaheuristics.TestProblems.get_problem(:sphere);
julia> D = size(bounds,2);
julia> x_known = 0.6ones(D) # known solution
10-element Vector{Float64}:
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6
julia> X = [ bounds[1,:] + rand(D).* ( bounds[2,:] - bounds[1,:]) for i in 1:19 ]; # random solutions (uniform distribution)
julia> push!(X, x_known); # add the solution of interest
julia> population = [ Metaheuristics.create_child(x, f(x)) for x in X ]; # generate the population with 19+1 solutions
julia> prev_status = State(Metaheuristics.get_best(population), population); # prior state
julia> method = ECA(N = length(population))
ECA(η_max=2.0, K=7, N=20, N_init=20, p_exploit=0.95, p_bin=0.02, p_cr=Float64[], ε=0.0, adaptive=false, resize_population=false)
julia> method.status = prev_status; # tell ECA that you have generated a population
julia> optimize(f, bounds, method) # optimize
+=========== RESULT ==========+
  iteration: 5001
    minimum: 5.10234e-128
  minimizer: [-9.398295827770772e-66, 4.617796153434813e-65, 9.050965377785709e-65, 1.4821083942252015e-64, 1.1533797317143859e-64, 1.0638343240849482e-65, 5.3996676527274764e-65, 4.182641191976609e-65, -3.381610438981925e-66, -2.348699830683956e-65]
    f calls: 100000
 total time: 0.6245 s
stop reason: Maximum objective function calls exceeded.
+============================+
Batch Evaluation
Evaluating multiple solutions at the same time can reduce the overall computation time. To do this, define your function on an N × D input matrix and return the function values as matrices, with the outcomes for each of the N solutions in rows. You also need to set parallel_evaluation=true in the Options to indicate that your f is prepared for parallel evaluation.
f(X) = begin
    fx = sum(X.^2, dims=2)        # objective function ∑x²
    gx = sum(X.^2, dims=2) .- 0.5 # inequality constraints ∑x² ≤ 0.5
    hx = zeros(0, 0)              # equality constraints (none)
    fx, gx, hx
end
options = Options(parallel_evaluation=true)
res = optimize(f, [-10ones(5) 10ones(5)], ECA(options=options))
See Parallelization tutorial for more details.
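Since f receives the whole N × D matrix at once, you can also parallelize the evaluation yourself, for example with threads. A minimal sketch (assuming Julia was started with multiple threads, e.g. julia -t 4; the row-wise loop and the name f_batch are illustrative, not part of the package API):

```julia
using Base.Threads

# batch objective: evaluate every row of X, possibly in parallel
function f_batch(X)
    N = size(X, 1)
    fx = zeros(N, 1)
    @threads for i in 1:N
        fx[i, 1] = sum(X[i, :] .^ 2) # objective for the i-th solution
    end
    gx = fx .- 0.5   # inequality constraints ∑x² ≤ 0.5
    hx = zeros(0, 0) # no equality constraints
    fx, gx, hx
end
```

This f_batch can then be passed to optimize together with Options(parallel_evaluation=true), exactly as in the example above.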
Modifying an Existing Metaheuristic
You may need to modify one of the implemented metaheuristics to improve the algorithm performance or test new mechanisms. This example illustrates how to do it.
Be cautious when modifying a metaheuristic, since those changes will overwrite the default method for that metaheuristic.
Let's assume that we want to modify the stop criteria for ECA. See Contributing for more details.
julia> using Metaheuristics
julia> import LinearAlgebra: norm
julia> # overwrite method

julia> function Metaheuristics.stop_criteria!(
           status,
           parameters::ECA, # It is important to indicate the modified Metaheuristic
           problem,
           information,
           options,
           args...;
           kargs...
       )
           if status.stop
               # nothing to do
               return
           end

           # Diversity-based stop criteria
           x_mean = zeros(length(status.population[1].x))
           for sol in status.population
               x_mean += sol.x
           end
           x_mean /= length(status.population)

           distances_mean = sum(sol -> norm(x_mean - sol.x), status.population)
           distances_mean /= length(status.population)

           # stop when solutions are close enough to the geometrical center
           new_stop_condition = distances_mean <= 1e-3
           status.stop = new_stop_condition

           # (optional and not recommended) print when this criterion is met
           if status.stop
               @info "Diversity-based stop criterium"
               @show distances_mean
           end

           return
       end
julia> f, bounds, opt = Metaheuristics.TestProblems.get_problem(:sphere);
julia> optimize(f, bounds, ECA())
[ Info: Diversity-based stop criterium
distances_mean = 0.0009550440879620734
+=========== RESULT ==========+
  iteration: 177
    minimum: 2.87719e-07
  minimizer: [0.00021129266031863047, -5.401817400004931e-5, -3.875074004821907e-5, -4.180551708518826e-6, 7.960959048665091e-5, 7.646961756568228e-5, 0.00034584180794529915, -5.388860937721664e-6, -1.745179710987299e-5, 0.00032636123123083006]
    f calls: 12390
 total time: 0.3007 s
stop reason: Unknown stop reason.
+============================+
Restarting search
Metaheuristics.Restart — Type

Restart(optimizer, every=100)

Resets the optimizer every specified number of iterations (100 by default).
Example
julia> f, bounds, _ = Metaheuristics.TestProblems.rastrigin();
julia> optimize(f, bounds, Restart(ECA(), every=200))
Customization
The restart condition can be updated by overloading the restart_condition method:
function Metaheuristics.restart_condition(status, restart::Restart, information, options)
    status.iteration % restart.every == 0
end