# Examples

This package provides a variety of tools for optimization. This section presents examples of how to use the implemented metaheuristics.

## Single-Objective Optimization

```julia
julia> using Metaheuristics

julia> f(x) = 10length(x) + sum( x.^2 - 10cos.(2π*x) ) # objective function
f (generic function with 1 method)

julia> bounds = [-5ones(10) 5ones(10)]' # limits/bounds
2×10 LinearAlgebra.Adjoint{Float64,Array{Float64,2}}:
 -5.0  -5.0  -5.0  -5.0  -5.0  -5.0  -5.0  -5.0  -5.0  -5.0
  5.0   5.0   5.0   5.0   5.0   5.0   5.0   5.0   5.0   5.0

julia> information = Information(f_optimum = 0.0); # information on the minimization problem

julia> options = Options(f_calls_limit = 9000*10, f_tol = 1e-5); # generic settings

julia> algorithm = ECA(information = information, options = options) # metaheuristic used to optimize
ECA(η_max=2.0, K=7, N=0, N_init=0, p_exploit=0.95, p_bin=0.02, p_cr=Float64[], ε=0.0, adaptive=false, resize_population=false)

julia> result = optimize(f, bounds, algorithm) # start the minimization process
+=========== RESULT ==========+
  iteration: 1286
    minimum: 0.994959
  minimizer: [-6.0150071606852654e-9, -1.9944598235689818e-10, 3.116951142728195e-9, 6.37436561035407e-11, 0.9949586370632609, 2.4502959988009856e-9, 4.76265222601682e-9, 6.459635198148625e-9, -4.098866254528586e-9, 4.238173422017679e-10]
    f calls: 90020
 total time: 1.9770 s
stop reason: Maximum objective function calls exceeded.
+============================+

julia> minimum(result)
0.9949590570932969

julia> minimizer(result)
10-element Array{Float64,1}:
 -6.0150071606852654e-9
 -1.9944598235689818e-10
  3.116951142728195e-9
  6.37436561035407e-11
  0.9949586370632609
  2.4502959988009856e-9
  4.76265222601682e-9
  6.459635198148625e-9
 -4.098866254528586e-9
  4.238173422017679e-10

julia> result = optimize(f, bounds, algorithm) # note that the second run is faster
+=========== RESULT ==========+
  iteration: 1286
    minimum: 0.994959
  minimizer: [-6.0150071606852654e-9, -1.9944598235689818e-10, 3.116951142728195e-9, 6.37436561035407e-11, 0.9949586370632609, 2.4502959988009856e-9, 4.76265222601682e-9, 6.459635198148625e-9, -4.098866254528586e-9, 4.238173422017679e-10]
    f calls: 89950
 total time: 0.2927 s
stop reason: Maximum number of iterations exceeded.
+============================+
```
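As a quick sanity check, the Rastrigin objective defined above attains its known global minimum of zero at the origin, which is why `f_optimum = 0.0` is passed via `Information`:

```julia
# Rastrigin function, as defined in the example above
f(x) = 10length(x) + sum(x.^2 - 10cos.(2π*x))

f(zeros(10))  # 0.0: the global minimum, attained at the origin
```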

## Constrained Optimization

It is common for optimization models to include constraints that must be satisfied. For example, consider the Rosenbrock function constrained to a disk:

Minimize:

$$f(x,y) = (1-x)^{2} + 100(y-x^{2})^{2}$$

subject to:

$$x^{2} + y^{2} \leq 2$$

where $-2 \leq x,y \leq 2$.

In Metaheuristics.jl, a feasible solution is one that satisfies $g(x) \leq 0$ and $h(x) \approx 0$. Hence, in this example the constraint is given by $g(x) = x^2 + y^2 - 2 \leq 0$. Moreover, the equality and inequality constraints must be stored in `Array`s.

**Constraints handling:** In this package, if an algorithm was not designed for constrained optimization, then solutions with a lower constraint-violation sum are preferred.

```julia
julia> using Metaheuristics

julia> function f(x)
           x, y = x[1], x[2]

           fx = (1-x)^2 + 100(y-x^2)^2
           gx = [x^2 + y^2 - 2] # inequality constraints
           hx = [0.0] # equality constraints

           # order is important
           return fx, gx, hx
       end
f (generic function with 1 method)

julia> bounds = [-2.0 -2; 2 2]
2×2 Array{Float64,2}:
 -2.0  -2.0
  2.0   2.0

julia> optimize(f, bounds, ECA(N=30, K=3))
+=========== RESULT ==========+
  iteration: 313
    minimum: 0.119774
  minimizer: [0.6539166938136622, 0.4276070422509717]
    f calls: 9390
  feasibles: 30 / 30 in final population
 total time: 1.7607 s
stop reason: Small difference of objective function values.
+============================+
```
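As a check, the inequality constraint is satisfied at the reported minimizer, i.e. the point lies strictly inside the disk (a minimal sketch restating $g$ from the example; `g` below is an illustrative helper, not a package function):

```julia
# inequality constraint from the example: feasible when g(x, y) ≤ 0
g(x, y) = x^2 + y^2 - 2

g(0.6539166938136622, 0.4276070422509717) < 0  # true: inside the disk
```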

## Multiobjective Optimization

A multiobjective optimization problem is implemented and solved in much the same way. Provide the constraints if they exist; otherwise return `gx = [0.0]; hx = [0.0]` to indicate an unconstrained multiobjective problem.

```julia
julia> using Metaheuristics

julia> function f(x)
           # objective functions
           v = 1.0 + sum(x .^ 2)
           fx1 = x[1] * v
           fx2 = (1 - sqrt(x[1])) * v

           fx = [fx1, fx2]

           # constraints
           gx = [0.0] # inequality constraints
           hx = [0.0] # equality constraints

           # order is important
           return fx, gx, hx
       end
f (generic function with 1 method)

julia> bounds = [zeros(30) ones(30)]';

julia> optimize(f, bounds, NSGA2())
+=========== RESULT ==========+
  iteration: 251
 population: (unicode scatter plot of the final population in objective space, f₁ vs f₂)
non-dominated solution(s): (unicode scatter plot of the non-dominated front, f₁ vs f₂)
    f calls: 50100
  feasibles: 100 / 100 in final population
 total time: 5.1213 s
stop reason: Maximum objective function calls exceeded.
+============================+
```
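To make the expected return format concrete, here is a minimal sketch of the same objective evaluated at a single, illustrative decision vector; the result is a tuple of (objective vector, inequality constraints, equality constraints):

```julia
function f(x)
    v = 1.0 + sum(x .^ 2)
    fx = [x[1] * v, (1 - sqrt(x[1])) * v]
    return fx, [0.0], [0.0] # unconstrained
end

# one 30-dimensional candidate with x₁ = 0.25 and the rest zero
fx, gx, hx = f(vcat(0.25, zeros(29)))
```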

## Bilevel Optimization

Bilevel optimization problems can be solved using the package BilevelHeuristics.jl, which extends Metaheuristics.jl to handle these hierarchical problems.

First, define the objective functions corresponding to the bilevel optimization (BO) problem.

Upper level (leader problem):

```julia
using BilevelHeuristics

F(x, y) = sum(x.^2) + sum(y.^2)
bounds_ul = [-ones(5) ones(5)]
```

Lower level (follower problem):

```julia
f(x, y) = sum((x - y).^2) + y[1]^2
bounds_ll = [-ones(5) ones(5)];
```

Approximate solution:

```julia
res = optimize(F, f, bounds_ul, bounds_ll, BCA())
```

Output:

```
+=========== RESULT ==========+
  iteration: 108
    minimum:
          F: 4.03387e-10
          f: 2.94824e-10
  minimizer:
          x: [-1.1460768817533927e-5, 7.231706879604178e-6, 3.818596951258517e-6, 2.294324313691869e-6, 1.8770952450067828e-6]
          y: [1.998748659975197e-6, 9.479307908087866e-6, 6.180041276047425e-6, -7.642051857319683e-6, 2.434166021682429e-6]
    F calls: 2503
    f calls: 5062617
    Message: Stopped due UL function evaluations limitations.
 total time: 26.8142 s
+============================+
```
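For intuition, both levels attain their minimum of zero at $x = y = 0$, which matches the near-zero `F` and `f` values reported above (a minimal sketch restating the two objectives):

```julia
# upper- and lower-level objectives from the example
F(x, y) = sum(x.^2) + sum(y.^2)
f(x, y) = sum((x - y).^2) + y[1]^2

F(zeros(5), zeros(5))  # 0.0
f(zeros(5), zeros(5))  # 0.0
```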

## Decision-Making

Although Metaheuristics is focused on the optimization part, some decision-making algorithms are available in this package (see Multi-Criteria Decision-Making).

The following example shows how to perform a posteriori decision-making.

```julia
julia> # load the problem
       f, bounds, pf = Metaheuristics.TestProblems.ZDT1();

julia> # perform multi-objective optimization
       res = optimize(f, bounds, NSGA2());

julia> # user preferences
       w = [0.5, 0.5];

julia> # set the decision-making algorithm
       dm_method = CompromiseProgramming(Tchebysheff())

julia> # find the best decision
       sol = best_alternative(res, w, dm_method)
(f = [0.38493217206706115, 0.38037042164979956], g = [0.0], h = [0.0], x = [3.849e-01, 7.731e-06, …, 2.362e-07])
```
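The idea behind `CompromiseProgramming(Tchebysheff())` can be sketched in plain Julia: among candidate objective vectors, pick the one minimizing the weighted Chebyshev (Tchebysheff) distance to an ideal point. The helper, the candidate vectors, and the assumed ideal point (the origin) below are all illustrative, not part of the package API:

```julia
# weighted Chebyshev scalarization with respect to an ideal point z
tcheby(fx, w, z) = maximum(w .* abs.(fx .- z))

F = [[0.2, 0.9], [0.38, 0.38], [0.9, 0.2]]  # hypothetical non-dominated objective vectors
w = [0.5, 0.5]                              # user preferences

best = argmin([tcheby(fx, w, zeros(2)) for fx in F])  # index 2: the balanced trade-off
```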

## Providing Initial Solutions

Sometimes you may want to provide initial solutions before the optimization process begins; this example illustrates how to do it.

```julia
julia> using Metaheuristics

julia> f, bounds, optimums = Metaheuristics.TestProblems.get_problem(:sphere);

julia> D = size(bounds, 2);

julia> x_known = 0.6ones(D) # known solution
10-element Array{Float64,1}:
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6

julia> X = [ bounds[1,:] + rand(D).*( bounds[2,:] - bounds[1,:]) for i in 1:19 ]; # random solutions (uniform distribution)

julia> push!(X, x_known); # add the solution of interest

julia> population = [ Metaheuristics.create_child(x, f(x)) for x in X ]; # generate the population with 19+1 solutions

julia> prev_status = State(Metaheuristics.get_best(population), population); # prior state

julia> method = ECA(N = length(population))
ECA(η_max=2.0, K=7, N=20, N_init=20, p_exploit=0.95, p_bin=0.02, p_cr=Float64[], ε=0.0, adaptive=false, resize_population=false)

julia> method.status = prev_status; # tell ECA that a population has already been generated

julia> optimize(f, bounds, method) # optimize
+=========== RESULT ==========+
  iteration: 5001
    minimum: 1.87813e-129
  minimizer: [-1.3942689313133156e-65, 6.765366436274763e-66, 2.907545875953879e-66, 2.4256121222673704e-66, 5.484054901305611e-66, 1.4502731010356664e-66, 8.019444384789044e-66, -3.7045584273126063e-65, 1.2436796207025467e-65, 2.8524874203267463e-67]
    f calls: 100000
 total time: 0.5748 s
stop reason: Maximum objective function calls exceeded.
+============================+
```

## Batch Evaluation

Evaluating multiple solutions at the same time can reduce computational time. To do so, define your function on an N×D input matrix (one candidate solution per row) and return the objective values and constraints as matrices with one row of outcomes per solution. You also need to set `parallel_evaluation=true` in the `Options` to indicate that your `f` is prepared for parallel evaluation.

```julia
f(X) = begin
    fx = sum(X.^2, dims=2)        # objective function ∑x²
    gx = sum(X.^2, dims=2) .- 0.5 # inequality constraints ∑x² ≤ 0.5
    hx = zeros(0, 0)              # equality constraints
    fx, gx, hx
end

options = Options(parallel_evaluation=true)

res = optimize(f, [-10ones(5) 10ones(5)], ECA(options=options))
```

See Parallelization tutorial for more details.
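Calling the batch objective directly on a small matrix shows the expected shapes, with one row of outputs per candidate solution; the 2×5 input below is illustrative:

```julia
f(X) = begin
    fx = sum(X.^2, dims=2)        # objective values, one row per solution
    gx = sum(X.^2, dims=2) .- 0.5 # inequality constraints, one row per solution
    hx = zeros(0, 0)              # no equality constraints
    fx, gx, hx
end

X = [zeros(5)'; ones(5)']  # two candidate solutions in rows
fx, gx, hx = f(X)
size(fx)  # (2, 1): one objective value per candidate
```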

## Modifying an Existing Metaheuristic

You may need to modify one of the implemented metaheuristics to improve its performance or to test new mechanisms. This example illustrates how to do it.

**Modifying algorithms could break stuff:** Be cautious when modifying a metaheuristic, because your changes will overwrite the default method for that metaheuristic.

Let's assume that we want to modify the stop criteria for ECA. See Contributing for more details.

```julia
julia> using Metaheuristics

julia> import LinearAlgebra: norm

julia> # overwrite method
       function Metaheuristics.stop_criteria!(
               status,
               parameters::ECA, # it is important to indicate the modified metaheuristic
               problem,
               information,
               options,
               args...;
               kargs...
           )

           if status.stop
               # nothing to do
               return
           end

           # diversity-based stop criterion

           x_mean = zeros(length(status.population[1].x))
           for sol in status.population
               x_mean += sol.x
           end
           x_mean /= length(status.population)

           distances_mean = sum(sol -> norm(x_mean - sol.x), status.population)
           distances_mean /= length(status.population)

           # stop when solutions are close enough to the geometric center
           new_stop_condition = distances_mean <= 1e-3

           status.stop = new_stop_condition

           # (optional and not recommended) print when this criterion is met
           if status.stop
               @info "Diversity-based stop criterion"
               @show distances_mean
           end

           return
       end

julia> f, bounds, opt = Metaheuristics.TestProblems.get_problem(:sphere);

julia> optimize(f, bounds, ECA())
[ Info: Diversity-based stop criterion
distances_mean = 0.0009823809007036535
+=========== RESULT ==========+
  iteration: 182
    minimum: 2.98634e-07
  minimizer: [-0.00021021640914564788, -0.000210909985126485, -4.98822316920385e-5, 0.00021283027422631647, 0.00014478996729677587, -0.0001676252482282625, 7.694193842238447e-5, 0.0003102846509278373, -7.522567820478985e-5, 7.250375301821462e-5]
    f calls: 12740
 total time: 0.3308 s
stop reason: Unknown stop reason.
+============================+
```
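The diversity measure used in the custom stop criterion can be tested on its own: it is the mean distance of the population members to their centroid. The toy population below is illustrative:

```julia
using LinearAlgebra: norm

# toy population of four 2-D decision vectors at the corners of the unit square
population = [[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]

x_mean = sum(population) / length(population)  # centroid: [0.5, 0.5]

distances_mean = sum(x -> norm(x_mean - x), population) / length(population)
# distances_mean == √0.5 ≈ 0.707, far above the 1e-3 threshold, so no stop yet
```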