# Examples

This package provides several tools for optimization. This section presents examples of how to use the implemented metaheuristics.

## Single-Objective Optimization

```julia
julia> using Metaheuristics

julia> f(x) = 10length(x) + sum( x.^2 - 10cos.(2π*x) ) # objective function
f (generic function with 1 method)

julia> bounds = [-5ones(10) 5ones(10)]' # limits/bounds
 -5.0  -5.0  -5.0  -5.0  -5.0  -5.0  -5.0  -5.0  -5.0  -5.0
  5.0   5.0   5.0   5.0   5.0   5.0   5.0   5.0   5.0   5.0

julia> information = Information(f_optimum = 0.0); # information on the minimization problem

julia> options = Options(f_calls_limit = 9000*10, f_tol = 1e-5); # generic settings

julia> algorithm = ECA(information = information, options = options) # metaheuristic used to optimize
ECA(η_max=2.0, K=7, N=0, N_init=0, p_exploit=0.95, p_bin=0.02, p_cr=Float64[], ε=0.0, adaptive=false, resize_population=false)

julia> result = optimize(f, bounds, algorithm) # start the minimization process
+=========== RESULT ==========+
  iteration: 1286
    minimum: 1.98992
  minimizer: [1.453025604091675e-9, 2.055908409952351e-9, 0.9949586362432281, 2.561532061146415e-10, -3.2563405102141868e-9, 1.3342729509398442e-9, 1.3662261006842684e-9, -6.965532325334813e-10, -0.9949586457680533, 3.2867557219462686e-10]
    f calls: 89951
 total time: 2.4241 s
+============================+

julia> minimum(result)
1.9899181141865938

julia> minimizer(result)
10-element Array{Float64,1}:
  1.453025604091675e-9
  2.055908409952351e-9
  0.9949586362432281
  2.561532061146415e-10
 -3.2563405102141868e-9
  1.3342729509398442e-9
  1.3662261006842684e-9
 -6.965532325334813e-10
 -0.9949586457680533
  3.2867557219462686e-10

julia> result = optimize(f, bounds, algorithm) # note that the second run is faster
+=========== RESULT ==========+
  iteration: 1286
    minimum: 1.98992
  minimizer: [1.453025604091675e-9, 2.055908409952351e-9, 0.9949586362432281, 2.561532061146415e-10, -3.2563405102141868e-9, 1.3342729509398442e-9, 1.3662261006842684e-9, -6.965532325334813e-10, -0.9949586457680533, 3.2867557219462686e-10]
    f calls: 89881
 total time: 0.4582 s
+============================+
```
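As a quick sanity check, the Rastrigin-style objective above attains its minimum of 0 at the origin, which is why `Information(f_optimum = 0.0)` is passed to the algorithm. A minimal sketch in plain Julia (no package required):

```julia
# Rastrigin-style objective from the example above
f(x) = 10length(x) + sum(x .^ 2 - 10cos.(2π * x))

x0 = zeros(10)          # global minimizer
println(f(x0))          # 0.0 at the origin
println(f(0.5ones(10))) # strictly positive away from the origin
```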

## Providing Initial Solutions

Sometimes you may want to provide initial (starter) solutions before the optimization process begins. This example illustrates how to do it.

```julia
julia> using Metaheuristics

julia> f, bounds, optimums = Metaheuristics.TestProblems.get_problem(:sphere);

julia> D = size(bounds, 2);

julia> x_known = 0.6ones(D) # known solution
10-element Array{Float64,1}:
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6
 0.6

julia> X = [ bounds[1,:] + rand(D) .* ( bounds[2,:] - bounds[1,:] ) for i in 1:19 ]; # random solutions (uniform distribution)

julia> push!(X, x_known); # append the known solution

julia> population = [ Metaheuristics.create_child(x, f(x)) for x in X ]; # generate the population with 19+1 solutions

julia> prev_status = State(Metaheuristics.get_best(population), population); # prior state

julia> method = ECA(N = length(population))
ECA(η_max=2.0, K=7, N=20, N_init=20, p_exploit=0.95, p_bin=0.02, p_cr=Float64[], ε=0.0, adaptive=false, resize_population=false)

julia> method.status = prev_status; # tell ECA that a population has already been generated

julia> optimize(f, bounds, method) # optimize
+=========== RESULT ==========+
  iteration: 5001
    minimum: 9.85527e-126
  minimizer: [-1.9462804982696402e-63, -1.4094529337401117e-63, 4.189181713913609e-64, 2.182953177304719e-64, 3.883235038508061e-65, 4.062527104597257e-64, 1.5803611981906268e-63, -1.0875822437763305e-63, -4.313363417269948e-65, -9.365281152953165e-65]
    f calls: 99981
 total time: 0.3593 s
+============================+
```
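The key idea above is that seeding the initial population with a good known solution gives the search a head start. The seeding step itself can be sketched in plain Julia; the hand-written `sphere` function and the box bounds here are illustrative stand-ins for the package's `:sphere` test problem:

```julia
sphere(x) = sum(abs2, x)            # stand-in for the package's :sphere problem

D = 10
lb, ub = -10ones(D), 10ones(D)      # assumed box bounds for illustration

# 19 random candidates plus one known good solution
X = [lb .+ rand(D) .* (ub .- lb) for _ in 1:19]
push!(X, 0.6ones(D))

# the seeded solution should (almost surely) be the best in the initial set
fitness = sphere.(X)
best = argmin(fitness)
println(best == 20)                 # typically true: the known solution wins
```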

## Constrained Optimization

Optimization models commonly include constraints that must be satisfied. For example, consider the Rosenbrock function constrained to a disk.

Minimize:

$$f(x,y) = (1-x)^2 + 100(y - x^2)^2$$

subject to:

$$x^2 + y^2 \leq 2$$

where $-2 \leq x,y \leq 2$.

In Metaheuristics.jl, a feasible solution is such that $g(x) \leq 0$ and $h(x) \approx 0$. Hence, in this example the constraint is given by $g(x) = x^2 + y^2 - 2 \leq 0$. Moreover, the equality and inequality constraints must be returned in arrays.
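This feasibility rule can be written directly: a solution is feasible when every inequality constraint is non-positive and every equality constraint is numerically zero. A small sketch, assuming a tolerance `ε` for the equalities (the helper `is_feasible` is illustrative, not part of the package API):

```julia
# feasibility under the g(x) ≤ 0, h(x) ≈ 0 convention
is_feasible(gx, hx; ε = 1e-8) = all(gx .<= 0) && all(abs.(hx) .< ε)

# constraint from the Rosenbrock-on-a-disk example: g(x) = x² + y² - 2
g(x, y) = [x^2 + y^2 - 2]

println(is_feasible(g(1.0, 0.5), [0.0]))  # true: inside the disk
println(is_feasible(g(2.0, 2.0), [0.0]))  # false: outside the disk
```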

**Constraint handling**

In this package, if an algorithm was not designed for constrained optimization, then solutions with a lower constraint violation sum are preferred.

```julia
julia> using Metaheuristics

julia> function f(x)
           x, y = x[1], x[2]

           fx = (1-x)^2 + 100(y-x^2)^2
           gx = [x^2 + y^2 - 2] # inequality constraints
           hx = [0.0] # equality constraints

           # order is important
           return fx, gx, hx
       end
f (generic function with 1 method)

julia> bounds = [-2.0 -2; 2 2]
2×2 Array{Float64,2}:
 -2.0  -2.0
  2.0   2.0

julia> optimize(f, bounds, ECA(N=30, K=3))
+=========== RESULT ==========+
  iteration: 667
    minimum: 0.053796
  minimizer: [0.7685495591426458, 0.5891626635107599]
    f calls: 20010
  feasibles: 30 / 30 in final population
 total time: 1.5259 s
+============================+
```

## Multiobjective Optimization

To define and solve a multiobjective optimization problem, you can proceed as usual. Provide constraints if they exist; otherwise, return `gx = [0.0]; hx = [0.0]` to indicate an unconstrained multiobjective problem.

```julia
julia> using Metaheuristics

julia> function f(x)
           # objective functions
           v = 1.0 + sum(x .^ 2)
           fx1 = x[1] * v
           fx2 = (1 - sqrt(x[1])) * v

           fx = [fx1, fx2]

           # constraints
           gx = [0.0] # inequality constraints
           hx = [0.0] # equality constraints

           # order is important
           return fx, gx, hx
       end
f (generic function with 1 method)

julia> bounds = [zeros(30) ones(30)]';

julia> optimize(f, bounds, NSGA2())
+=========== RESULT ==========+
  iteration: 500
 population:                  F space
       ┌────────────────────────────────────────┐
     1 │⡁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⢅⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⢘⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⡃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠘⡄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠘⡄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
   f₂  │⠀⠀⠀⠘⢆⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠀⠀⠀⠁⢀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠀⠀⠀⠀⠀⠘⠢⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠑⠢⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠑⠢⢄⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠲⢤⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠓⠢⢄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
     0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠒⠠⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       └────────────────────────────────────────┘
       0                                        3
                          f₁
non-dominated solution(s):
                             F space
       ┌────────────────────────────────────────┐
     1 │⡁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⢅⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⢘⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⡃⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠘⡄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠘⡄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
   f₂  │⠀⠀⠀⠘⢆⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠀⠀⠀⠁⢀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠀⠀⠀⠀⠀⠘⠢⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠑⠢⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠑⠢⢄⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠲⢤⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠓⠢⢄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
     0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠒⠠⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
       └────────────────────────────────────────┘
       0                                        3
                          f₁
    f calls: 50000
  feasibles: 100 / 100 in final population
 total time: 5.4877 s
+============================+
```
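For multiobjective problems the relevant comparison is Pareto dominance rather than a single minimum: NSGA-II keeps the solutions no other solution dominates. A sketch of the standard dominance test (this helper is illustrative, not part of the package API):

```julia
# a dominates b (minimization): no worse in every objective, strictly better in at least one
dominates(a, b) = all(a .<= b) && any(a .< b)

println(dominates([0.2, 0.8], [0.5, 0.9]))  # true
println(dominates([0.2, 0.8], [0.1, 0.9]))  # false: trade-off, mutually non-dominated
```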

## Modifying an Existing Metaheuristic

You may need to modify one of the implemented metaheuristics to improve its performance or to test new mechanisms. This example illustrates how to do it.

**Modifying algorithms could break stuff**

It is recommended to put the new methods in modules rather than in the global scope in order to avoid errors.

Let's assume that we want to modify the stop criteria for ECA. See Contributing for more details.

```julia
julia> using Metaheuristics

julia> import LinearAlgebra: norm

julia> # overwrite method
       function Metaheuristics.stop_criteria!(
               status,
               parameters::ECA, # it is important to indicate the modified metaheuristic
               problem,
               information,
               options,
               args...;
               kargs...
           )

           if status.stop
               # nothing to do
               return
           end

           # diversity-based stop criteria

           x_mean = zeros(length(status.population[1].x))
           for sol in status.population
               x_mean += sol.x
           end
           x_mean /= length(status.population)

           distances_mean = sum(sol -> norm( x_mean - sol.x ), status.population)
           distances_mean /= length(status.population)

           # stop when solutions are close enough to the geometrical center
           new_stop_condition = distances_mean <= 1e-3

           status.stop = new_stop_condition

           # (optional and not recommended) print when this criterion is met
           if status.stop
               @info "Diversity-based stop criteria"
               @show distances_mean
           end

           return
       end

julia> f, bounds, opt = Metaheuristics.TestProblems.get_problem(:sphere);

julia> optimize(f, bounds, ECA())
[ Info: Diversity-based stop criteria
distances_mean = 0.0009956873791493875
+=========== RESULT ==========+
  iteration: 183
    minimum: 4.79325e-07
  minimizer: [-0.00017131704767053542, -0.0001653995659191938, -1.0320962837594345e-5, 0.00028321458543870696, -0.0002554619119175432, -0.0001258118792650535, -0.0001293343259078651, -0.00016117574483573725, 0.0003909602697019154, 0.00025623581278110506]
    f calls: 12799
 total time: 1.2931 s
+============================+
```
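The stopping rule above is simply the mean distance of the population to its centroid; once that spread collapses below the threshold, the search stops. The measure in isolation, using plain vectors in place of the package's solution objects:

```julia
using LinearAlgebra: norm

# mean distance from each point to the population centroid
function mean_spread(points)
    c = sum(points) / length(points)
    return sum(p -> norm(p - c), points) / length(points)
end

spread_out = [rand(5) for _ in 1:20]          # diverse population
collapsed  = [1e-5 * rand(5) for _ in 1:20]   # nearly converged population

println(mean_spread(spread_out) > 1e-3)       # true: keep iterating
println(mean_spread(collapsed) <= 1e-3)       # true: diversity-based stop fires
```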