Getting Started

After reading this tutorial you'll know how to use the Metaheuristics module.

Minimization Problem

Assume you want to optimize the following minimization problem:

Minimize:

\[f(x) = 10D + \sum_{i=1}^{D} \left( x_i^2 - 10\cos(2\pi x_i) \right)\]

where $x\in [-5, 5]^D$, that is, each coordinate in $x$ is between -5 and 5. Use $D=10$.

Note that the global optimum is obtained when $x_i = 0$ for all $i$. Thus, $\min f(x) = 0$.

Objective function:

f(x) = 10length(x) + sum( x.^2 - 10cos.(2pi*x) )
f (generic function with 1 method)
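As a quick sanity check, evaluating f at the known optimizer returns zero, confirming $\min f(x) = 0$:

f(zeros(10))
0.0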

Bounds:

bounds = BoxConstrainedSpace(lb = -5ones(10), ub = 5ones(10))
BoxConstrainedSpace{Float64}([-5.0, -5.0, -5.0, -5.0, -5.0, -5.0, -5.0, -5.0, -5.0, -5.0], [5.0, 5.0, 5.0, 5.0, 5.0, 5.0, 5.0, 5.0, 5.0, 5.0], [10.0, 10.0, 10.0, 10.0, 10.0, 10.0, 10.0, 10.0, 10.0, 10.0], 10, true)
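The problem uses $D=10$; for a different dimension, only the bound vectors change. For example, parameterizing by D:

D = 10
bounds = BoxConstrainedSpace(lb = -5ones(D), ub = 5ones(D))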

Providing Information

Since the optimum is known, we can provide this information to the optimizer.

information = Information(f_optimum = 0.0)
Information(0.0, Float64[])
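The second field in the printed Information corresponds to the optimizer's location (empty here). Since we also know $x^* = 0$, it could be supplied as well; a sketch, assuming an x_optimum keyword matching the field shown above (the rest of the tutorial keeps only f_optimum):

information = Information(f_optimum = 0.0, x_optimum = zeros(10))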

Common Settings

Users often need to limit the number of generations/iterations or the number of objective function evaluations. Here, let's assume the metaheuristic may evaluate the objective function at most $9000D$ times. Moreover, since the optimum is known, we can set the desired accuracy ($|f(x) - f(x^*)|$) to $10^{-5}$.

options = Options(f_calls_limit = 9000*10, f_tol = 1e-5);
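Options accepts further settings, such as an iteration limit or a random seed for reproducibility. The following is a sketch; these keywords are assumptions and their availability may vary across versions:

options = Options(f_calls_limit = 9000*10, f_tol = 1e-5, iterations = 1000, seed = 1);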

Choose a Metaheuristic

Metaheuristics.jl provides various metaheuristics for optimization, such as the Evolutionary Centers Algorithm (ECA), Differential Evolution (DE), and Particle Swarm Optimization (PSO). In this tutorial we will use ECA, but you can use another algorithm following the same steps.

The metaheuristics accept their own parameters, but they share two common and optional keyword arguments: information and options.

algorithm = ECA(information = information, options = options)
Algorithm Parameters
====================
  ECA(η_max=2.0, K=7, N=0, N_init=0, p_exploit=0.95, p_bin=0.02, ε=0.0, adaptive=false, resize_population=false)

Optimization Result
===================
  Empty status.
Note: consider changing the default parameters, e.g., the population size for high-dimensional problems.
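For instance, the population size corresponds to the N parameter shown above and can be set explicitly (100 is just an illustrative value; the run below uses the defaults):

algorithm = ECA(N = 100, information = information, options = options)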

Optimize

Now we are able to approximate the optimum. To do that, call the optimize function as follows:

result = optimize(f, bounds, algorithm)
Optimization Result
===================
  Iteration:       1286
  Minimum:         6.96471
  Minimizer:       [7.18572e-11, 3.85002e-09, 1.29255e-09, …, -4.24857e-09]
  Function calls:  90020
  Total time:      0.3687 s
  Stop reason:     Maximum objective function calls exceeded.

Get the Results

Once optimize stops, we can retrieve the approximate solution. Note that this particular run exhausted the budget of function evaluations while several coordinates remained trapped near local minima of the Rastrigin function (the values around $\pm 0.99$ and $-1.99$ below), which is why the reported minimum is about 6.96 rather than 0.

Approximate minimum:

fx = minimum(result)
6.964708361833942

Approximate minimizer:

x = minimizer(result)
10-element Vector{Float64}:
  7.185718864798145e-11
  3.850023007302432e-9
  1.2925470715075545e-9
 -1.5773505243911296e-9
 -0.9949586389374668
  0.9949586367536691
  0.9949586387179171
 -1.9899122324118408
  1.643581537829554e-9
 -4.24856976967872e-9
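As a consistency check, evaluating the objective function at this minimizer reproduces the reported minimum:

f(x)
6.964708361833942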

Get Information about the Resulting Population

Sometimes it is useful to analyze the resulting population (for population-based metaheuristics). To do that, you can use fvals to get the objective function values and positions to get the corresponding solutions, as shown below.
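For example (a sketch, assuming positions returns the population as a matrix with one row per individual):

population_fx = fvals(result)     # objective value of each individual
population_x  = positions(result) # positions of the individuals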

Bonus

We recommend wrapping your program in a function for performance; Julia compiles and optimizes code inside functions:

using Metaheuristics

function main()
    # objective function
    f(x) = 10length(x) + sum( x.^2 - 10cos.(2π*x) )

    # limits/bounds
    bounds = BoxConstrainedSpace(lb = -5ones(10), ub = 5ones(10))

    # information on the minimization problem
    information = Information(f_optimum = 0.0)

    # generic settings
    options = Options(f_calls_limit = 9000*10, f_tol = 1e-5)

    # metaheuristic used to optimize
    algorithm = ECA(information = information, options = options)

    # start the minimization process
    result = optimize(f, bounds, algorithm)

    fx = minimum(result)
    x = minimizer(result)

    x, fx
end

main()
([-4.2252467672932e-9, 2.9293550189813002e-9, -2.5176897397499105e-9, 5.633829934502006e-10, -2.893388148458334e-9, 3.6933660066302294e-9, -9.713322866297521e-10, 0.9949586364866625, -3.479911930464337e-9, -2.4024014222661465e-9], 0.9949590570932969)

Summary

Now you are able to approximate global optima using Metaheuristics.jl.