API References

Metaheuristics.optimize (Function)
  optimize(
        f::Function, # objective function
        search_space,
        method::AbstractAlgorithm = ECA();
        logger::Function = (status) -> nothing,
  )

Minimize an n-dimensional function f with domain search_space (a 2×n matrix whose rows are the lower and upper bounds) using method = ECA() by default.

Example

Minimize f(x) = Σx² where x ∈ [-10, 10]³.

Solution:

julia> f(x) = sum(x.^2)
f (generic function with 1 method)

julia> bounds = [  -10.0 -10 -10; # lower bounds
                    10.0  10 10 ] # upper bounds
2×3 Array{Float64,2}:
 -10.0  -10.0  -10.0
  10.0   10.0   10.0

julia> result = optimize(f, bounds)
+=========== RESULT ==========+
  iteration: 1429
    minimum: 2.5354499999999998e-222
  minimizer: [-1.5135301653303966e-111, 3.8688354844737692e-112, 3.082095708730726e-112]
    f calls: 29989
 total time: 0.1543 s
+============================+
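
The logger keyword receives the current State during the run; a minimal sketch that records the best objective value at each call (fmin_history is a name introduced here for illustration):

julia> fmin_history = Float64[];

julia> result = optimize(f, bounds, ECA(); logger = st -> push!(fmin_history, minimum(st)));
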
Metaheuristics.optimize! (Function)
optimize!(f, search_space, method; logger)

Perform an iteration of method and save the results in method.status.

Example

f, bounds, _ = Metaheuristics.TestProblems.sphere();
method = ECA()
while !Metaheuristics.should_stop(method)
    optimize!(f, bounds, method)
end
result = Metaheuristics.get_result(method)

See also optimize.

Metaheuristics.State (Type)
State datatype

State is used to store the current metaheuristic status. In fact, the optimize function returns a State.

  • best_sol stores the best solution found so far.
  • population is an Array{typeof(best_sol)} for population-based algorithms.
  • f_calls is the number of objective function evaluations.
  • g_calls is the number of inequality constraint evaluations.
  • h_calls is the number of equality constraint evaluations.
  • iteration is the current iteration.
  • success_rate is the percentage of newly generated solutions better than their parents.
  • convergence is used to save the State at each iteration.
  • start_time saves the time() before the optimization process.
  • final_time saves the time() after the optimization process.
  • stop if true, then the optimization process stops.

Example

julia> f(x) = sum(x.^2)
f (generic function with 1 method)

julia> bounds = [  -10.0 -10 -10; # lower bounds
                    10.0  10 10 ] # upper bounds
2×3 Array{Float64,2}:
 -10.0  -10.0  -10.0
  10.0   10.0   10.0

julia> state = optimize(f, bounds)
+=========== RESULT ==========+
| Iter.: 1009
| f(x) = 7.16271e-163
| solution.x = [-7.691251412064516e-83, 1.0826961235605951e-82, -8.358428300092186e-82]
| f calls: 21190
| Total time: 0.2526 s
+============================+

julia> minimum(state)
7.162710802659093e-163

julia> minimizer(state)
3-element Array{Float64,1}:
 -7.691251412064516e-83
  1.0826961235605951e-82
 -8.358428300092186e-82
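
The fields listed above can be read directly from the returned State; continuing the example:

julia> state.iteration
1009

julia> state.f_calls
21190
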
Metaheuristics.Information (Type)
Information Structure

Information can be used to store the true optimum in order to stop a metaheuristic early.

Properties:

  • f_optimum known minimum.
  • x_optimum known minimizer.

If Options is provided, then optimize will stop when |f(x) - f(x_optimum)| < Options.f_tol or ‖x - x_optimum‖ < Options.x_tol (Euclidean distance).

Example

If you want an approximation to the minimum with an accuracy of 1e-3 (i.e., |f(x) - f(x*)| < 1e-3), then you may use Information.

julia> f(x) = sum(x.^2)
f (generic function with 1 method)

julia> bounds = [  -10.0 -10 -10; # lower bounds
                    10.0  10 10 ] # upper bounds
2×3 Array{Float64,2}:
 -10.0  -10.0  -10.0
  10.0   10.0   10.0

julia> information = Information(f_optimum = 0.0)
Information(0.0, Float64[])

julia> options = Options(f_tol = 1e-3)
Options(0.0, 0.001, 0.0, 0.0, 1000.0, 0.0, 0.0, 0, false, true, false, :minimize)

julia> state = optimize(f, bounds, ECA(information=information, options=options))
+=========== RESULT ==========+
| Iter.: 22
| f(x) = 0.000650243
| solution.x = [0.022811671589729583, 0.007052331140376011, -0.008951836265056107]
| f calls: 474
| Total time: 0.0106 s
+============================+
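
A known minimizer can be supplied in the same way via x_optimum, so that optimize stops once ‖x - x_optimum‖ < Options.x_tol; a sketch, assuming the keyword constructor mirrors the f_optimum case above:

julia> information = Information(f_optimum = 0.0, x_optimum = zeros(3));

julia> state = optimize(f, bounds, ECA(information = information, options = Options(x_tol = 1e-6)));
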
Metaheuristics.Options (Type)
Options(;
    x_tol::Real = 1e-8,
    f_tol::Real = 1e-12,
    f_tol_rel::Real = eps(),
    f_tol_abs::Real = 0.0,
    g_tol::Real = 0.0,
    h_tol::Real = 0.0,
    f_calls_limit::Real = 0,
    time_limit::Real = Inf,
    iterations::Int = 1,
    store_convergence::Bool = false,
    debug::Bool = false,
    seed = rand(UInt),
    rng  = default_rng_mh(seed),
    parallel_evaluation = false,
    verbose = false,
)

Options stores common settings for metaheuristics such as the maximum number of iterations, debug options, the maximum number of function evaluations, etc.

Main properties:

  • x_tol tolerance to the true minimizer if specified in Information.
  • f_tol tolerance to the true minimum if specified in Information.
  • f_tol_rel relative tolerance.
  • f_calls_limit is the maximum number of function evaluations.
  • time_limit is the maximum time optimize can spend, in seconds.
  • iterations is the maximum number of allowed iterations.
  • store_convergence if true, then the current State is pushed into State.convergence at each generation/iteration.
  • debug if true, then the optimize function reports the current State (and other relevant information) at each iteration.
  • seed non-negative integer for the random generator seed.
  • parallel_evaluation enables batch evaluations (see the sketch after the example below).
  • verbose shows simplified results at each iteration.
  • rng user-defined random number generator.

Example

julia> options = Options(f_calls_limit = 1000, debug=false, seed=1);

julia> f(x) = sum(x.^2)
f (generic function with 1 method)

julia> bounds = [  -10.0 -10 -10; # lower bounds
                    10.0  10 10 ] # upper bounds
2×3 Array{Float64,2}:
 -10.0  -10.0  -10.0
  10.0   10.0   10.0

julia> state = optimize(f, bounds, ECA(options=options));
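
When parallel_evaluation = true, the objective is evaluated in batches; a sketch, assuming f receives a matrix whose rows are candidate solutions and returns the corresponding vector of objective values:

julia> f_batch(X) = [sum(x.^2) for x in eachrow(X)];

julia> state = optimize(f_batch, bounds, ECA(options = Options(parallel_evaluation = true)));
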
Metaheuristics.convergence (Function)
convergence(state)

Get the data (a tuple with the numbers of function evaluations and the function values) to plot the convergence graph.

Example

julia> f(x) = sum(x.^2)
f (generic function with 1 method)

julia> bounds = [  -10.0 -10 -10; # lower bounds
                    10.0  10 10 ] # upper bounds
2×3 Array{Float64,2}:
 -10.0  -10.0  -10.0
  10.0   10.0   10.0

julia> state = optimize(f, bounds, ECA(options=Options(store_convergence=true)))
+=========== RESULT ==========+
| Iter.: 1022
| f(x) = 7.95324e-163
| solution.x = [-7.782044850211721e-82, 3.590044165897827e-82, -2.4665318114710003e-82]
| f calls: 21469
| Total time: 0.3300 s
+============================+

julia> n_fes, fxs = convergence(state);
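
The returned vectors can be fed to any plotting package; a sketch using Plots (assumed installed, not a dependency of Metaheuristics):

julia> using Plots

julia> plot(n_fes, fxs; xlabel = "f calls", ylabel = "f(x)", yscale = :log10)
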
Base.minimum (Method)
minimum(state::Metaheuristics.State)

Returns the approximation to the minimum (min f(x)) stored in state.

Metaheuristics.termination_status_message (Function)
termination_status_message(status)

Return a string of the message related to the status.

See also TerminationStatusCode.

Example:

julia> termination_status_message(Metaheuristics.ITERATION_LIMIT)
"Maximum number of iterations exceeded."

julia> termination_status_message(optimize(f, bounds))
"Maximum number of iterations exceeded."

julia> termination_status_message(ECA())
"Unknown stop reason."

Methods for Solutions/Individuals

Metaheuristics.fvals (Function)
fvals(state)

If state.population has N solutions, then return a Vector with the N objective function values of the items in state.population.
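
A minimal sketch (state taken from any prior run, such as the examples above):

julia> fs = fvals(state); # one objective value per individual

julia> length(fs) == length(state.population)
true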

Metaheuristics.create_child (Function)
Metaheuristics.create_child(x, fx)

Constructor for a solution depending on the result of fx.

Example

julia> import Metaheuristics

julia> Metaheuristics.create_child(rand(3), 1.0)
| f(x) = 1
| solution.x = [0.2700437125780806, 0.5233263210622989, 0.12871108215859772]

julia> Metaheuristics.create_child(rand(3), (1.0, [2.0, 0.2], [3.0, 0.3]))
| f(x) = 1
| g(x) = [2.0, 0.2]
| h(x) = [3.0, 0.3]
| x = [0.9881102595664819, 0.4816273348099591, 0.7742585077942159]

julia> Metaheuristics.create_child(rand(3), ([-1, -2.0], [2.0, 0.2], [3.0, 0.3]))
| f(x) = [-1.0, -2.0]
| g(x) = [2.0, 0.2]
| h(x) = [3.0, 0.3]
| x = [0.23983577719146854, 0.3611544510766811, 0.7998754930109109]

julia> population = [ Metaheuristics.create_child(rand(2), (randn(2),  randn(2), rand(2))) for i = 1:100  ]
                           F space
          ┌────────────────────────────────────────┐ 
        2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ 
          │⠀⠀⠀⠀⠀⠀⠀⠀⠄⠀⠀⠂⠀⠀⠀⠀⠀⡇⠈⡀⠂⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ 
          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⢀⠘⠀⡇⠀⠀⠘⠀⠄⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ 
          │⠀⠀⠀⠀⠀⠀⠀⠀⠂⠀⠂⠀⠀⢀⠠⠐⠀⡇⠄⠁⠀⠀⠀⡀⠀⢁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ 
          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠁⠂⢈⠀⠈⡇⠀⡐⠃⠀⠄⠄⠀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡀⠀⠀│ 
          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⡀⠄⢐⠠⠀⡄⠀⠀⡇⠀⠂⠈⠀⠐⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ 
          │⠉⠉⠉⠉⠋⠉⠉⠉⠉⠉⠉⠙⢉⠉⠙⠉⠉⡏⠉⠉⠩⠋⠉⠩⠉⠉⠉⡉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉⠉│ 
   f_2    │⠀⠀⠀⠀⠀⡀⠀⠀⠀⠄⠀⠀⡀⠀⠀⠂⠀⡇⠀⠀⠀⠐⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ 
          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠄⠀⠀⠐⡇⠠⠀⠀⠀⠈⢀⠄⠂⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ 
          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠂⠀⠄⠀⡀⠀⠂⡇⠐⠘⠈⠂⠀⠈⡀⠀⠀⠀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ 
          │⠀⠀⠀⠀⠀⠀⠄⠀⠀⠀⠀⠀⠂⠀⠂⠀⠀⡇⠀⠈⢀⠐⠀⠈⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ 
          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠁⠀⠀⠀⠀⠀⢁⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ 
          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ 
          │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠠⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ 
       -3 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⡇⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│ 
          └────────────────────────────────────────┘ 
          -3                                       4
                             f_1
Metaheuristics.fval (Function)
fval(solution)

Get the objective function value (fitness) of a solution.

fval(solution)

Get the fitness of a bee when optimizing using the ABC algorithm.

Metaheuristics.is_better (Function)
is_better(A, B)

Return true if A is better than B in a minimization problem. Feasibility rules and dominance criteria are used in the comparison.
Metaheuristics.compare (Function)
compare(a, b)

Compare two objective vectors using Pareto dominance.

Output:

  • 1 if argument 1 (a) dominates argument 2 (b).
  • 2 if argument 2 (b) dominates argument 1 (a).
  • 3 if arguments 1 (a) and 2 (b) are incomparable.
  • 0 if arguments 1 (a) and 2 (b) are equal.
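
A quick sketch of the return codes on minimized objective vectors:

julia> Metaheuristics.compare([1.0, 2.0], [2.0, 3.0]) # first dominates
1

julia> Metaheuristics.compare([1.0, 3.0], [2.0, 2.0]) # incomparable
3

julia> Metaheuristics.compare([1.0, 2.0], [1.0, 2.0]) # equal
0
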
Metaheuristics.gen_initial_state (Function)
gen_initial_state(problem,parameters,information,options)

Generate an initial state, i.e., compute uniformly distributed random vectors in bounds, which are then evaluated with the objective function. This method requires parameters.N to be a valid attribute.

Metaheuristics.set_user_solutions! (Function)
set_user_solutions!(optimizer, x, fx; verbose=true)

Provide initial solutions to the optimizer.

  • x can be a Vector and fx a function or the value fx = f(x).
  • x can be a matrix containing solutions in rows.

Example

julia> f(x) = abs(x[1]) + x[2]  + x[3]^2 # objective function
f (generic function with 1 method)

julia> algo  = ECA(N = 61); # optimizer

julia> # one solution can be provided
       x0 = [0.5, 0.5, 0.5];

julia> set_user_solutions!(algo, x0, f);

julia> # providing multiple solutions
       X0 = rand(30, 3); # 30 solutions with dim 3

julia> set_user_solutions!(algo, X0, f);

julia> optimize(f, [0 0 0; 1 1 1.0], algo)
+=========== RESULT ==========+
  iteration: 413
    minimum: 0
  minimizer: [0.0, 0.0, 0.0]
    f calls: 25132
 total time: 0.0856 s
stop reason: Small difference of objective function values.
+============================+

Variation

Metaheuristics.ECA_operator (Function)
ECA_operator(population, K, η_max)

Compute a solution using the ECA variation operator; K is the number of solutions used to calculate the center of mass, and η_max is the maximum stepsize.

Metaheuristics.DE_crossover (Function)
DE_crossover(x, u, CR)

Binomial crossover between x and u for Differential Evolution with probability CR, i.e., v[j] = u[j] if rand() < CR, otherwise v[j] = x[j]. Return v.
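
A minimal sketch of the rule above:

julia> x, u = [1.0, 2.0, 3.0], [9.0, 8.0, 7.0];

julia> v = Metaheuristics.DE_crossover(x, u, 0.5);

julia> all(v[j] == x[j] || v[j] == u[j] for j in 1:3) # each entry comes from x or u
true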

Metaheuristics.DE_mutation (Function)
DE_mutation(population, F = 1.0, strategy = :rand1)

Generate a Vector computed from population, as used in Differential Evolution. Parameters: F is the stepsize; strategy can be one of the following: :rand1, :best1, :rand2, :randToBest1 or :best2.
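
A sketch using a small population built with create_child (documented above):

julia> population = [Metaheuristics.create_child(rand(3), rand()) for _ in 1:10];

julia> v = Metaheuristics.DE_mutation(population, 0.8, :rand1); # 3-element Vector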

Metaheuristics.MOEAD_DE_reproduction (Function)
MOEAD_DE_reproduction(a, b, c, F, CR, p_m, η, bounds)

Perform Differential Evolution operators and polynomial mutation using three vectors a, b, c and parameters F, CR, p_m, η, i.e., stepsize, crossover probability, mutation probability, and distribution index.

Metaheuristics.GA_reproduction (Function)
GA_reproduction(pa::AbstractVector{T},
                pb::AbstractVector{T},
                bounds;
                η_cr = 20,
                η_m  = 15,
                p_cr = 0.9,
                p_m  = 0.1)

Create two solutions by applying SBX to parents pa and pb and polynomial mutation to the offspring. Return two vectors.
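
A minimal sketch with two random parents in [0, 1]³ (bounds built as a 2×n matrix, as elsewhere on this page):

julia> bounds = [zeros(3)'; ones(3)'];

julia> c1, c2 = Metaheuristics.GA_reproduction(rand(3), rand(3), bounds);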

Metaheuristics.GA_reproduction_half (Function)
GA_reproduction_half(pa::AbstractVector{T},
                pb::AbstractVector{T},
                bounds;
                η_cr = 20,
                η_m  = 15,
                p_cr = 0.9,
                p_m  = 0.1)

Same as GA_reproduction, but returns only one offspring.

Metaheuristics.SlightMutation (Type)
SlightMutation

Fogel, D. B. (1988). An evolutionary approach to the traveling salesman problem. Biological Cybernetics, 60(2), 139-144.


Population

Metaheuristics.nadir (Function)
nadir(points)

Computes the nadir point from a provided array of Vectors, a population, or the row vectors of a Matrix.

Metaheuristics.ideal (Function)
ideal(points)

Computes the ideal point from a provided array of Vectors, a population, or the row vectors of a Matrix.
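
Both accept the same inputs; a quick sketch on a small set of mutually non-dominated objective vectors (componentwise extremes):

julia> points = [[1.0, 4.0], [2.0, 3.0], [3.0, 1.0]];

julia> Metaheuristics.ideal(points)
2-element Vector{Float64}:
 1.0
 1.0

julia> Metaheuristics.nadir(points)
2-element Vector{Float64}:
 3.0
 4.0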

Metaheuristics.get_fronts (Function)
get_fronts(population, computed_ranks = true)

Return each sub-front in an array. If computed_ranks == true, this method assumes that fast_non_dominated_sort!(population) has been called before.


Stopping Criteria

Metaheuristics.diff_check (Function)
diff_check(status, information, options; d = options.f_tol, p = 0.5)

Check the difference between the best and worst objective function values in the current population (where at least a fraction p of the solutions is feasible). Return true when such difference is <= d, otherwise return false.

Ref. Zielinski, K., & Laur, R. (n.d.). Stopping Criteria for Differential Evolution in Constrained Single-Objective Optimization. Studies in Computational Intelligence, 111–138. doi:10.1007/978-3-540-68830-3_4 (https://doi.org/10.1007/978-3-540-68830-3_4)

Metaheuristics.accuracy_stop_check (Function)
accuracy_stop_check(status, information, options)

If the optimum is provided, then check if the accuracy is met via abs(status.best_sol.f - information.f_optimum) < options.f_tol.


Sampling

Metaheuristics.sample (Function)
sample(method, [bounds])

Return a matrix whose rows are samples generated using method (real representation) in the inclusive interval [0, 1], or within bounds if provided. Here, method can be LatinHypercubeSampling, Grid or RandomInBounds.

Example

julia> sample(LatinHypercubeSampling(10,2))
10×2 Matrix{Float64}:
 0.0705631  0.795046
 0.7127     0.0443734
 0.118018   0.114347
 0.48839    0.903396
 0.342403   0.470998
 0.606461   0.275709
 0.880482   0.89515
 0.206142   0.321041
 0.963978   0.527518
 0.525742   0.600209

julia> sample(LatinHypercubeSampling(10,2), [-10 -10;10 10.0])
10×2 Matrix{Float64}:
 -7.81644   -2.34461
  0.505902   0.749366
  3.90738   -8.57816
 -2.05837    9.803
  5.62434    6.82463
 -9.34437    2.72363
  6.43987   -1.74596
 -1.3162    -4.50273
  9.45114   -7.13632
 -4.71696    5.0381
Metaheuristics.LatinHypercubeSampling (Type)
LatinHypercubeSampling(nsamples, dim; iterations)

Create nsamples solutions within a Latin hypercube sample in dim dimensions.

See sample above for a usage example.
Metaheuristics.Grid (Type)
Grid(npartitions, dim)

Parameters to generate a grid with npartitions partitions per dimension in a space with dim dimensions.

Example

julia> sample(Grid(5,2))
25×2 Matrix{Float64}:
 0.0   0.0
 0.25  0.0
 0.5   0.0
 0.75  0.0
 ⋮     
 0.5   1.0
 0.75  1.0
 1.0   1.0

julia> sample(Grid(5,2), [-1 -1; 1 1.])
25×2 Matrix{Float64}:
 -1.0  -1.0
 -0.5  -1.0
  0.0  -1.0
  0.5  -1.0
  ⋮    
  0.0   1.0
  0.5   1.0
  1.0   1.0

Note that the sample size is npartitions^dim.
