Single-objective

Evolutionary Centers Algorithm

ECA was proposed for solving global optimization problems. See (Mejía-de-Dios and Mezura-Montes, 2019) for more information.

Metaheuristics.ECA - Type
ECA(;
    η_max = 2.0,
    K = 7,
    N = 0,
    N_init = N,
    p_exploit = 0.95,
    p_bin = 0.02,
    p_cr = Float64[],
    adaptive = false,
    resize_population = false,
    information = Information(),
    options = Options()
)

Parameters for the metaheuristic ECA: η_max is the step size, K is the number of vectors used to generate the center of mass, and N is the population size.

Example

julia> f(x) = sum(x.^2)
f (generic function with 1 method)

julia> optimize(f, [-1 -1 -1; 1 1 1.0], ECA())
+=========== RESULT ==========+
  iteration: 1429
    minimum: 3.3152400000000004e-223
  minimizer: [4.213750597785841e-113, 5.290977430907081e-112, 2.231685329262638e-112]
    f calls: 29989
 total time: 0.1672 s
+============================+

julia> optimize(f, [-1 -1 -1; 1 1 1.0], ECA(N = 10, η_max = 1.0, K = 3))
+=========== RESULT ==========+
  iteration: 3000
    minimum: 0.000571319
  minimizer: [-0.00017150889316537758, -0.007955828028420616, 0.022538733289139145]
    f calls: 30000
 total time: 0.1334 s
+============================+
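
Every method, including ECA, also accepts the information and options keyword arguments shown in the signature above, which control stopping criteria and reproducibility. A minimal sketch, assuming Options supports the seed and f_calls_limit keywords:

using Metaheuristics

f(x) = sum(x .^ 2)
bounds = [-1 -1 -1; 1 1 1.0]

# fix the seed for reproducibility and stop after 10_000 objective evaluations
# (seed and f_calls_limit are assumed Options keywords)
eca = ECA(N = 30, options = Options(seed = 1, f_calls_limit = 10_000))

result = optimize(f, bounds, eca)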
source

Differential Evolution

DE is an evolutionary algorithm based on vector differences. See (Price, 2013) for more details.

Metaheuristics.DE - Type
DE(;
    N  = 0,
    F  = 1.0,
    CR = 0.5,
    strategy = :rand1,
    information = Information(),
    options = Options()
)

Parameters for the Differential Evolution (DE) algorithm: F is the step size, CR controls the binomial crossover, and N is the population size. The parameter strategy selects the variation operator (:rand1, :rand2, :best1, :best2, :randToBest1).

Example

julia> f(x) = sum(x.^2)
f (generic function with 1 method)

julia> optimize(f, [-1 -1 -1; 1 1 1.0], DE())
+=========== RESULT ==========+
  iteration: 1000
    minimum: 0
  minimizer: [0.0, 0.0, 0.0]
    f calls: 30000
 total time: 0.0437 s
+============================+

julia> optimize(f, [-1 -1 -1; 1 1 1.0], DE(N=50, F=1.5, CR=0.8))
+=========== RESULT ==========+
  iteration: 600
    minimum: 8.68798e-25
  minimizer: [3.2777877981303293e-13, 3.7650459509488005e-13, -7.871487597385812e-13]
    f calls: 30000
 total time: 0.0319 s
+============================+
source

Particle Swarm Optimization

PSO is a population-based optimization technique inspired by the motion of bird flocks and schooling fish, proposed by (Kennedy and Eberhart, 1995).

Metaheuristics.PSO - Type
PSO(;
    N  = 0,
    C1 = 2.0,
    C2 = 2.0,
    ω  = 0.8,
    information = Information(),
    options = Options()
)

Parameters for the Particle Swarm Optimization (PSO) algorithm: C1 and C2 are the learning rates, N is the population size, and ω controls the inertia weight.

Example

julia> f(x) = sum(x.^2)
f (generic function with 1 method)

julia> optimize(f, [-1 -1 -1; 1 1 1.0], PSO())
+=========== RESULT ==========+
  iteration: 1000
    minimum: 1.40522e-49
  minimizer: [3.0325415595139883e-25, 1.9862212295897505e-25, 9.543772256546461e-26]
    f calls: 30000
 total time: 0.1558 s
+============================+

julia> optimize(f, [-1 -1 -1; 1 1 1.0], PSO(N = 100, C1=1.5, C2=1.5, ω = 0.7))
+=========== RESULT ==========+
  iteration: 300
    minimum: 2.46164e-39
  minimizer: [-3.055334698085433e-20, -8.666986835846171e-21, -3.8118413472544027e-20]
    f calls: 30000
 total time: 0.1365 s
+============================+
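
The best objective value and solution found can be extracted from the returned result object. A minimal sketch, assuming the minimum and minimizer accessors provided by Metaheuristics:

using Metaheuristics

f(x) = sum(x .^ 2)
result = optimize(f, [-1 -1 -1; 1 1 1.0], PSO(N = 100, C1 = 1.5, C2 = 1.5, ω = 0.7))

best_f = minimum(result)    # best objective value found
best_x = minimizer(result)  # corresponding solution vector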
source

Artificial Bee Colony

A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm by (Karaboga and Basturk, 2007).

Metaheuristics.ABC - Type
ABC(;
    N = 50,
    Ne = div(N+1, 2),
    No = div(N+1, 2),
    limit=10,
    information = Information(),
    options = Options()
)

ABC implements the original parameters for the Artificial Bee Colony Algorithm. N is the population size, Ne is the number of employed bees, and No is the number of onlooker bees. limit is the maximum number of times a solution may be visited without improvement before it is abandoned.

Example

julia> f(x) = sum(x.^2)
f (generic function with 1 method)

julia> optimize(f, [-1 -1 -1; 1 1 1.0], ABC())
+=========== RESULT ==========+
  iteration: 595
    minimum: 4.03152e-28
  minimizer: [1.489845115451046e-14, 1.2207275971717747e-14, -5.671872444705246e-15]
    f calls: 30020
 total time: 0.0360 s
+============================+

julia> optimize(f, [-1 -1 -1; 1 1 1.0], ABC(N = 80,  No = 20, Ne = 50, limit=5))
+=========== RESULT ==========+
  iteration: 407
    minimum: 8.94719e-08
  minimizer: [8.257485723496422e-5, 0.0002852795196258074, -3.5620824723352315e-5]
    f calls: 30039
 total time: 0.0432 s
+============================+
source

Gravitational Search Algorithm

Chaotic gravitational constants for the gravitational search algorithm by (Mirjalili and Gandomi, 2017).

Metaheuristics.CGSA - Type
CGSA(;
    N::Int    = 30,
    chValueInitial::Real   = 20,
    chaosIndex::Real   = 9,
    ElitistCheck::Int    = 1,
    Rpower::Int    = 1,
    Rnorm::Int    = 2,
    wMax::Real   = chValueInitial,
    wMin::Real   = 1e-10,
    information = Information(),
    options = Options()
)

CGSA is an extension of the GSA algorithm that uses chaotic gravitational constants.

Ref. Chaotic gravitational constants for the gravitational search algorithm. Applied Soft Computing 53 (2017): 407-419.

Parameters:

  • N: Population size
  • chValueInitial: Initial value of the chaotic constant
  • chaosIndex: Integer 1 ≤ chaosIndex ≤ 10 selecting the chaotic map that models the chaos
  • Rpower: power used in the distance term, norm(x)^Rpower
  • Rnorm: the norm order, as in norm(x, Rnorm)

Example

julia> f(x) = sum(x.^2)
f (generic function with 1 method)

julia> optimize(f, [-1 -1 -1; 1 1 1.0], CGSA())
+=========== RESULT ==========+
  iteration: 500
    minimum: 8.63808e-08
  minimizer: [0.0002658098418993323, -1.140808975532608e-5, -0.00012488307670533095]
    f calls: 15000
 total time: 0.1556 s
+============================+

julia> optimize(f, [-1 -1 -1; 1 1 1.0], CGSA(N = 80, chaosIndex = 1))
+=========== RESULT ==========+
  iteration: 500
    minimum: 1.0153e-09
  minimizer: [-8.8507563788141e-6, -1.3050111801923072e-5, 2.7688577445980026e-5]
    f calls: 40000
 total time: 1.0323 s
+============================+
source

Simulated Annealing

Physics-inspired algorithm for optimization by (Van L. and Aarts, 1987).

Metaheuristics.SA - Type
    SA(;
        x_initial::Vector = zeros(0),
        N::Int = 500,
        tol_fun::Real= 1e-4,
        information = Information(),
        options = Options()
    )

Parameters for the method of Simulated Annealing (Kirkpatrick et al., 1983).

Parameters:

  • x_initial: Initial solution. If empty, SA generates a random one within the bounds.
  • N: The number of test points per iteration.
  • tol_fun: Tolerance value for the Metropolis condition to accept or reject the test point as the current point.

Example

julia> f(x) = sum(x.^2)
f (generic function with 1 method)

julia> optimize(f, [-1 -1 -1; 1 1 1.0], SA())
+=========== RESULT ==========+
  iteration: 60
    minimum: 5.0787e-68
  minimizer: [-2.2522059499734615e-34, 3.816133503985569e-36, 6.934348004465088e-36]
    f calls: 29002
 total time: 0.0943 s
+============================+

julia> optimize(f, [-1 -1 -1; 1 1 1.0], SA(N = 100, x_initial = [1, 0.5, -1]))
+=========== RESULT ==========+
  iteration: 300
    minimum: 1.99651e-69
  minimizer: [4.4638292404181215e-35, -1.738939846089388e-36, -9.542441152683457e-37]
    f calls: 29802
 total time: 0.0965 s
+============================+
source

BCA

Bilevel Centers Algorithm has been proposed to solve bilevel optimization problems. See BilevelHeuristics.BCA for details.

MCCGA

Machine-coded Compact Genetic Algorithms for real-valued optimization problems by (Satman and Akadal, 2020).

Metaheuristics.MCCGA - Type
MCCGA(;N, maxsamples)

Parameters:

  • N: population size. Default is 100.
  • maxsamples: maximum number of samples. Default is 10000.

Description

The MCCGA method implements a machine-coded compact genetic algorithm for real-valued optimization problems. The algorithm is based on the compact genetic algorithm (CGA), but uses a machine-coded representation following the IEEE 754 floating-point encoding standard. In the first stage of the algorithm, maxsamples samples are generated within the function domain. This stage is required to obtain a vector of probabilities for each single bit of the IEEE 754 representation. In classical CGAs, the initial vector of probabilities is generated using the constant probability 0.5, whereas in MCCGA, the probability of the i-th bit having the value 1 depends on the function domain. The second stage performs a CGA search, again on the IEEE 754 bits. Since the CGA does not maintain a population of solutions but a single vector of probabilities, the parameter N does not mean the number of solutions; instead, it determines the amount of mutation in each iteration, i.e. 1/N. In the last stage, a local search is performed for fine-tuning; in this implementation, the Hooke & Jeeves direct search algorithm is used.
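
To make the probability-vector mechanism concrete, the following is a minimal, self-contained sketch of a generic compact-GA update on a plain bit string. It is illustrative only and is not the MCCGA implementation, which additionally works on IEEE 754 bit patterns and seeds the probabilities from domain samples:

# Toy compact GA: maximize the number of ones in a 16-bit string.
nbits, N = 16, 50
p = fill(0.5, nbits)          # probability that each bit equals 1
fitness(bits) = count(bits)   # one-max objective

for _ in 1:2000
    a = rand(nbits) .< p      # sample two candidate bit strings
    b = rand(nbits) .< p
    winner, loser = fitness(a) >= fitness(b) ? (a, b) : (b, a)
    # shift each probability toward the winner's bit by 1/N where they differ
    p .+= (winner .- loser) ./ N
    p .= clamp.(p, 0.0, 1.0)
end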

References

  • Moser, Irene. "Hooke-Jeeves Revisited." 2009 IEEE Congress on Evolutionary Computation. IEEE, 2009.
  • Harik, G. R., Lobo, F. G., & Goldberg, D. E. (1999). The compact genetic algorithm. IEEE transactions on evolutionary computation, 3(4), 287-297.
  • Satman, M. H. & Akadal, E. (2020). Machine Coded Compact Genetic Algorithms for Real Parameter Optimization Problems . Alphanumeric Journal , 8 (1) , 43-58 . DOI: 10.17093/alphanumeric.576919
  • Mehmet Hakan Satman, Emre Akadal, Makine Kodlu Hibrit Kompakt Genetik Algoritmalar Optimizasyon Yöntemi, Patent, TR, 2022/01, 2018-GE-510239

Example


julia> f, bounds, solutions = Metaheuristics.TestProblems.rastrigin();

julia> result = optimize(f, bounds, MCCGA())

+=========== RESULT ==========+
  iteration: 42833
    minimum: 0
  minimizer: [-5.677669786379456e-17, 2.7942451898022582e-39, -3.60925916059986e-33, -6.609510861086017e-34, 2.998586759675011e-32, -1.8825832500775007e-38, -3.0729484147585938e-31, 1.7675578057632446e-38, 5.127944874215823e-16, 1.9623770480857484e-19]
    f calls: 86404
 total time: 3.2839 s
+============================+

Explicit Example


julia> using Metaheuristics

julia> f(x::Vector{Float64})::Float64 = (x[1]-pi)^2 + (x[2]-exp(1))^2

julia> bounds = [ -500.0  -500.0;
                   500.0   500.0]

julia> result = Metaheuristics.optimize(f, bounds, MCCGA())

+=========== RESULT ==========+
  iteration: 2974
    minimum: 1.80997e-09
  minimizer: [3.1416284249228976, 2.7183048585824263]
    f calls: 6012
 total time: 1.5233 s
stop reason: Other stopping criteria.
+============================+
source

GA

Metaheuristics.GA - Type
GA(;
    N = 100,
    p_mutation  = 1e-5,
    p_crossover = 0.5,
    initializer = RandomInBounds(),
    selection   = TournamentSelection(),
    crossover   = UniformCrossover(),
    mutation    = BitFlipMutation(),
    environmental_selection = ElitistReplacement()
    )

Just another Genetic Algorithm Framework.

Parameters:

  • N is the population size.
  • p_mutation mutation probability (gen).
  • p_crossover crossover probability (gen).
  • initializer parameters for the population initializer.
  • selection parameters for the selection operator.
  • crossover parameters for the crossover operators.
  • mutation parameters for the mutation operator.
  • environmental_selection parameters for the replacement method.

Example: Binary Encoding (default)

julia> f(x) = sum(x) / length(x)
f (generic function with 1 method)

julia> dim = 10;

julia> optimize(f, BitArraySpace(dim), GA())
+=========== RESULT ==========+
  iteration: 500
    minimum: 0
  minimizer: Bool[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
    f calls: 50000
 total time: 0.0523 s
stop reason: Maximum number of iterations exceeded.
+============================+

Example: Permutation Encoding

julia> f(x) = sum(abs.(x .- (length(x):-1:1.0)))
f (generic function with 1 method)

julia> perm_size = 10;

julia> ga = GA(;initializer = RandomPermutation(N=100), crossover=OrderCrossover(), mutation=SlightMutation());

julia> optimize(f, PermutationSpace(perm_size), ga)
+=========== RESULT ==========+
  iteration: 500
    minimum: 0
  minimizer: [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]
    f calls: 49900
 total time: 0.6230 s
stop reason: Maximum number of iterations exceeded.
+============================+

Example: Integer Encoding

julia> c = randn(10);

julia> A = randn(10, 10);

julia> f(x) = c'*x, A*x, [0.0]
f (generic function with 1 method)

julia> bounds = repeat([0,100], 1, 10)
2×10 Matrix{Int64}:
   0    0    0    0    0    0    0    0    0    0
 100  100  100  100  100  100  100  100  100  100

julia> ga = GA(;crossover=SBX(;bounds),
                mutation=PolynomialMutation(;bounds),
                environmental_selection=GenerationalReplacement()
            );

julia> result = optimize(f, bounds, ga)
+=========== RESULT ==========+
  iteration: 500
    minimum: -76.9031
  minimizer: [0, 10, 49, 100, 24, 0, 0, 0, 67, 76]
    f calls: 50000
  feasibles: 95 / 100 in final population
 total time: 0.6028 s
stop reason: Maximum number of iterations exceeded.
+============================+

Example: Real Encoding

julia> f, bounds, solutions = Metaheuristics.TestProblems.rastrigin();

julia> ga = GA(;mutation=PolynomialMutation(;bounds),
                crossover=SBX(;bounds),
                environmental_selection=GenerationalReplacement()
               );

julia> optimize(f, bounds, ga)
+=========== RESULT ==========+
  iteration: 500
    minimum: 5.91136e-09
  minimizer: [-5.446073166732363e-6, -1.7900876625850504e-7, 2.8548505431323723e-8, 2.1130514595980084e-8, -1.632381298278562e-8, 2.8216219650016283e-9, -3.2114953600427333e-7, 2.4222648522114125e-9, -5.5236545829928716e-9, -2.3479408274628334e-9]
    f calls: 49900
 total time: 0.5775 s
stop reason: Maximum number of iterations exceeded.
+============================+
source

εDE

ε Constrained Differential Evolution with Gradient-Based Mutation and Feasible Elites by (Takahama and Sakai, 2006).

Gradient mutation

Gradient mutation is not implemented here.

Metaheuristics.εDE - Type
εDE(cp = 5, DE_kargs...)
epsilonDE(cp = 5, DE_kargs...)

Parameters for ε Differential Evolution for constrained optimization.

See DE for more details about DE parameters (DE_kargs).

This implementation does not include the gradient-based repair method.
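
Since εDE accepts the same keyword arguments as DE, a constrained problem is set up in the usual way: the objective returns the tuple (f, g, h) of objective value, inequality constraints (g ≤ 0), and equality constraints, as in the GA integer-encoding example above. A minimal sketch with illustrative parameter values:

using Metaheuristics

# minimize sum(x.^2) subject to x[1] + x[2] >= 1,
# written as the inequality constraint g(x) = 1 - x[1] - x[2] <= 0
f(x) = sum(x .^ 2), [1.0 - x[1] - x[2]], [0.0]

bounds = [-5.0 -5.0; 5.0 5.0]

result = optimize(f, bounds, εDE(N = 50, F = 0.8, CR = 0.9))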

source