Algorithms
List of implemented metaheuristics. Each algorithm was implemented based on the contributors' understanding of the method described in the corresponding published paper.
Algorithm | Objective | Constraints | Large Scale | Batch Evaluation | Structure Name |
---|---|---|---|---|---|
ECA | Single | ✅ | ➖ | ✅ | ECA |
DE | Single | ✅ | ➖ | ✅ | DE |
PSO | Single | ✅ | ➖ | ✅ | PSO |
ABC | Single | ❌ | ➖ | ❌ | ABC |
MOEA/D-DE | Multi | ➖ | ➖ | ❌ | MOEAD_DE |
GSA | Single | ❌ | ❌ | ✅ | CGSA |
SA | Single | ✅ | ➖ | ❌ | SA |
WOA | Single | ✅ | ➖ | ✅ | WOA |
NSGA-II | Multi | ✅ | ➖ | ✅ | NSGA2 |
NSGA-III | Many | ✅ | ➖ | ✅ | NSGA3 |
SMS-EMOA | Multi | ✅ | ➖ | ✅ | SMS_EMOA |
SPEA2 | Multi | ✅ | ➖ | ✅ | SPEA2 |
BCA | Bilevel | ✅ | ❌ | ❌ | BCA |
MCCGA | Single | ❌ | ❌ | ❌ | MCCGA |
GA | Single | ✅ | ➖ | ✅ | GA |
CCMO | Multi | ✅ | ➖ | ✅ | CCMO |
$\varepsilon$DE | Single | ✅ | ➖ | ✅ | εDE |
BRKGA | Single | ✅ | ➖ | ✅ | BRKGA |
✅ = supported, ❌ = not supported, ➖ = can be supported by changing default parameters.
- Batch Evaluation = Simultaneous evaluation of multiple solutions (batch); see "Batch Evaluation".
- Constraints = Equality and inequality constraints.
- Large Scale = High-dimensional problems (decision variable space).
Evolutionary Centers Algorithm
ECA was proposed for solving global optimization problems. See [1] for more information.
Metaheuristics.ECA — Type

ECA(;
η_max = 2.0,
K = 7,
N = 0,
N_init = N,
p_exploit = 0.95,
p_bin = 0.02,
p_cr = Float64[],
adaptive = false,
resize_population = false,
information = Information(),
options = Options()
)
Parameters for the metaheuristic ECA: η_max is the step size, K is the number of vectors used to generate the center of mass, and N is the population size.
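The center of mass is built from K solutions weighted by their quality; a rough, illustrative sketch of that idea (the exact weighting and update rule are given in [1], not reproduced here):

# Rough sketch of a fitness-weighted center of mass over K solutions (illustrative only;
# see [1] for the exact weighting used by ECA). X is a vector of K decision vectors,
# fitness the corresponding objective values (minimization).
function center_of_mass(X, fitness)
    w = maximum(fitness) .- fitness .+ eps()  # smaller objective => larger weight
    sum(w .* X) / sum(w)
end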
Example
julia> f(x) = sum(x.^2)
f (generic function with 1 method)
julia> optimize(f, [-1 -1 -1; 1 1 1.0], ECA())
+=========== RESULT ==========+
iteration: 1429
minimum: 3.3152400000000004e-223
minimizer: [4.213750597785841e-113, 5.290977430907081e-112, 2.231685329262638e-112]
f calls: 29989
total time: 0.1672 s
+============================+
julia> optimize(f, [-1 -1 -1; 1 1 1.0], ECA(N = 10, η_max = 1.0, K = 3))
+=========== RESULT ==========+
iteration: 3000
minimum: 0.000571319
minimizer: [-0.00017150889316537758, -0.007955828028420616, 0.022538733289139145]
f calls: 30000
total time: 0.1334 s
+============================+
Differential Evolution
DE is an evolutionary algorithm based on vector differences. See [2] for more details.
Metaheuristics.DE — Type

DE(;
N = 0,
F = 1.0,
CR = 0.5,
strategy = :rand1,
information = Information(),
options = Options()
)
Parameters for the Differential Evolution (DE) algorithm: F is the step size (differential weight), CR controls the binomial crossover, and N is the population size. The strategy parameter selects the variation operator (:rand1, :rand2, :best1, :best2, :randToBest1).
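For reference, F and CR enter the classical rand/1/bin variation; an illustrative sketch of that operator (not the package's internal code):

# Illustrative sketch of the rand/1/bin operator (not the package's internal code).
# x is the target vector; x_r1, x_r2, x_r3 are three distinct population members.
function rand1_bin(x, x_r1, x_r2, x_r3, F, CR)
    D = length(x)
    j_rand = rand(1:D)               # at least one component comes from the mutant
    v = x_r1 .+ F .* (x_r2 .- x_r3)  # differential mutation (:rand1)
    [(rand() < CR || j == j_rand) ? v[j] : x[j] for j in 1:D]  # binomial crossover
end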
Example
julia> f(x) = sum(x.^2)
f (generic function with 1 method)
julia> optimize(f, [-1 -1 -1; 1 1 1.0], DE())
+=========== RESULT ==========+
iteration: 1000
minimum: 0
minimizer: [0.0, 0.0, 0.0]
f calls: 30000
total time: 0.0437 s
+============================+
julia> optimize(f, [-1 -1 -1; 1 1 1.0], DE(N=50, F=1.5, CR=0.8))
+=========== RESULT ==========+
iteration: 600
minimum: 8.68798e-25
minimizer: [3.2777877981303293e-13, 3.7650459509488005e-13, -7.871487597385812e-13]
f calls: 30000
total time: 0.0319 s
+============================+
Particle Swarm Optimization
PSO is a population-based optimization technique inspired by the motion of bird flocks and schooling fish, proposed in [3].
Metaheuristics.PSO — Type

PSO(;
N = 0,
C1 = 2.0,
C2 = 2.0,
ω = 0.8,
information = Information(),
options = Options()
)
Parameters for the Particle Swarm Optimization (PSO) algorithm: C1 and C2 are the learning rates (acceleration coefficients), N is the population size, and ω is the inertia weight.
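For reference, these parameters appear in the canonical velocity update; a conceptual sketch (the package's internal implementation may differ in details):

# Canonical PSO update (conceptual sketch, not the package's internal code).
# x, v: particle position and velocity; p_best: its best position; g_best: swarm best.
function pso_step(x, v, p_best, g_best, ω, C1, C2)
    r1, r2 = rand(length(x)), rand(length(x))
    v_new = ω .* v .+ C1 .* r1 .* (p_best .- x) .+ C2 .* r2 .* (g_best .- x)
    x .+ v_new, v_new
end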
Example
julia> f(x) = sum(x.^2)
f (generic function with 1 method)
julia> optimize(f, [-1 -1 -1; 1 1 1.0], PSO())
+=========== RESULT ==========+
iteration: 1000
minimum: 1.40522e-49
minimizer: [3.0325415595139883e-25, 1.9862212295897505e-25, 9.543772256546461e-26]
f calls: 30000
total time: 0.1558 s
+============================+
julia> optimize(f, [-1 -1 -1; 1 1 1.0], PSO(N = 100, C1=1.5, C2=1.5, ω = 0.7))
+=========== RESULT ==========+
iteration: 300
minimum: 2.46164e-39
minimizer: [-3.055334698085433e-20, -8.666986835846171e-21, -3.8118413472544027e-20]
f calls: 30000
total time: 0.1365 s
+============================+
Artificial Bee Colony
A powerful and efficient algorithm for numerical function optimization: artificial bee colony (ABC) algorithm by [4].
Metaheuristics.ABC — Type

ABC(;
N = 50,
Ne = div(N+1, 2),
No = div(N+1, 2),
limit=10,
information = Information(),
options = Options()
)
ABC implements the original parameters for the Artificial Bee Colony algorithm. N is the population size, Ne is the number of employed bees, and No is the number of onlooker bees. limit is the maximum number of trials after which a non-improving food source is abandoned.
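For reference, employed and onlooker bees generate candidate food sources with the classical neighbour-search rule; an illustrative sketch (not the package's internal code):

# Classical ABC neighbour search (illustrative sketch, not the package's internal code).
# x is the current food source, x_k a randomly chosen different one, j a random dimension.
function abc_candidate(x, x_k)
    v = copy(x)
    j = rand(1:length(x))
    φ = 2rand() - 1                  # uniform in [-1, 1]
    v[j] = x[j] + φ * (x[j] - x_k[j])
    v
end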
Example
julia> f(x) = sum(x.^2)
f (generic function with 1 method)
julia> optimize(f, [-1 -1 -1; 1 1 1.0], ABC())
+=========== RESULT ==========+
iteration: 595
minimum: 4.03152e-28
minimizer: [1.489845115451046e-14, 1.2207275971717747e-14, -5.671872444705246e-15]
f calls: 30020
total time: 0.0360 s
+============================+
julia> optimize(f, [-1 -1 -1; 1 1 1.0], ABC(N = 80, No = 20, Ne = 50, limit=5))
+=========== RESULT ==========+
iteration: 407
minimum: 8.94719e-08
minimizer: [8.257485723496422e-5, 0.0002852795196258074, -3.5620824723352315e-5]
f calls: 30039
total time: 0.0432 s
+============================+
MOEA/D-DE
Multiobjective optimization problems with complicated Pareto sets by [5].
Metaheuristics.MOEAD_DE — Type

MOEAD_DE(weights;
F = 0.5,
CR = 1.0,
λ = Array{Vector{Float64}}[], # ref. points
η = 20,
p_m = -1.0,
T = round(Int, 0.2*length(weights)),
δ = 0.9,
n_r = round(Int, 0.02*length(weights)),
z = zeros(0),
B = Array{Int}[],
s1 = 0.01,
s2 = 20.0,
information = Information(),
options = Options())
MOEAD_DE implements the original version of MOEA/D-DE. It uses the constraint handling method based on the sum of violations (for constrained optimization): g(x, λ, z) = max(λ .* abs.(fx - z)) + sum(max.(0, gx)) + sum(abs.(hx))

To use MOEAD_DE, the output from the objective function should be a 3-tuple (f::Vector, g::Vector, h::Vector), where f contains the objective functions, and g and h are the inequality and equality constraints, respectively.

A feasible solution is such that g_i(x) ≤ 0 and h_j(x) = 0.
Ref. Multiobjective Optimization Problems With Complicated Pareto Sets, MOEA/D and NSGA-II; Hui Li and Qingfu Zhang.
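The aggregation above can be transcribed into a small standalone helper (hypothetical helper for illustration; MOEAD_DE computes this internally):

# Hypothetical helper reproducing the aggregation above (illustration only).
# fx: objective values, gx: inequality constraints, hx: equality constraints,
# λ: weight vector, z: reference (ideal) point.
g_aggregated(fx, gx, hx, λ, z) = maximum(λ .* abs.(fx .- z)) + sum(max.(0.0, gx)) + sum(abs.(hx))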
Example
Assume you want to solve the following optimization problem:
Minimize:
f(x) = (x_1, x_2)
subject to:
g(x) = x_1^2 + x_2^2 - 1 ≤ 0
x_1, x_2 ∈ [-1, 1]
A solution can be:

using Metaheuristics

# Dimension
D = 2
# Objective function returning (objectives, inequality constraints, equality constraints)
f(x) = ( x, [sum(x.^2) - 1], [0.0] )
# bounds
bounds = [-1 -1;
           1  1.0]
nobjectives = 2
npartitions = 100
# reference points (Das and Dennis's method)
weights = gen_ref_dirs(nobjectives, npartitions)
# define the parameters
moead_de = MOEAD_DE(weights, options=Options(debug=false, iterations = 250))
# optimize
status_moead = optimize(f, bounds, moead_de)
# show results
display(status_moead)
Gravitational Search Algorithm
Chaotic gravitational constants for the gravitational search algorithm by [6].
Metaheuristics.CGSA — Type

CGSA(;
N::Int = 30,
chValueInitial::Real = 20,
chaosIndex::Real = 9,
ElitistCheck::Int = 1,
Rpower::Int = 1,
Rnorm::Int = 2,
wMax::Real = chValueInitial,
wMin::Real = 1e-10,
information = Information(),
options = Options()
)
CGSA is an extension of the Gravitational Search Algorithm (GSA) that uses chaotic gravitational constants.
Ref. Chaotic gravitational constants for the gravitational search algorithm. Applied Soft Computing 53 (2017): 407-419.
Parameters:
- N: Population size.
- chValueInitial: Initial value of the chaotic term.
- chaosIndex: Integer 1 ≤ chaosIndex ≤ 10 selecting the chaotic map.
- Rpower: Power used in the distance term, norm(x)^Rpower.
- Rnorm: Norm order, as in norm(x, Rnorm).
Example
julia> f(x) = sum(x.^2)
f (generic function with 1 method)
julia> optimize(f, [-1 -1 -1; 1 1 1.0], CGSA())
+=========== RESULT ==========+
iteration: 500
minimum: 8.63808e-08
minimizer: [0.0002658098418993323, -1.140808975532608e-5, -0.00012488307670533095]
f calls: 15000
total time: 0.1556 s
+============================+
julia> optimize(f, [-1 -1 -1; 1 1 1.0], CGSA(N = 80, chaosIndex = 1))
+=========== RESULT ==========+
iteration: 500
minimum: 1.0153e-09
minimizer: [-8.8507563788141e-6, -1.3050111801923072e-5, 2.7688577445980026e-5]
f calls: 40000
total time: 1.0323 s
+============================+
Simulated Annealing
Physics-inspired algorithm for optimization by [7].
Metaheuristics.SA — Type

SA(;
x_initial::Vector = zeros(0),
N::Int = 500,
tol_fun::Real= 1e-4,
information = Information(),
options = Options()
)
Parameters for the Simulated Annealing method (Kirkpatrick et al., 1983).
Parameters:
- x_initial: Initial solution. If empty, SA generates a random one within the bounds.
- N: Number of test points per iteration.
- tol_fun: Tolerance value for the Metropolis condition used to accept or reject a test point as the current point (see the sketch below).
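As a reminder of the underlying rule, a generic Metropolis acceptance criterion for minimization looks as follows (conceptual sketch only; how tol_fun enters the package's condition is an implementation detail):

# Generic Metropolis acceptance for minimization (conceptual sketch only).
# Δf = f(candidate) - f(current); T is the current temperature.
metropolis_accept(Δf, T) = Δf ≤ 0 || rand() < exp(-Δf / T)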
Example
julia> f(x) = sum(x.^2)
f (generic function with 1 method)
julia> optimize(f, [-1 -1 -1; 1 1 1.0], SA())
+=========== RESULT ==========+
iteration: 60
minimum: 5.0787e-68
minimizer: [-2.2522059499734615e-34, 3.816133503985569e-36, 6.934348004465088e-36]
f calls: 29002
total time: 0.0943 s
+============================+
julia> optimize(f, [-1 -1 -1; 1 1 1.0], SA(N = 100, x_initial = [1, 0.5, -1]))
+=========== RESULT ==========+
iteration: 300
minimum: 1.99651e-69
minimizer: [4.4638292404181215e-35, -1.738939846089388e-36, -9.542441152683457e-37]
f calls: 29802
total time: 0.0965 s
+============================+
Whale Optimization Algorithm
The Whale Optimization Algorithm, inspired by the behaviour of humpback whales, was proposed in [8].
Metaheuristics.WOA — Type

WOA(; N = 30, information = Information(), options = Options())
Parameters for the Whale Optimization Algorithm. N is the population size (number of whales).
Example
julia> f(x) = sum(x.^2)
f (generic function with 1 method)
julia> optimize(f, [-1 -1 -1; 1 1 1.0], WOA())
+=========== RESULT ==========+
iteration: 500
minimum: 3.9154600000000003e-100
minimizer: [8.96670478694908e-52, -1.9291317455298046e-50, 4.3113080446722046e-51]
f calls: 15000
total time: 0.0134 s
+============================+
julia> optimize(f, [-1 -1 -1; 1 1 1.0], WOA(N = 100))
+=========== RESULT ==========+
iteration: 500
minimum: 1.41908e-145
minimizer: [9.236161414012512e-74, -3.634919950380001e-73, 3.536831799149254e-74]
f calls: 50000
total time: 0.0588 s
+============================+
NSGA-II
A fast and elitist multiobjective genetic algorithm: NSGA-II by [9].
Metaheuristics.NSGA2 — Type

NSGA2(;
N = 100,
η_cr = 20,
p_cr = 0.9,
η_m = 20,
p_m = 1.0 / D,
information = Information(),
options = Options(),
)
Parameters for the metaheuristic NSGA-II.
Parameters:
- N: Population size.
- η_cr: η for the crossover.
- p_cr: Crossover probability.
- η_m: η for the mutation operator.
- p_m: Mutation probability (1/D for a D-dimensional problem by default).
To use NSGA2, the output from the objective function should be a 3-tuple (f::Vector, g::Vector, h::Vector), where f contains the objective functions, and g and h are the inequality and equality constraints, respectively.

A feasible solution is such that g_i(x) ≤ 0 and h_j(x) = 0.
using Metaheuristics
# Dimension
D = 2
# Objective function
f(x) = ( x, [sum(x.^2) - 1], [0.0] )
# bounds
bounds = [-1 -1;
1 1.0
]
# define the parameters (use `NSGA2()` for default parameters)
nsga2 = NSGA2(N = 100, p_cr = 0.85)
# optimize
status = optimize(f, bounds, nsga2)
# show results
display(status)
NSGA-III
An Evolutionary Many-Objective Optimization Algorithm Using Reference-Point-Based Nondominated Sorting Approach, Part I: Solving Problems With Box Constraints by [10].
Metaheuristics.NSGA3 — Type

NSGA3(;
N = 100,
η_cr = 20,
p_cr = 0.9,
η_m = 20,
p_m = 1.0 / D,
partitions = 12,
reference_points = Vector{Float64}[],
information = Information(),
options = Options(),
)
Parameters for the metaheuristic NSGA-III.
Parameters:
- N: Population size.
- η_cr: η for the crossover.
- p_cr: Crossover probability.
- η_m: η for the mutation operator.
- p_m: Mutation probability (1/D for a D-dimensional problem by default).
- reference_points: Reference points, usually generated by gen_ref_dirs.
- partitions: Number of partitions for Das and Dennis's reference points, used if reference_points is empty.
To use NSGA3, the output from the objective function should be a 3-tuple (f::Vector, g::Vector, h::Vector), where f contains the objective functions, and g and h are the inequality and equality constraints, respectively.

A feasible solution is such that g_i(x) ≤ 0 and h_j(x) = 0.
using Metaheuristics
# Objective function, bounds, and the True Pareto front
f, bounds, pf = Metaheuristics.TestProblems.get_problem(:DTLZ2)
# define the parameters (use `NSGA3()` for default parameters)
nsga3 = NSGA3(p_cr = 0.9)
# optimize
status = optimize(f, bounds, nsga3)
# show results
display(status)
SMS-EMOA
An EMO algorithm using the hypervolume measure as a selection criterion by [11].
Metaheuristics.SMS_EMOA — Type

SMS_EMOA(;
N = 100,
η_cr = 20,
p_cr = 0.9,
η_m = 20,
p_m = 1.0 / D,
n_samples = 10_000,
information = Information(),
options = Options(),
)
Parameters for the metaheuristic SMS-EMOA.
Parameters:
- N: Population size.
- η_cr: η for the crossover.
- p_cr: Crossover probability.
- η_m: η for the mutation operator.
- p_m: Mutation probability (1/D for a D-dimensional problem by default).
- n_samples: Number of samples used to approximate the hypervolume in many-objective problems (M > 2).
To use SMS_EMOA, the output from the objective function should be a 3-tuple (f::Vector, g::Vector, h::Vector), where f contains the objective functions, and g and h are the inequality and equality constraints, respectively.

A feasible solution is such that g_i(x) ≤ 0 and h_j(x) = 0.
using Metaheuristics
# Dimension
D = 2
# Objective function
f(x) = ( x, [sum(x.^2) - 1], [0.0] )
# bounds
bounds = [-1 -1;
1 1.0
]
# define the parameters (use `SMS_EMOA()` for default parameters)
sms_emoa = SMS_EMOA(N = 100, p_cr = 0.85)
# optimize
status = optimize(f, bounds, sms_emoa)
# show results
display(status)
SPEA2
Improved strength Pareto evolutionary algorithm by [12].
Metaheuristics.SPEA2 — Type

SPEA2(;
N = 100,
η_cr = 20,
p_cr = 0.9,
η_m = 20,
p_m = 1.0 / D,
information = Information(),
options = Options(),
)
Parameters for the metaheuristic SPEA2.
Parameters:
- N: Population size.
- η_cr: η for the crossover.
- p_cr: Crossover probability.
- η_m: η for the mutation operator.
- p_m: Mutation probability (1/D for a D-dimensional problem by default).
To use SPEA2, the output from the objective function should be a 3-tuple (f::Vector, g::Vector, h::Vector), where f contains the objective functions, and g and h are the inequality and equality constraints, respectively.

A feasible solution is such that g_i(x) ≤ 0 and h_j(x) = 0.
using Metaheuristics
# Dimension
D = 2
# Objective function
f(x) = ( x, [sum(x.^2) - 1], [0.0] )
# bounds
bounds = [-1 -1;
1 1.0
]
# define the parameters (use `SPEA2()` for default parameters)
spea2 = SPEA2(N = 100, p_cr = 0.85)
# optimize
status = optimize(f, bounds, spea2)
# show results
display(status)
BCA
The Bilevel Centers Algorithm (BCA) was proposed to solve bilevel optimization problems. See BilevelHeuristics.BCA for details.
MCCGA
Machine-coded Compact Genetic Algorithms for real-valued optimization problems by [13].
Metaheuristics.MCCGA — Type

MCCGA(; N, maxsamples)

Parameters:
- N: Population size. Default is 100.
- maxsamples: Maximum number of samples. Default is 10000.
Description
MCCGA implements the machine-coded compact genetic algorithm for real-valued optimization problems. The algorithm is based on a compact genetic algorithm (CGA), but uses a machine-coded representation following the IEEE-754 floating-point standard. In the first stage, maxsamples samples are generated within the function domain; they are used to obtain a vector of probabilities, one for each bit of the IEEE-754 representation. In a classical CGA the initial probability vector is constant at 0.5, whereas in MCCGA the probability that the i-th bit equals 1 depends on the function domain. The second stage performs a CGA search over the IEEE-754 bits. Since a CGA maintains a single probability vector rather than a population of solutions, the parameter N does not denote a number of solutions; instead it controls how much each probability is shifted per iteration (1/N). In the last stage, a local search is performed for fine-tuning; this implementation uses the Hooke & Jeeves direct search algorithm.
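The bit-level representation the probability vector is defined over can be inspected directly in Julia (shown only to illustrate the encoding; this is not part of the MCCGA API):

# Inspect the IEEE-754 bit pattern of a Float64 (illustration of the encoding only).
bits = bitstring(3.1416)   # a 64-character string: 1 sign, 11 exponent, 52 fraction bits
length(bits)               # 64 bits: MCCGA maintains one probability per bit position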
References
- Moser, Irene. "Hooke-Jeeves Revisited." 2009 IEEE Congress on Evolutionary Computation. IEEE, 2009.
- Harik, G. R., Lobo, F. G., & Goldberg, D. E. (1999). The compact genetic algorithm. IEEE transactions on evolutionary computation, 3(4), 287-297.
- Satman, M. H. & Akadal, E. (2020). Machine Coded Compact Genetic Algorithms for Real Parameter Optimization Problems. Alphanumeric Journal, 8(1), 43-58. DOI: 10.17093/alphanumeric.576919
- Mehmet Hakan Satman, Emre Akadal, Makine Kodlu Hibrit Kompakt Genetik Algoritmalar Optimizasyon Yöntemi, Patent, TR, 2022/01, 2018-GE-510239
Example
julia> f, bounds, solutions = Metaheuristics.TestProblems.rastrigin();
julia> result = optimize(f, bounds, MCCGA())
+=========== RESULT ==========+
iteration: 42833
minimum: 0
minimizer: [-5.677669786379456e-17, 2.7942451898022582e-39, -3.60925916059986e-33, -6.609510861086017e-34, 2.998586759675011e-32, -1.8825832500775007e-38, -3.0729484147585938e-31, 1.7675578057632446e-38, 5.127944874215823e-16, 1.9623770480857484e-19]
f calls: 86404
total time: 3.2839 s
+============================+
Explicit Example
julia> using Metaheuristics
julia> f(x::Vector{Float64})::Float64 = (x[1]-pi)^2 + (x[2]-exp(1))^2
julia> bounds = [ -500.0 -500.0;
500.0 500.0]
julia> result = Metaheuristics.optimize(f, bounds, MCCGA())
+=========== RESULT ==========+
iteration: 2974
minimum: 1.80997e-09
minimizer: [3.1416284249228976, 2.7183048585824263]
f calls: 6012
total time: 1.5233 s
stop reason: Other stopping criteria.
+============================+
GA
Metaheuristics.GA — Type

GA(;
N = 100,
p_mutation = 1e-5,
p_crossover = 0.5,
initializer = RandomInBounds(),
selection = TournamentSelection(),
crossover = UniformCrossover(),
mutation = BitFlipMutation(),
environmental_selection = ElitistReplacement()
)
Just another Genetic Algorithm Framework.
Parameters:
- N: Population size.
- p_mutation: Mutation probability (per gene).
- p_crossover: Crossover probability (per gene).
- initializer: Parameters for the population initializer.
- selection: Parameters for the selection operator.
- crossover: Parameters for the crossover operator.
- mutation: Parameters for the mutation operator.
- environmental_selection: Parameters for the replacement method.
Example: Binary Encoding (default)
julia> f(x) = sum(x) / length(x)
f (generic function with 1 method)
julia> dim = 10;
julia> optimize(f, BitArraySpace(dim), GA())
+=========== RESULT ==========+
iteration: 500
minimum: 0
minimizer: Bool[0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
f calls: 50000
total time: 0.0523 s
stop reason: Maximum number of iterations exceeded.
+============================+
Example: Permutation Encoding
julia> f(x) = sum(abs.(x .- (length(x):-1:1.0)))
f (generic function with 1 method)
julia> perm_size = 10;
julia> ga = GA(;initializer = RandomPermutation(N=100), crossover=OrderCrossover(), mutation=SlightMutation());
julia> optimize(f, PermutationSpace(perm_size), ga)
+=========== RESULT ==========+
iteration: 500
minimum: 0
minimizer: [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]
f calls: 49900
total time: 0.6230 s
stop reason: Maximum number of iterations exceeded.
+============================+
Example: Integer Encoding
julia> c = randn(10);
julia> A = randn(10, 10);
julia> f(x) = c'*x, A*x, [0.0]
f (generic function with 1 method)
julia> bounds = repeat([0,100], 1, 10)
2×10 Matrix{Int64}:
0 0 0 0 0 0 0 0 0 0
100 100 100 100 100 100 100 100 100 100
julia> ga = GA(;crossover=SBX(;bounds),
mutation=PolynomialMutation(;bounds),
environmental_selection=GenerationalReplacement()
);
julia> result = optimize(f, bounds, ga)
+=========== RESULT ==========+
iteration: 500
minimum: -76.9031
minimizer: [0, 10, 49, 100, 24, 0, 0, 0, 67, 76]
f calls: 50000
feasibles: 95 / 100 in final population
total time: 0.6028 s
stop reason: Maximum number of iterations exceeded.
+============================+
Example: Real Encoding
julia> f, bounds, solutions = Metaheuristics.TestProblems.rastrigin();
julia> ga = GA(;mutation=PolynomialMutation(;bounds),
crossover=SBX(;bounds),
environmental_selection=GenerationalReplacement()
);
julia> optimize(f, bounds, ga)
+=========== RESULT ==========+
iteration: 500
minimum: 5.91136e-09
minimizer: [-5.446073166732363e-6, -1.7900876625850504e-7, 2.8548505431323723e-8, 2.1130514595980084e-8, -1.632381298278562e-8, 2.8216219650016283e-9, -3.2114953600427333e-7, 2.4222648522114125e-9, -5.5236545829928716e-9, -2.3479408274628334e-9]
f calls: 49900
total time: 0.5775 s
stop reason: Maximum number of iterations exceeded.
+============================+
CCMO
A Coevolutionary Framework for Constrained Multiobjective Optimization Problems proposed by [14].
Metaheuristics.CCMO — Type

CCMO(base_optimizer; information, options)

Parameters for the CCMO algorithm. base_optimizer currently only supports NSGA2().
A feasible solution is such that g_i(x) ≤ 0 and h_j(x) = 0
.
Example
julia> f, bounds, pf = Metaheuristics.TestProblems.MTP();
julia> ccmo = CCMO(NSGA2(N=100, p_m=0.001));
julia> optimize(f, bounds, ccmo)
+=========== RESULT ==========+
iteration: 500
population: ⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀F space⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
┌────────────────────────────────────────┐
2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
f₂ │⡆⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⢧⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠈⢢⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠈⠢⢀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠈⠑⠦⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠉⠒⠤⢤⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠑⠒⠢⠤⡀⣀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠉⠁⠒⠒⠒⠄⠤⠤⢠⢀⣀⢀⣀⠀⣀⣀⡀⣀⡀│
└────────────────────────────────────────┘
⠀0⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀1⠀
⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀f₁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
non-dominated solution(s):
⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀F space⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
┌────────────────────────────────────────┐
2 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
f₂ │⡆⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⢧⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠈⢢⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠈⠢⢀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠈⠑⠦⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠉⠒⠤⢤⣀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
│⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠉⠑⠒⠢⠤⡀⣀⡀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀│
0 │⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠈⠉⠁⠒⠒⠒⠄⠤⠤⢠⢀⣀⢀⣀⠀⣀⣀⡀⣀⡀│
└────────────────────────────────────────┘
⠀0⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀1⠀
⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀f₁⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀⠀
f calls: 50000
feasibles: 100 / 100 in final population
total time: 7.0616 s
stop reason: Maximum number of iterations exceeded.
+============================+
$\varepsilon$DE
$\varepsilon$ Constrained Differential Evolution with Gradient-Based Mutation and Feasible Elites by [15].
Gradient mutation is not implemented here.
Metaheuristics.εDE — Type

εDE(cp = 5, DE_kargs...)
epsilonDE(cp = 5, DE_kargs...)
Parameters for ε Differential Evolution for constrained optimization. See DE for more details about the DE parameters (DE_kargs).

This implementation does not include the gradient-based repair method.
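A minimal usage sketch, assuming the single-objective constrained format used throughout this package, i.e. the objective returns a 3-tuple (fx::Real, gx::Vector, hx::Vector) with feasibility g_i(x) ≤ 0 and h_j(x) = 0 (the constrained problem below is only illustrative):

using Metaheuristics

# minimize sum(x.^2) subject to x[1] + x[2] ≥ 1, written as 1 - x[1] - x[2] ≤ 0
f(x) = ( sum(x .^ 2), [1.0 - x[1] - x[2]], [0.0] )

bounds = [-5.0 -5.0;
           5.0  5.0]

result = optimize(f, bounds, εDE())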
BRKGA
Biased Random Key Genetic Algorithm by [16].
Metaheuristics.BRKGA — Function

BRKGA(num_elites = 20, num_mutants = 10, num_offsprings = 70, bias = 0.7)
Biased Random Key Genetic Algorithm (BRKGA).
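For reference, the bias parameter controls BRKGA's biased uniform crossover, where each random key is inherited from the elite parent with probability bias; an illustrative sketch (not the package's internal code):

# Illustrative sketch of BRKGA's biased uniform crossover (not the package's internal code).
# Each random key is taken from the elite parent with probability `bias`.
biased_crossover(elite, non_elite, bias) =
    [rand() < bias ? elite[i] : non_elite[i] for i in eachindex(elite)]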
Example
julia> target_perm = collect(reverse(1:10))
10-element Vector{Int64}:
10
9
8
7
6
5
4
3
2
1
julia> decode(rk) = sortperm(rk);
julia> f(rk) = sum(abs.(decode(rk) - target_perm));
julia> res = optimize(f, [zeros(10) ones(10)], BRKGA(num_elites=70))
Optimization Result
===================
Iteration: 22
Minimum: 0
Minimizer: [0.938595, 0.851247, 0.823736, …, 0.375321]
Function calls: 2200
Total time: 0.0238 s
Stop reason: Due to Convergence Termination criterion.
julia> decode(minimizer(res))
10-element Vector{Int64}:
10
9
8
7
6
5
4
3
2
1