
# Evolution strategies

Evolution strategies (ES) are stochastic, derivative-free methods for the numerical optimisation of non-linear or non-convex continuous optimisation problems. Unlike a convex optimisation problem, in which every local optimum is also global (as in linear programming), a non-convex problem may have multiple locally optimal points, and it can take a long time to determine whether the problem has no solution or whether a given local optimum is in fact global.
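As a concrete illustration, consider the Rastrigin function, a standard multimodal benchmark (used here as an example, not taken from the text above). It has a regular grid of local minima in which a purely local, gradient-based method can get stuck, while the global minimum sits at the origin:

```python
import math

def rastrigin(x):
    # Highly multimodal: a regular grid of local minima,
    # with the single global minimum f(0, ..., 0) = 0.
    return 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)

print(rastrigin([0.0, 0.0]))   # the global optimum, value 0
print(rastrigin([1.0, 1.0]))   # a nearby local optimum with a worse value
```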

## Genetic algorithm

A genetic algorithm (GA) is a metaheuristic inspired by the process of natural selection that belongs to the larger class of evolutionary algorithms (EA). Genetic algorithms are commonly used to generate high-quality solutions to optimisation and search problems by relying on bio-inspired operators such as mutation, crossover and selection: a population of candidate solutions (called individuals, creatures, or phenotypes) to an optimisation problem is evolved toward better solutions. Each candidate solution has a set of properties (its chromosomes or genotype) which can be mutated and altered.

The evolution usually starts from a population of randomly generated individuals, and is an iterative process, with the population in each iteration called a generation. In each generation, the fitness of every individual in the population is evaluated; the fitness is usually the value of the objective function in the optimisation problem being solved. The more fit individuals are stochastically selected from the current population, and each individual's genome is modified (recombined and possibly randomly mutated) to form a new generation. The new generation of candidate solutions is then used in the next iteration of the algorithm. Commonly, the algorithm terminates when either a maximum number of generations has been produced, or a satisfactory fitness level has been reached for the population.

```
Input: instance, size_of_population, rate_of_elitism, rate_of_mutation, number_of_iterations
Output: solution

// Initialisation
population <- RandomFeasibleSolutions

for (i = 1 to number_of_iterations) do    // or until another terminal condition is met

    // Elitism-based selection
    number_of_elitism = size_of_population * rate_of_elitism
    population_1 <- the best number_of_elitism solutions in population

    // Crossover
    population_2 <- empty
    number_of_crossovers = (size_of_population - number_of_elitism) / 2
    for (j = 1 to number_of_crossovers) do
        // Roulette wheel selection
        solution_a, solution_b <- Select(population)
        // Choose a locus at which the remaining alleles are swapped from one parent to the other
        offspring_a, offspring_b <- Crossover(solution_a, solution_b)
        population_2 <- UpdateWith(population_2, offspring_a, offspring_b)
    end for

    // Mutation
    for (j = 1 to number_of_crossovers) do
        // Roulette wheel selection
        solution[j] <- Select(population_2)
        new_solution[j] <- MutateEachBit(solution[j], rate_of_mutation)
        if (new_solution[j] is infeasible)
            // Generate a feasible solution from an infeasible solution
            new_solution[j] <- Repair(new_solution[j])
        end if
        population_2 <- UpdateWith(population_2, new_solution[j])
    end for

    // Updating
    population = population_1 + population_2

end for

// Return the best solution
return the best solution in population
```
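The loop above can be sketched in Python. This is a minimal illustration, not a reference implementation: the OneMax objective (maximise the number of 1-bits in a bitstring), the helper names, and the parameter values are all assumptions made for the example.

```python
import random

random.seed(1)

def fitness(ind):
    # OneMax: a stand-in objective, maximise the number of 1-bits
    return sum(ind)

def roulette_select(population):
    # Probability of selection proportional to fitness
    weights = [fitness(ind) for ind in population]
    return random.choices(population, weights=weights, k=1)[0]

def crossover(a, b):
    # Single-point crossover: swap all alleles after a random locus
    locus = random.randrange(1, len(a))
    return a[:locus] + b[locus:], b[:locus] + a[locus:]

def mutate(ind, rate):
    # Flip each bit independently with probability `rate`
    return [bit ^ 1 if random.random() < rate else bit for bit in ind]

def genetic_algorithm(n_bits=20, pop_size=30, elitism=0.2, mutation=0.02, iters=60):
    population = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    n_elite = int(pop_size * elitism)
    for _ in range(iters):
        # Elitism-based selection: carry over the best individuals unchanged
        elite = sorted(population, key=fitness, reverse=True)[:n_elite]
        # Crossover and mutation fill the rest of the next generation
        offspring = []
        for _ in range((pop_size - n_elite) // 2):
            a, b = roulette_select(population), roulette_select(population)
            for child in crossover(a, b):
                offspring.append(mutate(child, mutation))
        population = elite + offspring
    return max(population, key=fitness)

best = genetic_algorithm()
print(fitness(best), best)
```

For a bitstring problem, any infeasible offspring would be passed through a `Repair` step before joining the population; OneMax has no constraints, so that step is omitted here.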

### Selection

The most common type of `Select` uses roulette wheel selection, in which each individual is assigned a probability of being selected that is directly proportional to its fitness.

```
// Sum the fitness of all members
for all members of population
    sum += fitness
end for

// Assign each member a cumulative selection probability
for all members of population
    cumulative_probability += (fitness / sum)
    member.probability = cumulative_probability
end for

loop until new_population is full
    do this twice
        number = Random between 0 and 1
        for all members of population
            if number is greater than the previous member's probability
               but not greater than this member's probability
                then select this member
        end for
    end
    create offspring from the two selected members
end loop
```
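A Python sketch of the cumulative-probability scheme above; the function name, the toy population, and the fitness values are illustrative assumptions, not part of the article:

```python
import random
from bisect import bisect_left

def roulette_wheel_select(population, fitnesses):
    # Build the cumulative probability distribution described above
    total = sum(fitnesses)
    cumulative, running = [], 0.0
    for f in fitnesses:
        running += f / total
        cumulative.append(running)
    # Spin the wheel: pick the first member whose cumulative
    # probability covers a uniform random number in [0, 1)
    i = bisect_left(cumulative, random.random())
    return population[min(i, len(population) - 1)]  # guard against float rounding

random.seed(0)
population = ["a", "b", "c", "d"]
fitnesses = [1.0, 2.0, 3.0, 4.0]
counts = {p: 0 for p in population}
for _ in range(10_000):
    counts[roulette_wheel_select(population, fitnesses)] += 1
print(counts)  # selection counts roughly proportional to fitness
```

Using `bisect_left` on the cumulative distribution replaces the inner "between previous and next probability" scan with a binary search, which matters when populations are large.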

## Covariance Matrix Adaptation Evolution Strategy (CMA-ES)

A well-known evolution strategy is the Covariance Matrix Adaptation Evolution Strategy (CMA-ES) by Nikolaus Hansen and Andreas Ostermeier. In contrast to most other evolutionary algorithms, CMA-ES is, from a user perspective, quasi parameter-free. CMA-ES builds a Gaussian probabilistic model of promising solutions and uses this model to create the next generation of the population, which makes it an Estimation of Distribution Algorithm (EDA). CMA-ES has been empirically successful in hundreds of applications and is considered particularly useful on non-convex, non-separable, ill-conditioned, multi-modal or noisy objective functions.

The user has to choose an initial solution point and the initial step-size. Optionally, the number of candidate samples per iteration can be changed to alter the characteristic search behaviour, and the termination conditions can be adjusted to the search space at hand.

```
Input: number_of_samples_per_iteration    // At least two, commonly >= 4; smaller values (< 10) lead to more local search behaviour

// Initialisation
Initialise distribution_mean (m), step_size, covariance_matrix, evolution_path_isotropic, evolution_path_anisotropic

loop until terminal_condition

    // Sample new solutions and evaluate
    for (i = 1 to number_of_samples_per_iteration) do
        // Sample a candidate solution from a multivariate normal distribution:
        // a mutation of the current favourite solution vector m
        x[i] = SampleMultivariateNormal(m, step_size^2 * covariance_matrix)
        f[i] = Fitness(x[i])
    end for

    // Sort solutions by fitness; the positive (recombination) weights sum to one
    x <- SortByFitness(x, f)

    // Move the mean towards better solutions
    old_m = m
    m <- UpdateMean(x)

    // Update the isotropic (invariant with respect to direction) evolution path
    evolution_path_isotropic <- UpdatePS(evolution_path_isotropic, step_size, covariance_matrix, old_m, m)

    // Update the anisotropic evolution path
    evolution_path_anisotropic <- UpdatePC(evolution_path_anisotropic, step_size, old_m, m, evolution_path_isotropic)

    // Update the covariance matrix
    covariance_matrix <- UpdateC(covariance_matrix, evolution_path_anisotropic, x[1], ..., x[number_of_samples_per_iteration], old_m, step_size)

    // Update the step-size using the isotropic path length:
    // cumulative step-size adaptation (CSA), also called path length control
    step_size <- UpdateSigma(step_size, evolution_path_isotropic)

end loop

return m or x[1]
```
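The sample-evaluate-recombine core of the loop above can be sketched in plain Python. This is a deliberately simplified stand-in, not CMA-ES itself: the covariance matrix is kept fixed at the identity (so there is no covariance adaptation), and cumulative step-size adaptation is replaced by a crude geometric decay, giving a generic (μ/μ, λ)-ES. All names and parameter values are assumptions made for the example.

```python
import math
import random

random.seed(0)

def sphere(x):
    # Simple convex test objective; minimum f(0, ..., 0) = 0
    return sum(xi * xi for xi in x)

def simple_es(f, m, sigma, lam=10, iters=200):
    """Stripped-down (mu/mu, lambda)-ES: identity covariance, weighted
    recombination of the best mu samples, geometric step-size decay."""
    n = len(m)
    mu = lam // 2
    # Positive, log-rank-based recombination weights that sum to one
    w = [math.log(mu + 0.5) - math.log(i + 1) for i in range(mu)]
    total = sum(w)
    w = [wi / total for wi in w]
    for _ in range(iters):
        # Sample candidates from N(m, sigma^2 * I) and sort by fitness
        xs = [[mj + sigma * random.gauss(0, 1) for mj in m] for _ in range(lam)]
        xs.sort(key=f)
        # Move the mean towards the best mu samples (weighted recombination)
        m = [sum(w[i] * xs[i][j] for i in range(mu)) for j in range(n)]
        # Crude stand-in for cumulative step-size adaptation
        sigma *= 0.98
    return m

m = simple_es(sphere, m=[3.0, -2.0, 1.5], sigma=1.0)
print(sphere(m))  # should be close to 0
```

A faithful CMA-ES implementation would additionally maintain the two evolution paths and use them for the covariance and step-size updates, as in the pseudocode above.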