04 Evolution Strategies


Chapter 4

Evolution Strategies
1
A.E. Eiben and J.E. Smith, Introduction to Evolutionary Computing
ES quick overview
Developed: Germany in the 1970s
Early names: I. Rechenberg, H.-P. Schwefel
Typically applied to:
numerical optimisation
Attributed features:
fast
good optimiser for real-valued optimisation
relatively well-developed theory
Special:
self-adaptation of (mutation) parameters is standard

ES technical summary tableau

Representation: Real-valued vectors

Recombination: Discrete or intermediary

Mutation: Gaussian perturbation

Parent selection: Uniform random

Survivor selection: (μ,λ) or (μ+λ)

Specialty: Self-adaptation of mutation step sizes

Introductory example
Task: minimise f : Rⁿ → R
Algorithm: the two-membered (1+1)-ES using
Vectors from Rⁿ directly as chromosomes
Population size 1
Only mutation, creating one child
Greedy selection

Introductory example: pseudocode
Set t = 0
Create initial point xᵗ = ⟨x1ᵗ, …, xnᵗ⟩
REPEAT UNTIL (TERMINATION CONDITION satisfied) DO
  Draw zi from a normal distribution for all i = 1, …, n
  yiᵗ = xiᵗ + zi
  IF f(xᵗ) < f(yᵗ) THEN xᵗ⁺¹ = xᵗ
  ELSE xᵗ⁺¹ = yᵗ
  FI
  Set t = t + 1
OD
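The loop above translates into a short Python sketch; the fixed step size sigma and the sphere objective are illustrative assumptions, not part of the slides:

```python
import random

def one_plus_one_es(f, x, sigma=0.1, iters=1000):
    """(1+1)-ES: minimise f starting from x with a fixed step size sigma."""
    fx = f(x)
    for _ in range(iters):
        # Mutation: add N(0, sigma) noise to every coordinate.
        y = [xi + random.gauss(0.0, sigma) for xi in x]
        fy = f(y)
        # Greedy selection: the child replaces the parent only if it is
        # at least as good (minimisation).
        if fy <= fx:
            x, fx = y, fy
    return x, fx

# Illustrative objective: the sphere function, minimum 0 at the origin.
sphere = lambda v: sum(vi * vi for vi in v)
best, best_f = one_plus_one_es(sphere, [5.0] * 5)
```

Because selection is greedy, the retained fitness can never get worse than that of the initial point.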

Introductory example: mutation mechanism
z values drawn from a normal distribution N(ξ, σ)
mean ξ is set to 0
standard deviation σ is called the mutation step size

σ is varied on the fly by the 1/5 success rule:

This rule is applied after every k iterations:
σ = σ / c if ps > 1/5
σ = σ · c if ps < 1/5
σ = σ if ps = 1/5
where ps is the fraction of successful mutations and 0.8 ≤ c ≤ 1
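The rule is a three-line function; the default c = 0.9 below is an illustrative value from the allowed range 0.8 ≤ c ≤ 1:

```python
def one_fifth_rule(sigma, ps, c=0.9):
    """Rechenberg's 1/5 success rule: adapt the step size sigma from the
    observed fraction ps of successful mutations (0.8 <= c <= 1)."""
    if ps > 1/5:
        return sigma / c   # too many successes: search too locally, widen
    if ps < 1/5:
        return sigma * c   # too few successes: steps too large, narrow
    return sigma           # exactly 1/5: leave sigma unchanged
```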

Illustration of normal distribution

Another historical example:
the jet nozzle experiment

The famous jet nozzle experiment (movie)

Representation

Chromosomes consist of three parts:

Object variables: x1, …, xn
Strategy parameters:
Mutation step sizes: σ1, …, σn
Rotation angles: α1, …, αk

Not every component is always present

Full size: ⟨x1, …, xn, σ1, …, σn, α1, …, αk⟩
where k = n(n−1)/2 (number of (i, j) pairs)

Mutation

Main mechanism: changing a value by adding random noise drawn from a normal distribution
x'i = xi + N(0, σ)
Key idea:
σ is part of the chromosome ⟨x1, …, xn, σ⟩
σ is also mutated into σ' (see later how)
Thus: the mutation step size σ is coevolving with the solution x

Mutate σ first

Net mutation effect: ⟨x, σ⟩ → ⟨x', σ'⟩

Order is important:
first σ → σ' (see later how)
then x → x' = x + N(0, σ')
Rationale: the new ⟨x', σ'⟩ is evaluated twice
Primary: x' is good if f(x') is good
Secondary: σ' is good if the x' it created is good
The step size only survives through hitch-hiking
With the mutation order reversed this would not work

Mutation case 1:
Uncorrelated mutation with one σ
Chromosomes: ⟨x1, …, xn, σ⟩
σ' = σ · exp(τ · N(0,1))
x'i = xi + σ' · N(0,1)

Typically the learning rate τ ∝ 1/√n

And we have a boundary rule: σ' < ε0 ⇒ σ' = ε0
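These two update rules fit in a few lines; the default boundary value eps0 is an illustrative assumption:

```python
import math
import random

def mutate_one_sigma(x, sigma, eps0=1e-6):
    """Uncorrelated self-adaptive mutation with a single step size:
    sigma is mutated first, then the new sigma is used to mutate x."""
    tau = 1.0 / math.sqrt(len(x))                      # learning rate ~ 1/sqrt(n)
    sigma_new = sigma * math.exp(tau * random.gauss(0.0, 1.0))
    sigma_new = max(sigma_new, eps0)                   # boundary rule
    x_new = [xi + sigma_new * random.gauss(0.0, 1.0) for xi in x]
    return x_new, sigma_new
```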

Mutants with equal likelihood

Circle: mutants having the same chance to be created

Mutation case 2:
Uncorrelated mutation with n σs
Chromosomes: ⟨x1, …, xn, σ1, …, σn⟩
σ'i = σi · exp(τ' · N(0,1) + τ · Ni(0,1))
x'i = xi + σ'i · Ni(0,1)
Two learning rate parameters:
τ' overall learning rate
τ coordinate-wise learning rate
τ' ∝ 1/√(2n) and τ ∝ 1/√(2√n)
Boundary rule: σ'i < ε0 ⇒ σ'i = ε0
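In code, the key detail is that the overall term τ'·N(0,1) is drawn once and shared by all coordinates, while the coordinate-wise term is drawn fresh per i (eps0 again an illustrative assumption):

```python
import math
import random

def mutate_n_sigmas(x, sigmas, eps0=1e-6):
    """Uncorrelated self-adaptive mutation with one step size per
    coordinate, using an overall and a coordinate-wise learning rate."""
    n = len(x)
    tau_prime = 1.0 / math.sqrt(2.0 * n)               # overall rate
    tau = 1.0 / math.sqrt(2.0 * math.sqrt(n))          # coordinate-wise rate
    common = tau_prime * random.gauss(0.0, 1.0)        # one draw, shared by all i
    new_sigmas = [max(s * math.exp(common + tau * random.gauss(0.0, 1.0)), eps0)
                  for s in sigmas]
    new_x = [xi + s * random.gauss(0.0, 1.0)
             for xi, s in zip(x, new_sigmas)]
    return new_x, new_sigmas
```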

Mutants with equal likelihood

Ellipse: mutants having the same chance to be created

Mutation case 3:
Correlated mutations

Chromosomes: ⟨x1, …, xn, σ1, …, σn, α1, …, αk⟩

where k = n(n−1)/2
Covariance matrix C is defined as:
cii = σi²

cij = 0 if i and j are not correlated

cij = ½ (σi² − σj²) · tan(2αij) if i and j are correlated

Note the numbering / indices of the α's

Correlated mutations cont'd
The mutation mechanism is then:
σ'i = σi · exp(τ' · N(0,1) + τ · Ni(0,1))
α'j = αj + β · N(0,1)
x' = x + N(0, C')
x stands for the vector ⟨x1, …, xn⟩
C' is the covariance matrix C after mutation of the σ and α values
τ' ∝ 1/√(2n), τ ∝ 1/√(2√n) and β ≈ 5°
σ'i < ε0 ⇒ σ'i = ε0 and
|α'j| > π ⇒ α'j = α'j − 2π · sign(α'j)

NB The Covariance Matrix Adaptation Evolution Strategy (CMA-ES) is probably the best EA for numerical optimisation, cf. the CEC-2005 competition
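The covariance matrix and the angle boundary rule can be written down directly from the formulas above; the dictionary-of-angles interface is our own illustrative choice, not standard API:

```python
import math

def build_covariance(sigmas, alphas):
    """Assemble the mutation covariance matrix C:
    c_ii = sigma_i^2, c_ij = 0 for uncorrelated pairs, and
    c_ij = 1/2 (sigma_i^2 - sigma_j^2) tan(2 alpha_ij) otherwise.
    `alphas` maps correlated index pairs (i, j), i < j, to angles."""
    n = len(sigmas)
    C = [[0.0] * n for _ in range(n)]
    for i in range(n):
        C[i][i] = sigmas[i] ** 2
    for (i, j), a in alphas.items():
        cij = 0.5 * (sigmas[i] ** 2 - sigmas[j] ** 2) * math.tan(2.0 * a)
        C[i][j] = C[j][i] = cij
    return C

def wrap_angle(a):
    """Boundary rule for mutated angles: |a| > pi => a = a - 2*pi*sign(a)."""
    return a - 2.0 * math.pi * math.copysign(1.0, a) if abs(a) > math.pi else a
```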
Mutants with equal likelihood

Ellipse: mutants having the same chance to be created


Recombination
Creates one child
Acts per variable / position by either
averaging parental values, or
selecting one of the parental values

From two or more parents, by either:

using two selected parents to make a child, or
selecting two parents anew for each position

Names of recombinations

                          Two fixed parents     Two parents selected for each i
zi = (xi + yi)/2          Local intermediary    Global intermediary
zi is xi or yi,
chosen randomly           Local discrete        Global discrete
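The four named variants can be sketched as follows (the function names are ours, not standard API):

```python
import random

def local_intermediary(x, y):
    """Two fixed parents, averaging per position."""
    return [(xi + yi) / 2.0 for xi, yi in zip(x, y)]

def local_discrete(x, y):
    """Two fixed parents, one parental value chosen at random per position."""
    return [random.choice((xi, yi)) for xi, yi in zip(x, y)]

def global_intermediary(pop, n):
    """Two parents re-selected for every position, then averaged."""
    return [(random.choice(pop)[i] + random.choice(pop)[i]) / 2.0
            for i in range(n)]

def global_discrete(pop, n):
    """Two parents re-selected for every position, one value chosen."""
    return [random.choice((random.choice(pop)[i], random.choice(pop)[i]))
            for i in range(n)]
```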

Parent selection

Parents are selected by a uniform random distribution whenever an operator needs one or more
Thus: ES parent selection is unbiased; every individual has the same probability of being selected
Note that in ES "parent" means a population member (in GAs it means a population member selected to undergo variation)

Survivor selection

Applied after creating λ children from the μ parents by mutation and recombination
Deterministically chops off the bad stuff
Two major variants, distinguished by the basis of selection:
(μ,λ)-selection: based on the set of children only
(μ+λ)-selection: based on the set of parents and children
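The two variants differ by a single line; a minimal sketch, assuming minimisation and a fitness function f that maps an individual to a score:

```python
def comma_selection(parents, children, mu, f):
    """(mu,lambda)-selection: keep the mu best children; parents die."""
    return sorted(children, key=f)[:mu]

def plus_selection(parents, children, mu, f):
    """(mu+lambda)-selection: keep the mu best of parents and children."""
    return sorted(parents + children, key=f)[:mu]
```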

Survivor selection cont'd
(μ+λ)-selection is an elitist strategy
(μ,λ)-selection can "forget"
Often (μ,λ)-selection is preferred because it is:
better at leaving local optima
better at following moving optima
Using the + strategy, bad σ values can survive in ⟨x, σ⟩ too long if their host x is very fit
Selective pressure in ES is high compared with GAs; λ ≈ 7 · μ is a traditionally good setting (decreasing over the last couple of years, λ ≈ 3 · μ seems more popular lately)

Self-adaptation illustrated

Given a dynamically changing fitness landscape


(optimum location shifted every 200 generations)
Self-adaptive ES is able to
follow the optimum and
adjust the mutation step size after every shift !

Self-adaptation illustrated contd

Prerequisites for self-adaptation

μ > 1 to carry different strategies

λ > μ to generate an offspring surplus
Not too strong selection, e.g., λ ≈ 7 · μ
(μ,λ)-selection to get rid of misadapted σs
Mixing strategy parameters by (intermediary) recombination on them
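Putting these prerequisites together, a complete self-adaptive (μ,λ)-ES with one step size per individual might look like the sketch below. μ = 5 and λ = 35 keep λ/μ = 7; the sphere objective, the domain, and all other parameter values are illustrative assumptions:

```python
import math
import random

def self_adaptive_es(f, n, mu=5, lam=35, gens=100, lo=-5.0, hi=5.0):
    """Self-adaptive (mu,lambda)-ES: mu > 1, lambda > mu, comma selection,
    intermediary recombination mixing both x and sigma."""
    tau = 1.0 / math.sqrt(n)
    # An individual is a pair (x, sigma).
    pop = [([random.uniform(lo, hi) for _ in range(n)], 1.0)
           for _ in range(mu)]
    for _ in range(gens):
        children = []
        for _ in range(lam):
            # Uniform random parent selection, intermediary recombination
            # of both the object variables and the strategy parameter.
            (x1, s1), (x2, s2) = random.choice(pop), random.choice(pop)
            sigma = (s1 + s2) / 2.0
            x = [(a + b) / 2.0 for a, b in zip(x1, x2)]
            # Mutate sigma first, then x with the new sigma.
            sigma = max(sigma * math.exp(tau * random.gauss(0.0, 1.0)), 1e-8)
            x = [xi + sigma * random.gauss(0.0, 1.0) for xi in x]
            children.append((x, sigma))
        # (mu,lambda) survivor selection: the mu best children only.
        pop = sorted(children, key=lambda ind: f(ind[0]))[:mu]
    best_x, _ = min(pop, key=lambda ind: f(ind[0]))
    return best_x, f(best_x)

sphere = lambda v: sum(vi * vi for vi in v)
best, best_f = self_adaptive_es(sphere, 3)
```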

Example application:
the cherry brandy experiment
Task: to create a colour mix yielding a target colour (that of a well-known cherry brandy)
Ingredients: water + red, yellow, blue dye
Representation: ⟨w, r, y, b⟩ (no self-adaptation!)
Values scaled to give a predefined total volume (30 ml)
Mutation: lo / med / hi σ values used with equal chance
Selection: (1,8) strategy

Example application:
cherry brandy experiment cont'd
Fitness: students actually making the mix and comparing it with the target colour
Termination criterion: student satisfied with the mixed colour
Solution is mostly found within 20 generations
Accuracy is very good

Example application:
the Ackley function (Bäck et al. '93)
The Ackley function (here used with n = 30):

f(x) = −20 · exp(−0.2 · √((1/n) · Σi xi²)) − exp((1/n) · Σi cos(2πxi)) + 20 + e

Evolution strategy:
Representation:
−30 < xi < 30 (coincidence of 30's!)
30 step sizes
(30,200) selection
Termination: after 200,000 fitness evaluations
Results: average best solution is 7.48 · 10⁻⁸ (very good)
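The formula can be checked numerically; a direct Python transcription (our own), with global minimum f(0, …, 0) = 0:

```python
import math

def ackley(x):
    """Ackley function; global minimum f(0, ..., 0) = 0."""
    n = len(x)
    mean_sq = sum(xi * xi for xi in x) / n
    mean_cos = sum(math.cos(2.0 * math.pi * xi) for xi in x) / n
    return (-20.0 * math.exp(-0.2 * math.sqrt(mean_sq))
            - math.exp(mean_cos) + 20.0 + math.e)
```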

