
Evolutionary Computational
Intelligence
Lecture 1:
Basic Concepts
Ferrante Neri
University of Jyväskylä
25/05/2016 05:48:53
Introductory Example: Radio Tuning

– Position (e.g. angular) of the radio knob: candidate solution
– We want a clear signal, that is, to:
  – maximize the signal power
  – minimize the background noise power
– Equivalently: minimize f = (noise power) / (signal power)
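The knob-tuning problem can be sketched in code. The signal/noise model below is purely hypothetical (the peak position 1.3 rad and all constants are invented for illustration); it only shows how a knob angle plays the role of a candidate solution evaluated by f = noise power / signal power.

```python
import math

# Hypothetical model (not from the lecture): signal power peaks near the
# station's knob angle, noise is a slowly varying background floor.
def signal_power(angle):
    return math.exp(-((angle - 1.3) ** 2) / 0.01)   # strong near angle 1.3 rad

def noise_power(angle):
    return 0.05 + 0.01 * math.cos(10 * angle)       # background noise floor

def fitness(angle):
    # f = noise power / signal power; a clear signal means a small f
    return noise_power(angle) / signal_power(angle)

# Naive grid search over the knob's range (the decision space)
angles = [i * 0.01 for i in range(315)]             # 0 .. 3.14 rad
best = min(angles, key=fitness)
```

A grid search is of course only viable here because the decision space is one-dimensional; the rest of the lecture is about what to do when it is not.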
Optimization Problem

– Candidate solution
– Decision (or design) variables
– Variable bounds, which define the decision space
– Objective function or fitness function (the behavior taken by the fitness over the decision space is called the fitness landscape)
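These ingredients can be sketched as follows; the two-variable bounds and the sphere fitness function are illustrative assumptions, not from the lecture.

```python
# A minimal sketch of the ingredients of an optimization problem.

bounds = [(-5.0, 5.0), (-5.0, 5.0)]   # variable bounds -> decision space

def fitness(x):
    # objective (fitness) function: here, the sphere function sum(x_i^2)
    return sum(xi ** 2 for xi in x)

candidate = [1.0, -2.0]               # a candidate solution (decision variables)

# a candidate solution must lie inside the decision space
assert all(lo <= xi <= hi for xi, (lo, hi) in zip(candidate, bounds))
```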
Real-World Optimization Problems

Optimization problems are often rather easy to formulate but very hard to solve when the problem comes from an application. In fact, some features characterizing the problem can make it extremely challenging. These features are summarized in the following slides:
Highly nonlinear fitness function

Optimization problems are usually characterized by nonlinear functions. In real-world optimization problems the physical phenomenon, due to its nature (e.g. saturation phenomena, or systems that employ electronic components), cannot be approximated by a linear function, not even in limited areas of the decision space.
Highly multimodal fitness landscape

– It often happens that the fitness landscape contains many local optima, and that many of these have an unsatisfactory performance (fitness value)
– Such fitness landscapes are usually rather difficult to handle, since optimization algorithms that employ gradient-based information to detect the search direction can easily converge to a suboptimal basin of attraction
– Basin of attraction: the set of points of the decision space that, taken as initial conditions, dynamically evolve to a particular attractor
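A minimal sketch of a greedy, gradient-like descent getting trapped in a suboptimal basin of attraction. The multimodal test function (a 1-D Rastrigin-like function) and all constants are assumptions for illustration.

```python
import math

# Hypothetical multimodal function: many local minima, global minimum at x = 0.
def f(x):
    return x * x + 10 * (1 - math.cos(2 * math.pi * x))

def hill_climb(x, step=0.01, iters=10000):
    # Greedy local descent: move to a neighbour only if it improves f.
    for _ in range(iters):
        left, right = x - step, x + step
        best = min((x, left, right), key=f)
        if best == x:
            break                          # local optimum reached
        x = best
    return x

# Started far from the global optimum, the climber gets trapped in the
# suboptimal basin of attraction near x = 3 instead of reaching x = 0.
trapped = hill_climb(3.2)
```

Started inside the right basin (e.g. from x = 0.3) the same climber does reach the global optimum, which is exactly the initial-condition dependence the basin-of-attraction definition captures.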
Optimization in Noisy Environments

Uncertainties in optimization can be categorized into three classes:
– Noisy fitness function. Noise in fitness evaluations may come from many different sources, such as sensory measurement errors or randomized simulations.
– Approximated fitness function. When the fitness function is very expensive to evaluate, or an analytical fitness function is not available, approximated fitness functions are often used instead. These approximated models implicitly introduce a noise, namely the difference between the approximated value and the real fitness value, which is unknown.
– Robustness. Often, when a solution is implemented, the design variables or the environmental parameters are subject to perturbations or changes (e.g. control problems).
Computationally expensive problems

Optimization problems can be computationally expensive for two reasons:
– a high-cardinality decision space (usually combinatorial)
– a computationally expensive fitness function (e.g. the design of on-line electric drives)
Real-World Problems and Classical Methods

When such features are present in an optimization problem, the application of exact methods is usually unfeasible, since their hypotheses are not satisfied. Moreover, the application of classical deterministic algorithms is also questionable, since their use can easily lead to suboptimal solutions (e.g. a hill climber on highly multimodal functions) or return completely unreliable results (e.g. a deterministic optimizer in noisy environments).
Rosenbrock Algorithm (1960)

– From a chemical application
– Well-defined decision space
– No analytical expression and no derivatives
– A modification of a steepest descent method
Rosenbrock Algorithm

– The search is executed along each direction variable (orthogonal search)
– The search continues by enlarging the step size along successful directions and reducing it along unsuccessful ones
– The search with the current set of directions is stopped when the trial was successful in all the directions
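The step-size adaptation described above can be sketched as follows. This is a simplified single stage (the direction rotation of the next slide is omitted), assuming the commonly cited expansion factor 3 and contraction factor −0.5, and an illustrative sphere fitness.

```python
# Minimal sketch of one Rosenbrock stage (no direction rotation).

def sphere(x):
    return sum(xi * xi for xi in x)

def rosenbrock_stage(f, x, steps=None, sweeps=50):
    n = len(x)
    # start from the coordinate axes as search directions
    dirs = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    steps = steps or [0.1] * n
    fx = f(x)
    for _ in range(sweeps):
        for i in range(n):                    # orthogonal search: one direction at a time
            trial = [xj + steps[i] * dj for xj, dj in zip(x, dirs[i])]
            ft = f(trial)
            if ft <= fx:                      # success: keep the move, enlarge the step
                x, fx = trial, ft
                steps[i] *= 3.0
            else:                             # failure: reverse and shrink the step
                steps[i] *= -0.5
    return x, fx

best, fbest = rosenbrock_stage(sphere, [2.0, -1.5])
```

Reversing the sign on failure is what lets the method probe both senses of each direction without a separate "−" move.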
Rosenbrock Algorithm

Under these conditions, a new set of directions is determined by means of the Gram-Schmidt procedure, and the search starts over:

  B_1 = A_1                                    ξ_1 = B_1 / ||B_1||
  B_2 = A_2 − (A_2^T ξ_1) ξ_1                  ξ_2 = B_2 / ||B_2||
  ...
  B_n = A_n − Σ_{j=1..n−1} (A_n^T ξ_j) ξ_j     ξ_n = B_n / ||B_n||

where A_1 is the sum of the advances in all directions, A_2 is the sum of the advances in all directions other than the first, and so on.
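The Gram-Schmidt update above can be sketched as follows, assuming advances lam[i] along the current orthonormal directions dirs[i] (the names lam, dirs and rotate_directions are illustrative, not from the lecture).

```python
# Sketch of the Gram-Schmidt direction rotation of the Rosenbrock method.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return dot(u, u) ** 0.5

def rotate_directions(dirs, lam):
    n = len(dirs)
    # A_i = sum of the advances along direction i and all following ones
    A = [[sum(lam[j] * dirs[j][k] for j in range(i, n)) for k in range(n)]
         for i in range(n)]
    new_dirs = []
    for i in range(n):
        B = A[i][:]
        for xi in new_dirs:                    # subtract projections on earlier directions
            c = dot(A[i], xi)
            B = [b - c * x for b, x in zip(B, xi)]
        nb = norm(B)
        new_dirs.append([b / nb for b in B])   # normalize: xi_i = B_i / ||B_i||
    return new_dirs

new = rotate_directions([[1.0, 0.0], [0.0, 1.0]], [2.0, 1.0])
# the first new direction is aligned with the overall advance (2, 1)
```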
Rosenbrock Algorithm

[Figure slide: illustration of the algorithm]
Hooke Jeeves Algorithm (1961)

– An exploratory radius h and an initial candidate solution x
– An n × n exploratory direction matrix U (e.g. diag(w(1), w(2), ..., w(i), ..., w(n)), where w(i) is the width of the range of variability of the i-th variable)
– U(i,:) is the i-th row of the matrix
Hooke Jeeves Algorithm

Exploratory Move: samples the solutions x + hU(i,:) ("+" move) with i = 1, 2, ..., n, and the solutions x − hU(i,:) ("−" move) with i = 1, 2, ..., n only along those directions which turned out unsuccessful during the "+" move.

– Directions are analyzed separately!
Hooke Jeeves Algorithm

– Pattern Move: the pattern move is an aggressive attempt of the algorithm to exploit promising search directions.
– Rather than centering the following exploration at the most promising explored candidate solution, the HJA tries to move further: the algorithm makes a double step and centers the subsequent exploratory move there. If this second exploratory move fails, the algorithm steps back and performs the exploratory move again.
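The exploratory and pattern moves can be sketched as follows. The sphere fitness, the identity exploratory matrix, and the halving of h on failure are illustrative assumptions (halving is a common variant, not necessarily the lecture's choice).

```python
# Minimal sketch of the Hooke-Jeeves exploratory + pattern moves.

def sphere(x):
    return sum(xi * xi for xi in x)

def explore(f, x, h):
    # Exploratory move: try +h, then -h only if "+" failed, per coordinate.
    x = x[:]
    for i in range(len(x)):
        for step in (h, -h):
            trial = x[:]
            trial[i] += step
            if f(trial) < f(x):
                x = trial
                break                      # "-" move skipped after a "+" success
    return x

def hooke_jeeves(f, x, h=0.5, tol=1e-6):
    base = x[:]
    while h > tol:
        new = explore(f, base, h)
        if f(new) < f(base):
            # Pattern move: double step from base through new, then re-explore.
            pattern = [2 * n - b for n, b in zip(new, base)]
            cand = explore(f, pattern, h)
            base = cand if f(cand) < f(new) else new   # step back on failure
        else:
            h *= 0.5                       # shrink the exploratory radius
    return base

best = hooke_jeeves(sphere, [2.3, -1.7])
```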
Hooke Jeeves Algorithm

[Figure slide: illustration of the algorithm]
Nelder Mead Algorithm (1965)

– Works on a set of n + 1 solutions in order to perform the local search
– It employs an exploratory logic based on the dynamic construction of a polyhedron (simplex)
– Given n + 1 solutions x0, x1, ..., xn sorted according to their fitness values (i.e. x0 is the best), the NMA attempts to improve xn
Nelder Mead Algorithm

– The centroid xm of the n best points is calculated: xm = (1/n) Σ_{i=0..n−1} xi
– 1st step, reflection: xr = xm + α (xm − xn), with α = 1 in the standard setting
– If the reflected point outperforms x0, replacement occurs
Nelder Mead Algorithm

– 2nd step, expansion: if the reflection was successful (i.e. the reflected point is better than x0), the algorithm calculates xe = xm + γ (xr − xm), with γ = 2 in the standard setting
– If the expansion is also successful, a new replacement occurs
Nelder Mead Algorithm

– If xr did not improve upon x0:
  – if f(xr) < f(xn−1), then xr replaces xn
– If this trial is also unsuccessful:
  – if f(xr) < f(xn), xr replaces xn and the 3rd step, outside contraction, is performed: xc = xm + β (xr − xm), with β = 1/2 in the standard setting
Nelder Mead Algorithm

– If xr does not outperform xn either, the 4th step, inside contraction, is performed: xc = xm − β (xm − xn), with β = 1/2 in the standard setting
– If the contraction was successful, then xc replaces xn
Nelder Mead Algorithm

– If there is no way to improve upon x0, the 5th step, shrinking, is performed: n new points are sampled (typically each xi is moved halfway towards x0) and the process starts over
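The five steps can be combined into a compact minimization sketch with the standard coefficients (reflection 1, expansion 2, contraction 1/2, shrink 1/2); the sphere fitness, the initial simplex, and the fixed iteration budget are illustrative assumptions.

```python
# Compact sketch of the Nelder-Mead steps on a 2-D sphere function.

def sphere(x):
    return sum(xi * xi for xi in x)

def nelder_mead(f, simplex, iters=200):
    n = len(simplex) - 1
    for _ in range(iters):
        simplex.sort(key=f)                                  # x0 best ... xn worst
        xm = [sum(p[k] for p in simplex[:-1]) / n
              for k in range(n)]                             # centroid of the n best
        worst = simplex[-1]
        xr = [m + (m - w) for m, w in zip(xm, worst)]        # reflection
        if f(xr) < f(simplex[0]):
            xe = [m + 2 * (r - m) for m, r in zip(xm, xr)]   # expansion
            simplex[-1] = xe if f(xe) < f(xr) else xr
        elif f(xr) < f(simplex[-2]):
            simplex[-1] = xr                                 # accept the reflection
        else:
            if f(xr) < f(worst):                             # outside contraction
                xc = [m + 0.5 * (r - m) for m, r in zip(xm, xr)]
            else:                                            # inside contraction
                xc = [m - 0.5 * (m - w) for m, w in zip(xm, worst)]
            if f(xc) < min(f(xr), f(worst)):
                simplex[-1] = xc
            else:                                            # shrink towards x0
                x0 = simplex[0]
                simplex = [x0] + [[(a + b) / 2 for a, b in zip(p, x0)]
                                  for p in simplex[1:]]
    simplex.sort(key=f)
    return simplex[0]

best = nelder_mead(sphere, [[2.0, 1.0], [2.5, 1.0], [2.0, 1.5]])
```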
Comparative Analysis

– The three algorithms do not require derivatives and do not require an explicit analytical expression
– Rosenbrock and Hooke Jeeves are fully deterministic, while Nelder Mead has some randomness
– Rosenbrock and Nelder Mead move in the space along all the directions simultaneously (e.g. diagonally in 2D), while Hooke Jeeves moves along one direction at a time
Fundamental Points in Comparative Analysis

– Rosenbrock and Hooke Jeeves have a mathematically proven convergence, while Nelder Mead doesn't!
– Rosenbrock and Hooke Jeeves have "local properties", while Nelder Mead has "global properties"
Two-Phase Nozzle Design (Experimental)

Experimental design optimisation: optimise efficiency.

Initial design ... evolves ... Final design: 32% improvement in efficiency.