CS 2710, ISSP 2610: R&N Chapter 4.1 Local Search and Optimization
Search Types
Local Search and Optimization
• Previous searches: keep paths in memory and remember
alternatives so search can backtrack. The solution is a path to a
goal.
• The path may be irrelevant if only the final configuration is
needed (8-queens, IC design, network optimization, …)
Local Search
Optimization
Examples
Comparison to tree/graphsearch framework
Visualization
Local Search Algorithms
Hill-Climbing
(Greedy Local Search)
• Generate the nearby successor states of the current state
• Pick the best one and replace the current state with it
• Loop (see the sketch below)
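
A minimal Python sketch of the loop above, assuming hypothetical
neighbors(state) and value(state) helpers (neither is defined in the slides):

def hill_climb(state, neighbors, value):
    # Steepest-ascent hill-climbing: move to the best neighbor,
    # stopping at a (possibly local) maximum. Assumes maximization.
    while True:
        best = max(neighbors(state), key=value, default=state)
        if value(best) <= value(state):
            return state  # no neighbor improves on the current state
        state = best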
Hill-climbing search problems
(this slide assumes maximization rather than minimization)
Simulated Annealing
• More formally…
– Generate a random neighbor of the current state.
– If it's better, take it.
– If it's worse, take it with a probability that depends on the
temperature T and on the difference ΔE between the new and current
values: e^(ΔE/T), as in the pseudocode below.
function Simulated-Annealing(start, schedule)
  current ← start
  for t ← 1 to ∞ do
    T ← schedule[t]
    if T = 0 then return current
    next ← a randomly selected successor of current
    ΔE ← Value[next] − Value[current]
    if ΔE > 0 then current ← next
    else current ← next only with probability e^(ΔE/T)
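
A runnable Python rendering of the pseudocode above; the successors(state)
and value(state) helpers and any concrete cooling schedule are assumptions
for illustration:

import math
import random

def simulated_annealing(start, schedule, successors, value):
    # Accept improving moves always; accept worsening moves with
    # probability e^(ΔE/T), which shrinks as the temperature T falls.
    current = start
    t = 1
    while True:
        T = schedule(t)
        if T <= 0:  # <= rather than == guards against float round-off
            return current
        nxt = random.choice(successors(current))
        delta_e = value(nxt) - value(current)
        if delta_e > 0 or random.random() < math.exp(delta_e / T):
            current = nxt
        t += 1

For example, schedule = lambda t: max(0, 1.0 - 0.001 * t) is a simple
linear cooling schedule that ends the search around t = 1000.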
Intuitions
• Hill-climbing is incomplete
• Pure random walk, keeping track of the best state found so
far, is complete but very inefficient
• Combine the ideas: add some randomness to hill-climbing to
allow the possibility of escape from a local optimum
Theoretical Completeness
Local Beam Search
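
The chapter's local beam search keeps k states in parallel and, at each
step, retains the k best of all their successors. A minimal sketch under
the same hypothetical neighbors/value helpers used above:

import heapq

def local_beam_search(starts, neighbors, value, steps=100):
    # k = len(starts); pool all successors of all current states and
    # keep the k best, so information flows between the parallel runs.
    k = len(starts)
    beam = list(starts)
    for _ in range(steps):
        pool = [s for state in beam for s in neighbors(state)]
        if not pool:
            break
        beam = heapq.nlargest(k, pool, key=value)
    return max(beam, key=value)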
Genetic Algorithms
function GA(pop, fitness-fn)
  repeat
    new-pop = {}
    for i from 1 to size(pop):
      x = rand-sel(pop, fitness-fn)
      y = rand-sel(pop, fitness-fn)
      child = reproduce(x, y)
      if (small random probability): child = mutate(child)
      add child to new-pop
    pop = new-pop
  until an individual is fit enough, or out of time
  return best individual in pop, according to fitness-fn
function reproduce(x, y)
  n = len(x)
  c = random number from 1 to n
  return append(substr(x, 1, c), substr(y, c+1, n))
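
A runnable Python version of GA and reproduce for string individuals; the
mutate helper, the mutation rate, and the use of fitness-proportional
selection via random.choices are illustrative assumptions:

import random

def reproduce(x, y):
    # Single-point crossover: prefix of x joined to the suffix of y.
    c = random.randint(1, len(x))
    return x[:c] + y[c:]

def mutate(child):
    # Flip one randomly chosen gene (assumes a binary string).
    i = random.randrange(len(child))
    return child[:i] + ('1' if child[i] == '0' else '0') + child[i + 1:]

def ga(pop, fitness_fn, mutation_prob=0.05, generations=1000):
    for _ in range(generations):
        weights = [fitness_fn(ind) for ind in pop]
        new_pop = []
        for _ in range(len(pop)):
            # Selection with replacement, probability proportional to fitness.
            x, y = random.choices(pop, weights=weights, k=2)
            child = reproduce(x, y)
            if random.random() < mutation_prob:
                child = mutate(child)
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness_fn)

For example, ga(['0110', '1010', '0001', '1100'], lambda s: s.count('1'))
drives the population toward the all-ones string.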
Example: n-queens
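
R&N present this example with an individual encoded as a string of n
digits, the i-th digit giving the row of the queen in column i, and with
fitness counting nonattacking pairs (28 is the maximum for 8 queens). A
sketch of that fitness function, assuming the digit-string encoding:

def queens_fitness(ind):
    # ind[i] is the row (as a digit) of the queen in column i.
    # Count pairs of queens that do not attack each other.
    n = len(ind)
    nonattacking = 0
    for i in range(n):
        for j in range(i + 1, n):
            ri, rj = int(ind[i]), int(ind[j])
            same_row = ri == rj
            same_diag = abs(ri - rj) == j - i
            if not (same_row or same_diag):
                nonattacking += 1
    return nonattacking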
Genetic Algorithms
Notes
• Representation of individuals
– Classic approach: individual is a string over a finite
alphabet with each element in the string called a gene
– Usually binary instead of AGTC as in real DNA
• Selection strategy
– Random
– Selection probability proportional to fitness
– Selection is done with replacement, so that a very fit
individual can reproduce several times
• Reproduction
– Random pairing of selected individuals
– Random selection of cross-over points
– Each gene can be altered by a random mutation
Genetic Algorithms
When to use them?
• Genetic algorithms are easy to apply
• Results can be good on some problems but poor on others
• Genetic algorithms are not well understood
Example Local Search Problem Formulation
Wrap-Up