Greedy Method


The Greedy Method

Suppose that a problem can be solved by a
sequence of decisions. The greedy method
makes each decision in a way that is locally
optimal, in the hope that these locally optimal
choices will finally add up to a globally
optimal solution.
The Knapsack Problem
The classic Knapsack problem is:
A thief breaks into a store and wants to fill his knapsack
of capacity K with goods of as much value as possible.

Decision version: Given a target value V, does
there exist a collection of items that fits into his
knapsack and whose total value is at least V?
The Knapsack Problem
 Input
 Capacity K
 n items with weights wi and values vi
 Output: a set of items S such that
 the sum of weights of items in S is at most K
 and the sum of values of items in S is
maximized
Some Simplest Versions…
Fractional Knapsack Problem: Can items be picked up
partially?
The thief’s knapsack can hold 100 gms and has to choose
from:
30 gms of gold dust at Rs 1000 /gm
60 gms of silver dust at Rs 500/gm
30 gms of platinum dust at Rs 1500/gm

Note: An optimal solution fills the knapsack to full
capacity (assuming the items' total weight is at least K).

Proof: Otherwise, the remaining capacity could be
filled with some leftover item, picked partially if
need be, which would increase the total value.
Greedy Algorithm for fractional
Knapsack

1. Sort the items in decreasing order of
value/weight ratio (cost effectiveness), and pick
them up in this order.
2. If the next item cannot fit into the knapsack,
break it and pick it partially, just enough to fill the
knapsack.
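As a concrete illustration, here is a minimal Python sketch of this greedy rule, run on the thief's example above (the function name fractional_knapsack is my own, not from the slides):

# Greedy fractional knapsack: take items in decreasing value/weight order,
# breaking the last item to fill the remaining capacity exactly.
def fractional_knapsack(capacity, items):
    # items: list of (weight, value) pairs
    items = sorted(items, key=lambda wv: wv[1] / wv[0], reverse=True)
    total_value = 0.0
    for weight, value in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)           # whole item, or a fraction of it
        total_value += value * (take / weight)
        capacity -= take
    return total_value

# 100 gm knapsack; (weight in gm, total value in Rs) of gold, silver, platinum dust
items = [(30, 30 * 1000), (60, 60 * 500), (30, 30 * 1500)]
print(fractional_knapsack(100, items))         # 95000.0

Note how the knapsack is filled to full capacity: all the platinum, all the gold, and 40 of the 60 gms of silver.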
The knapsack problem
 n objects, each with a weight wi > 0
a profit pi > 0
capacity of knapsack: M

Maximize $\sum_{1 \le i \le n} p_i x_i$

subject to $\sum_{1 \le i \le n} w_i x_i \le M$

and $0 \le x_i \le 1$, $1 \le i \le n$.
The knapsack algorithm
• The greedy algorithm:
Step 1: Sort pi/wi into nonincreasing order.
Step 2: Put the objects into the knapsack in that
order, taking as much of each as fits.
• e.g.
n = 3, M = 20, (p1, p2, p3) = (25, 24, 15)
(w1, w2, w3) = (18, 15, 10)
Sol: p1/w1 = 25/18 ≈ 1.39
p2/w2 = 24/15 = 1.6
p3/w3 = 15/10 = 1.5
Optimal solution: x1 = 0, x2 = 1, x3 = 1/2
total profit = 24 + 7.5 = 31.5
Fractional knapsack
• Much easier than the 0/1 version
• For item Ij, let rj = bj/wj. This gives the
benefit per unit of weight.
• Sort the items in descending order of rj
• Pack the knapsack by taking as much of
each item as you can, walking down
the sorted list.
Minimum Spanning Trees
Problem: Laying Telephone
Wire

Central office
Wiring: Naive Approach

Central office

Expensive!
Wiring: Better Approach

Central office

Minimize the total length of wire connecting the customers


Minimum-cost spanning trees
 Suppose you have a connected undirected graph with a
weight (or cost) associated with each edge
 The cost of a spanning tree would be the sum of the costs
of its edges
 A minimum-cost spanning tree is a spanning tree that has
the lowest cost
[Figure: a connected, undirected graph on vertices A–F with
edge weights 5, 6, 10, 11, 14, 16, 18, 19, 21, 33 (left), and a
minimum-cost spanning tree of the same graph (right)]
Minimum Spanning Tree
(MST)
A minimum spanning tree is a subgraph of an
undirected weighted graph G, such that
• it is a tree (i.e., it is acyclic)
• it covers all the vertices V
• it contains |V| - 1 edges
• the total cost associated with tree edges is the
minimum among all possible spanning trees
• it is not necessarily unique
How Can We Generate a MST?

[Figure: a weighted graph on vertices a–e, drawn twice,
with edge weights 2, 4, 5, 5, 5, 6, 9; used to trace Prim's
algorithm below]
Prim(-Jarnik)’s Algorithm
 Similar to Dijkstra’s algorithm (for a connected graph)
 We pick an arbitrary vertex s and we grow the MST as a cloud of
vertices, starting from s
 We store with each vertex v a label d(v) = the smallest weight of an
edge connecting v to a vertex in the cloud

At each step:
 We add to the cloud the
vertex u outside the cloud
with the smallest distance
label
 We update the labels of the
vertices adjacent to u
Prim’s algorithm
T = a spanning tree containing a single node s;
E = set of edges adjacent to s;
while T does not contain all the nodes {
    remove an edge (v, w) of lowest cost from E
    if w is already in T then discard edge (v, w)
    else {
        add edge (v, w) and node w to T
        add to E the edges adjacent to w
    }
}
 An edge of lowest cost can be found with a priority queue
 Testing for a cycle is automatic
 Hence, Prim’s algorithm is far simpler to implement than Kruskal’s
algorithm (presented below)
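Here is a minimal Python sketch of this priority-queue formulation (the function name prim_mst and the adjacency-list representation are my own choices, not fixed by the slides):

import heapq

# Prim's algorithm: grow the tree from s, always adding the cheapest edge
# that leads out of the tree. graph maps each node to (neighbor, weight) pairs.
def prim_mst(graph, s):
    in_tree = {s}
    tree_edges = []
    heap = [(cost, s, w) for w, cost in graph[s]]   # candidate edges leaving the tree
    heapq.heapify(heap)
    while heap and len(in_tree) < len(graph):
        cost, v, w = heapq.heappop(heap)            # lowest-cost candidate edge
        if w in in_tree:
            continue                                # would form a cycle: discard
        in_tree.add(w)
        tree_edges.append((v, w, cost))
        for x, c in graph[w]:
            if x not in in_tree:
                heapq.heappush(heap, (c, w, x))
    return tree_edges

# A small undirected example:
g = {'a': [('b', 2), ('c', 3)],
     'b': [('a', 2), ('c', 1)],
     'c': [('a', 3), ('b', 1)]}
print(prim_mst(g, 'a'))    # [('a', 'b', 2), ('b', 'c', 1)]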
Example

[Figure: Prim's algorithm traced step by step on a weighted
graph on vertices A–F, starting from A with label 0; at each
step the vertex outside the cloud with the smallest distance
label joins the cloud, and the labels of its neighbors are
updated]
Prim’s algorithm
[Figure: the trace starts at vertex e with d(e) = 0 and all
other labels ∞; after e joins the tree, its neighbors get
d(d) = 4, d(b) = 5, d(c) = 5, each with parent e]

The MST initially consists of the vertex e, and we update
the distances and parents of its adjacent vertices.
Prim’s algorithm (trace continued)

[Figure: the remaining steps of the trace; at each step the
vertex with the smallest label joins the tree, and the labels
and parents of its neighbors are updated. The final minimum
spanning tree is given by parent(b) = e, parent(c) = d,
parent(d) = e, parent(a) = d]
Examples for Prim’s Algorithm
[Figure: Prim's algorithm traced on a 7-vertex graph
(vertices 0–6) with edge costs 10, 12, 14, 16, 18, 22, 24, 25,
28; starting from vertex 0, the tree grows by adding edges of
cost 10, 25, 22, 12, 16 and 14]
Kruskal’s Idea
 Build a minimum cost spanning tree T by
adding edges to T one at a time
 Select the edges for inclusion in T in
nondecreasing order of the cost
 An edge is added to T if it does not form
a cycle
 Since G is connected and has n > 0
vertices, exactly n-1 edges will be
selected
The edges of the same 7-vertex graph, in nondecreasing
order of cost:

(0,5): 10
(2,3): 12
(1,6): 14
(1,2): 16
(3,6): 18
(3,4): 22
(4,6): 24
(4,5): 25
(0,1): 28

[Figure: Kruskal's algorithm traced on this graph. The edges
of cost 10, 12, 14 and 16 are accepted; (3,6) of cost 18 is
rejected because it would form a cycle; (3,4) of cost 22 is
accepted; (4,6) of cost 24 is rejected (cycle); and (4,5) of
cost 25 is accepted, completing the MST of total cost
10 + 12 + 14 + 16 + 22 + 25 = 99]
Kruskal’s Algorithm
T = {};
while (T contains fewer than n-1 edges
       && E is not empty) {
    choose a least-cost edge (v,w) from E;    /* min heap construction, then    */
    delete (v,w) from E;                      /* choose and delete: O(log e)    */
    if ((v,w) does not create a cycle in T)   /* find & union: O(log e)         */
        add (v,w) to T;
    else
        discard (v,w);
}
if (T contains fewer than n-1 edges)
    printf("No spanning tree\n");

For example, with components {0,5}, {1,2,3,6}, {4}: edge (3,6) is rejected
because both endpoints lie in {1,2,3,6}, while edge (3,4) is accepted and
merges {1,2,3,6} with {4}.

Total time: O(e log e)
Kruskal’s algorithm
T = empty spanning tree;
E = set of edges;
N = number of nodes in graph;
while T has fewer than N - 1 edges {
    remove an edge (v, w) of lowest cost from E
    if adding (v, w) to T would create a cycle
    then discard (v, w)
    else add (v, w) to T
}
• Finding an edge of lowest cost can be done just by
sorting the edges
• Testing for a cycle: efficient testing for a cycle
requires an additional algorithm (UNION-FIND), which
we don't cover in this course. The main idea: if both
nodes v, w are in the same component of T, then
adding (v, w) to T would result in a cycle.
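Although UNION-FIND is not covered in the course, a few lines suffice for a working Python sketch of Kruskal's algorithm (all names below are my own; find uses path compression):

# Kruskal's algorithm: scan edges in nondecreasing cost order, accepting an
# edge only if its endpoints lie in different components (tested by union-find).
def kruskal_mst(n, edges):
    # edges: list of (cost, v, w) triples, vertices numbered 0..n-1
    parent = list(range(n))

    def find(x):                        # root of x's component, with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for cost, v, w in sorted(edges):
        rv, rw = find(v), find(w)
        if rv == rw:
            continue                    # same component: edge would create a cycle
        parent[rv] = rw                 # union the two components
        tree.append((v, w, cost))
        if len(tree) == n - 1:
            break
    return tree                         # fewer than n-1 edges means G is disconnected

# The 7-vertex example traced above:
edges = [(10, 0, 5), (12, 2, 3), (14, 1, 6), (16, 1, 2), (18, 3, 6),
         (22, 3, 4), (24, 4, 6), (25, 4, 5), (28, 0, 1)]
tree = kruskal_mst(7, edges)
print(tree, sum(c for _, _, c in tree))   # total cost 99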
Kruskal
Example

[Figure: Kruskal's algorithm traced slide by slide on a graph
of US airports (BOS, PVD, ORD, JFK, SFO, BWI, DFW, LAX, MIA)
with flight distances (144, 184, 187, ..., 2704) as edge
weights; each slide adds the cheapest remaining edge that
does not create a cycle, until the minimum spanning tree is
complete]
Time Complexity
• Let v be the number of vertices and e the
number of edges of a given graph.
• Kruskal's algorithm: O(e log e)
• Prim's algorithm: O(e log v)
Single Source Shortest Paths
Problem
Shortest Path Problems
• Directed weighted graph.
• Path length is sum of weights of edges on
path.
• The vertex at which the path begins is the
source vertex.
• The vertex at which the path ends is the
destination vertex.
Example

[Figure: a directed weighted graph on vertices 1–7, showing
a path from 1 to 7 of length 14, and another path from 1 to 7
of length 11]
Shortest Path Problems
• Single source single destination.
• Single source all destinations.
• All pairs (every vertex is a
source and destination).
Single Source Single
Destination

Possible greedy algorithm:
• Leave the source vertex using the cheapest/shortest
edge.
• Leave each new vertex using the cheapest edge, subject
to the constraint that a new vertex is reached.
• Continue until the destination is reached.
Greedy Shortest 1 To 7 Path

[Figure: the path from 1 to 7 chosen by this greedy rule]

Path length is 12.

Not the shortest path (a path of length 11 exists), so the
algorithm doesn't work!
Dijkstra’s Algorithm - Example

[Figure: Dijkstra's algorithm traced step by step on a
directed weighted graph. The source starts with label 0 and
every other vertex with label ∞; at each step the vertex with
the smallest label is finalized and its neighbors' labels are
tightened (here, for example, one label improves from 10 to
8, and another from 14 to 13 and then to 9)]
Single Source Single
Destination
Terminate the single-source all-destinations
greedy algorithm as soon as the shortest
path to the desired vertex has been
generated.
Data Structures For Dijkstra’s
Algorithm
 The greedy single source all destinations
algorithm is known as Dijkstra’s algorithm.
 Implement d() and p() as 1D arrays.
 Keep a linear list L of reachable vertices to
which shortest path is yet to be generated.
 Select and remove vertex v in L that has
smallest d() value.
 Update d() and p() values of vertices
adjacent to v.
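A minimal Python sketch of this procedure, using a min heap in place of the linear list L (the function name dijkstra and the dict-based d()/p() arrays are my own choices):

import heapq

# Dijkstra's single-source shortest paths; all edge weights must be nonnegative.
# graph maps each vertex to a list of (neighbor, weight) pairs.
def dijkstra(graph, source):
    d = {v: float('inf') for v in graph}   # d(v): length of best path found so far
    p = {v: None for v in graph}           # p(v): predecessor of v on that path
    d[source] = 0
    heap = [(0, source)]
    while heap:
        dist, v = heapq.heappop(heap)      # select vertex with smallest d() value
        if dist > d[v]:
            continue                       # stale heap entry: v already finalized
        for w, cost in graph[v]:
            if dist + cost < d[w]:         # shorter path to w found through v
                d[w] = dist + cost
                p[w] = v
                heapq.heappush(heap, (d[w], w))
    return d, p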
Complexity
 O(n) to select next destination vertex.
 updating d() and p() values:
 O(out-degree) when adjacency lists are
used.
 O(n) when adjacency matrix is used.
 Selection and update done once for
each vertex to which a shortest path is
found.
 Total time is O(n2).
Complexity
• When a min heap of d() values is used
in place of the linear list L of reachable
vertices, the total time is O((n+e) log n),
because O(n) remove-min operations
and O(e) change-key (d() value)
operations are done.
