
MODULE 3

GREEDY ALGORITHM
Introduction
• Methods of solving problems

1. Divide and conquer


2. Randomized algorithms
3. Greedy algorithms (not a single algorithm, but a design technique)
4. Dynamic programming
What is a 'Greedy algorithm'?
• A greedy algorithm, as the name suggests, always
makes the choice that seems to be the best at that
moment.
• This means that it makes a locally-optimal choice in
the hope that this choice will lead to a globally-
optimal solution.
• When a problem has many feasible solutions with
different costs or benefits, finding the best solution is
known as an optimization problem, and the best
solution is known as the optimal solution.
• The local decisions (or choices) must possess
three characteristics as mentioned below:
• Feasibility: The selected choice must fulfil
local constraints.
• Optimality: The selected choice must be the
best at that stage (locally optimal choice).
• Irrevocability: The selected choice cannot be
changed once it is made.
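These three characteristics can be seen in a minimal sketch of greedy change-making (a hypothetical example, not from the slides; the greedy choice is optimal only for canonical coin systems such as 1/5/10/25):

```python
def greedy_coin_change(amount, denominations):
    """Make change greedily: repeatedly take the largest coin that fits.

    Illustrates the three properties of a greedy choice:
    - feasibility: a coin is taken only if it does not exceed the remaining amount
    - optimality: among feasible coins, the largest is taken
    - irrevocability: a taken coin is never put back
    """
    coins = []
    for coin in sorted(denominations, reverse=True):
        while coin <= amount:      # feasibility: coin fits in what remains
            coins.append(coin)     # irrevocable, locally optimal choice
            amount -= coin
    return coins

print(greedy_coin_change(63, [1, 5, 10, 25]))  # [25, 25, 10, 1, 1, 1]
```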
Greedy Algorithm
Examples:
1. Machine scheduling
2. Fractional Knapsack Problem
3. Minimum Spanning Tree
4. Huffman Code
5. Job Sequencing
6. Activity Selection Problem
Knapsack Problem
Given a set of items, each with a weight (W) and a profit value (P), determine the
subset of items to be placed in a knapsack such that the total weight of the
chosen items does not exceed the capacity of the knapsack and the total
profit value is maximum.
Eg: Q1). Find an optimal solution for the knapsack problem, where n=7, M=15
(P1, P2, P3, P4, P5, P6, P7) = (10, 5, 15, 7, 6, 18, 3)
(W1, W2, W3, W4, W5, W6, W7) = (2, 3, 5, 7, 1, 4, 1)

Soln:

Objects        1     2     3     4     5     6     7
Profit Pi      10    5     15    7     6     18    3
Weights Wi     2     3     5     7     1     4     1
Ratio Pi/Wi    5     1.6   3     1     6     4.5   3

2). Select object with maximum profit (Pi)

Objects   Profit (Pi)   Weight (Wi)   Remaining capacity   Portion considered (x)
6         18            4             15 - 4 = 11          1
3         15            5             11 - 5 = 6           1
1         10            2             6 - 2 = 4            1
4         7             7             4 - 4 = 0            4/7

ΣxiPi = 18*1 + 15*1 + 10*1 + 7*(4/7) = 47


3). Select object with minimum weight (Wi)

Objects   Profit (Pi)   Weight (Wi)   Remaining capacity   Portion considered (x)
5         6             1             15 - 1 = 14          1
7         3             1             14 - 1 = 13          1
1         10            2             13 - 2 = 11          1
2         5             3             11 - 3 = 8           1
6         18            4             8 - 4 = 4            1
3         15            5             4 - 4 = 0            4/5

ΣxiPi = 6*1 + 3*1 + 10*1 + 5*1 + 18*1 + 15*(4/5) = 54


1). Select object with maximum profit/weight ratio (Pi/Wi)

Objects   Profit (Pi)   Weight (Wi)   Remaining capacity   Portion considered (x)
5         6             1             15 - 1 = 14          1
1         10            2             14 - 2 = 12          1
6         18            4             12 - 4 = 8           1
3         15            5             8 - 5 = 3            1
7         3             1             3 - 1 = 2            1
2         5             3             2 - 2 = 0            2/3

ΣxiPi = 6*1 + 10*1 + 18*1 + 15*1 + 3*1 + 5*(2/3) ≈ 55.33
Knapsack Algorithm
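The ratio-based greedy selection can be sketched as a fractional-knapsack routine (a minimal illustration; function and variable names are our own):

```python
def fractional_knapsack(profits, weights, capacity):
    """Greedy fractional knapsack: take items in decreasing profit/weight
    ratio, splitting the last item if it does not fit entirely."""
    items = sorted(zip(profits, weights),
                   key=lambda pw: pw[0] / pw[1], reverse=True)
    total = 0.0
    for p, w in items:
        if capacity >= w:                  # whole item fits
            total += p
            capacity -= w
        else:                              # take the fitting fraction, then stop
            total += p * capacity / w
            break
    return total

# Data from Q1: n = 7, M = 15
P = [10, 5, 15, 7, 6, 18, 3]
W = [2, 3, 5, 7, 1, 4, 1]
print(fractional_knapsack(P, W, 15))  # 55.33...
```

As the three strategies above show, only the ratio-based choice is guaranteed optimal for the fractional problem.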
Complexity Analysis of Knapsack Problem

For n items there are 2^n possible subsets, so the brute-force approach runs in
O(2^n) time.

Sorting the n items by ratio (e.g., with merge sort) takes O(n log n) time.

To select the items, we need one scan over the sorted list, which takes O(n)
time.

So the total time required is

T(n) = O(n log n) + O(n) = O(n log n)


Job Scheduling Algorithm
Job scheduling is the problem of scheduling jobs from a set of N jobs on a single
processor so as to maximize profit. Consider N jobs, each taking unit time to
execute. Each job has a profit and a deadline associated with it.
Profit is earned only if the job is completed on or before its deadline.

Schedule S is an array of slots S(t), S(t) ∈ {1, 2, 3, …, N} for each t ∈ {1, 2, 3, …, N}

Schedule S is feasible if,

● S(t) = i implies t ≤ di (a scheduled job must meet its deadline)

● Each job is scheduled at most once.

Our goal is to find a feasible schedule S that maximizes the total profit of the
scheduled jobs. The goal can be achieved as follows: sort all jobs in decreasing
order of profit, start with the empty schedule, select one job at a time, and
schedule it in the latest possible slot if that is feasible.

For N jobs, there exist 2^N possible schedules, so this brute-force approach runs in O(2^N) time.
Example

Select the jobs to be performed that give the maximum profit before their deadlines.

Timeline: 0 __ 1 __ 2 __ 3

Job considered   Slot assigned        Solution      Profit
J1               [1,2]                J1            20
J2               [0,1][1,2]           J1, J2        20+15
J3 ✗             [0,1][1,2]           J1, J2        20+15
J4               [0,1][1,2][2,3]      J1, J2, J4    20+15+5
J5 ✗             [0,1][1,2][2,3]      J1, J2, J4    20+15+5 = 40


Q2). Jobs        J1    J2    J3    J4
     Profit P    100   10    15    27
     Deadline D  2     1     2     1

Soln: 0 ___J4___ 1 ___J1___ 2

Profit Obtained = 127

Q3). Jobs        J1   J2   J3   J4   J5   J6   J7
     Profit P    3    5    20   18   1    6    30
     Deadline D  1    3    4    3    2    1    2

Soln: 0_________1__________2__________3___________4

Profit Obtained = 64
Job Sequencing Algorithm
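The strategy described above — sort jobs by decreasing profit and place each one in the latest free slot before its deadline — can be sketched as follows (job data from Q2; helper and variable names are our own):

```python
def job_sequencing(jobs):
    """Greedy job sequencing with deadlines.

    jobs: list of (name, profit, deadline) with unit-time jobs.
    Returns (schedule, total_profit), where schedule[t] is the job
    running in the time slot [t, t+1].
    """
    max_deadline = max(d for _, _, d in jobs)
    slots = [None] * max_deadline                # one unit-time slot per index
    total = 0
    for name, profit, deadline in sorted(jobs, key=lambda j: -j[1]):
        # try the latest slot before the deadline first, then earlier ones
        for t in range(min(deadline, max_deadline) - 1, -1, -1):
            if slots[t] is None:                 # latest feasible free slot
                slots[t] = name
                total += profit
                break
    return slots, total

# Data from Q2
jobs = [("J1", 100, 2), ("J2", 10, 1), ("J3", 15, 2), ("J4", 27, 1)]
print(job_sequencing(jobs))  # (['J4', 'J1'], 127)
```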
Dijkstra's Algorithm
• Dijkstra's algorithm is an algorithm for finding the
shortest paths between nodes in a weighted graph.
• It is a greedy algorithm that solves the single-
source shortest path problem for a directed
graph G = (V, E) with nonnegative edge
weights,
• i.e., w(u, v) ≥ 0 for each edge (u, v) ∈ E.
Dijkstra's Algorithm
Dijkstra's Algorithm maintains a set S of vertices whose final
shortest-path weights from the source s have already been
determined.

Relaxation: an edge (u, v) is relaxed by testing whether the path through u
improves the current estimate for v:

if d[u] + c(u, v) < d[v], then set d[v] = d[u] + c(u, v)

e.g., with d[u] = 2, c(u, v) = 3 and d[v] = ∞: since 2 + 3 < ∞, d[v] becomes 5.

For all vertices v ∈ S, we have d[v] = δ(s, v). The algorithm repeatedly selects
the vertex u ∈ V − S with the minimum shortest-path estimate, inserts u into S,
and relaxes all edges leaving u.

Because it always chooses the "lightest" or "closest" vertex in V − S to insert into set S, it
is said to use the greedy strategy.
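The select-and-relax loop can be sketched with a binary heap (a common O((V + E) log V) variant; the sample graph below is illustrative, not the one from the slide figures):

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra's single-source shortest paths with a binary heap.

    graph: {u: [(v, w), ...]} adjacency lists with nonnegative weights w.
    Returns the dict of shortest-path distances from source.
    """
    dist = {u: float('inf') for u in graph}
    dist[source] = 0
    pq = [(0, source)]                    # (shortest-path estimate, vertex)
    while pq:
        d, u = heapq.heappop(pq)          # closest vertex in V - S
        if d > dist[u]:                   # stale entry: u was already finalized
            continue
        for v, w in graph[u]:             # relax all edges leaving u
            if d + w < dist[v]:
                dist[v] = d + w
                heapq.heappush(pq, (dist[v], v))
    return dist

g = {1: [(2, 2), (4, 1)], 2: [(4, 3), (5, 10)],
     3: [(1, 4), (6, 5)], 4: [(3, 2), (5, 2), (6, 8), (7, 4)],
     5: [(7, 6)], 6: [], 7: [(6, 1)]}
print(dijkstra(g, 1))
```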
Dijkstra's Algorithm

[Figure: step-by-step trace of Dijkstra's algorithm on a small 7-vertex weighted graph]
Complexity Analysis –

The first for loop (initialization) takes O(|V|) time to execute.

As there are |V| nodes in the graph, the size of queue Q would be |V|, and hence
the while loop iterates |V| times in the worst case.

Each node has at most |V| − 1 outgoing edges.

The worst-case upper bound on the running time of this algorithm is therefore

O(|V|²)
[Figure: a six-vertex weighted digraph with source vertex 1, traced in the table below]
Selected          Distance to vertex              Parent
vertex      2     3     4     5     6             node
(1)         50    45    10    ∞     ∞             —
4           50    45    10    25    ∞             1
5           45    45    10    25    ∞             4
2           45    45    10    25    ∞             5
3           45    45    10    25    ∞             2
Minimum Spanning Tree
A graph can have many spanning trees
with different costs. A minimum spanning
tree is a spanning tree with the minimum total cost.
Kruskal’s Algorithm
PRIM'S ALGORITHM vs KRUSKAL'S ALGORITHM

• Prim's algorithm starts to build the minimum spanning tree from any vertex in
the graph. Kruskal's algorithm starts from the minimum-weight edge in the
graph.

• Prim's algorithm may visit a node more than once while finding the minimum
distance. Kruskal's algorithm considers each edge only once.

• Prim's algorithm has a time complexity of O(V²) (with an adjacency matrix),
V being the number of vertices. Kruskal's algorithm has a time complexity of
O(E log E), E being the number of edges, dominated by sorting the edges.

• Prim's algorithm always maintains a single connected component and works
only on connected graphs. Kruskal's algorithm can hold a forest (disconnected
components) at any instant and can also work on disconnected graphs.

• Prim's algorithm runs faster on dense graphs; Kruskal's algorithm runs faster
on sparse graphs.
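Kruskal's edge-by-edge construction can be sketched with a union–find (disjoint-set) structure for the cycle test (a minimal illustration; the edge list below is our own example):

```python
def kruskal(n, edges):
    """Kruskal's MST: sort edges by weight and add each edge that joins
    two different components (cycle check via union-find).

    n: number of vertices labeled 0..n-1; edges: list of (w, u, v).
    Returns (mst_edges, total_cost).
    """
    parent = list(range(n))

    def find(x):                          # root of x's component, path-halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, cost = [], 0
    for w, u, v in sorted(edges):         # edges in increasing weight order
        ru, rv = find(u), find(v)
        if ru != rv:                      # joins two components: no cycle, keep it
            parent[ru] = rv
            mst.append((u, v, w))
            cost += w
    return mst, cost

edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
print(kruskal(4, edges))  # MST with total cost 6
```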
