
DAA Unit – 1

Q1] Write a control abstraction for the greedy method. Comment on the time complexity of this abstraction.

Ans: The control abstraction for the greedy method consists of a structured approach to solving optimization
problems through a series of locally optimal choices. Here are the key points regarding this abstraction and its
time complexity:

Control Abstraction for Greedy Method

1. Selection:

• At each step, select the best available option from the remaining choices based on a specific
criterion (e.g., minimum cost, maximum value).

2. Feasibility Check:

• Verify if the selected option can be included in the current solution without violating any
constraints. This is done using a feasibility function.

3. Solution Update:

• If the selected option is feasible, add it to the current solution set; otherwise, discard it. Repeat
until all options are considered or a complete solution is achieved.

4. Pseudocode Representation:

Algorithm GreedyMethod(a, n)
    Solution := Ø                          // initialize an empty solution set
    for i := 1 to n do                     // consider each of the n inputs
        s := select(a)                     // select the best remaining option from a
        if feasible(Solution, s) then
            Solution := union(Solution, s) // add to the solution if feasible
        else
            reject(s)                      // discard if not feasible
    return Solution                        // return the final solution set
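
The same skeleton can be written as runnable Python. This is a minimal sketch, not part of the standard abstraction: select and feasible are hypothetical problem-specific callables supplied by the caller.

def greedy_method(options, select, feasible):
    # Generic greedy skeleton: select, check feasibility, update.
    solution = []
    remaining = list(options)
    while remaining:
        s = select(remaining)          # pick the locally best remaining option
        remaining.remove(s)
        if feasible(solution, s):      # keep s only if constraints still hold
            solution.append(s)
        # infeasible options are simply rejected
    return solution

# Example: pick numbers, largest first, while the running total stays <= 10.
picked = greedy_method([7, 5, 4, 2], select=max,
                       feasible=lambda sol, s: sum(sol) + s <= 10)
print(picked)  # [7, 2]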

Time Complexity:

• Overall Complexity: The time complexity of greedy algorithms typically ranges from O(n log n) to O(n²), depending on:

• The efficiency of the selection function.

• The complexity of the feasibility check.

• The time required for updating the solution set (union operation).

• Factors Influencing Complexity:

• If sorting is involved (common in many greedy algorithms), it contributes O(n log n).

• The feasibility check can vary in complexity based on problem constraints, potentially leading to
higher overall complexity.
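
For instance, in the fractional knapsack problem the selection step is realized by a single sort on the value-to-weight ratio, so the O(n log n) sort dominates the O(n) scan that follows. A minimal Python sketch (the (value, weight) item format is an assumption for illustration):

def fractional_knapsack(items, capacity):
    # items: list of (value, weight) pairs; returns the maximum total value.
    # Sorting by value/weight ratio is the select step, done once up front.
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if capacity == 0:
            break
        take = min(weight, capacity)   # take as much of the best item as fits
        total += value * (take / weight)
        capacity -= take
    return total

print(fractional_knapsack([(60, 10), (100, 20), (120, 30)], 50))  # 240.0
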
Q2] Comment on the statement “A problem which does not satisfy the principle of optimality cannot be solved by dynamic programming.”

Ans: Here’s a concise commentary on the statement "Problems that do not satisfy the principle of optimality
cannot be solved by dynamic programming":

1. Principle of Optimality: States that optimal solutions can be constructed from optimal subproblem
solutions.

2. Optimal Substructure: DP requires problems to have this property; otherwise, it cannot be applied.

3. Applicability: If a problem lacks optimal substructure, DP techniques are ineffective.

4. Examples:

• Applicable: Shortest path problems (e.g., the Bellman-Ford and Floyd-Warshall algorithms), since every subpath of a shortest path is itself a shortest path.

• Not Applicable: The longest simple path problem, since subpaths of a longest simple path need not themselves be longest.

5. Efficiency: DP is efficient for problems with overlapping subproblems and optimal substructure; other problems may require exhaustive methods.

6. Conclusion: The statement underscores that DP relies on the principle of optimality; without it, DP is
unsuitable.
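
To make the principle concrete: shortest paths satisfy it because any subpath of a shortest path is itself shortest, and Floyd-Warshall exploits exactly this. A minimal sketch, assuming the graph is given as an adjacency matrix with INF where no edge exists:

INF = float('inf')

def floyd_warshall(dist):
    # dist: n x n matrix of edge weights (0 on the diagonal, INF for no edge).
    # Principle of optimality: the best i->j path through vertices {0..k}
    # is built from optimal i->k and k->j subpaths.
    n = len(dist)
    d = [row[:] for row in dist]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d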

Q3] Write a control abstraction for the dynamic programming strategy. Comment on the time complexity of this abstraction.

Ans: Control Abstraction for Dynamic Programming

1. Problem Decomposition:

• Break the problem into smaller overlapping subproblems.

2. Optimal Substructure:

• Ensure that the optimal solution to the overall problem can be constructed from optimal
solutions of its subproblems.

3. Memoization or Tabulation:

• Memoization (Top-Down): Store results of subproblems in a cache to avoid redundant


calculations during recursive calls.

• Tabulation (Bottom-Up): Build a table iteratively to store solutions of subproblems, starting from
the simplest cases and working up.

4. Final Solution Construction:

• Combine the stored results of subproblems to form the final solution to the original problem.

Pseudocode Representation:

Algorithm DynamicProgramming(P)
    if solved(P) then
        return lookup(P)                   // reuse the stored result
    else
        Ans := SOLVE(P)
        store(P, Ans)                      // memoize the result
        return Ans

Function SOLVE(P)
    if sufficiently small(P) then
        return solution(P)                 // base case: solve directly
    else
        divide P into smaller subproblems P1, P2, ..., Pn
        Ans1 := DynamicProgramming(P1)
        Ans2 := DynamicProgramming(P2)
        ...
        Ansn := DynamicProgramming(Pn)
        return combine(Ans1, Ans2, ..., Ansn)
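
As a concrete top-down instance, here is a minimal memoized Fibonacci in Python; the cache dictionary plays the role of store/lookup in the abstraction above:

cache = {}

def fib(n):
    if n in cache:                    # solved(P)? then lookup(P)
        return cache[n]
    if n < 2:                         # sufficiently small(P): base case
        ans = n
    else:                             # divide into P1, P2 and combine
        ans = fib(n - 1) + fib(n - 2)
    cache[n] = ans                    # store(P, Ans)
    return ans

print(fib(40))  # 102334155, in O(n) calls instead of O(2^n)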

Time Complexity:

• General Complexity: The time complexity of dynamic programming typically reduces from exponential (in naive recursive solutions) to polynomial, often O(n²) or O(n·m), depending on the number of subproblems and their interdependencies.

• Space Complexity: Space complexity may also increase due to the storage required for memoization or tabulation, often O(n) or O(n·m) based on the problem size.
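
For example, the longest common subsequence problem has one subproblem per pair of prefixes, n·m in total, so a bottom-up table realizes the O(n·m) bound directly. A brief sketch:

def lcs_length(a, b):
    # Bottom-up table: L[i][j] = LCS length of the prefixes a[:i] and b[:j].
    n, m = len(a), len(b)
    L = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if a[i - 1] == b[j - 1]:
                L[i][j] = L[i - 1][j - 1] + 1
            else:
                L[i][j] = max(L[i - 1][j], L[i][j - 1])
    return L[n][m]                    # n*m subproblems -> O(n*m) time and space

print(lcs_length("ABCBDAB", "BDCABA"))  # 4 (e.g., "BCBA")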

Q4] Write the steps of the greedy approach for job sequencing.

Ans: Steps for Greedy Approach in Job Sequencing

1. Sort Jobs by Profit:

• Arrange all jobs in decreasing order of profit. This ensures that the most profitable jobs are
considered first.

2. Determine Maximum Deadline:

• Identify the maximum deadline among all jobs to establish the time slots available for
scheduling.

3. Initialize Time Slots:

• Create an array (or list) to keep track of free time slots. Each slot corresponds to a unit of time
up to the maximum deadline.

4. Iterate Through Jobs:

• For each job in the sorted list:

• Check for an available time slot, starting from the job's deadline and moving down to the earliest slot.

• If a free slot is found, assign the job to that slot and mark it as occupied.

5. Skip Unassignable Jobs:


• If no available slot exists for a job, simply ignore that job and move on to the next one.

6. Output the Scheduled Jobs:

• After processing all jobs, output the sequence of jobs that have been successfully scheduled
within their deadlines.
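
A minimal Python sketch of these steps, assuming jobs are given as (job_id, deadline, profit) tuples:

def job_sequencing(jobs):
    # Step 1: sort by profit, highest first.
    jobs = sorted(jobs, key=lambda j: j[2], reverse=True)
    # Steps 2-3: one slot per time unit up to the maximum deadline.
    max_deadline = max(d for _, d, _ in jobs)
    slots = [None] * (max_deadline + 1)        # slots[1..max_deadline]
    # Steps 4-5: place each job in the latest free slot before its deadline.
    for job_id, deadline, profit in jobs:
        for t in range(min(deadline, max_deadline), 0, -1):
            if slots[t] is None:
                slots[t] = job_id
                break                          # scheduled; otherwise skipped
    # Step 6: output the scheduled sequence.
    return [j for j in slots[1:] if j is not None]

jobs = [('a', 2, 100), ('b', 1, 19), ('c', 2, 27), ('d', 1, 25), ('e', 3, 15)]
print(job_sequencing(jobs))  # ['c', 'a', 'e'], total profit 142
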

Q5] What is the greedy approach?

Ans: A greedy algorithm is a problem-solving approach that makes a series of locally optimal choices at each
step with the hope of finding a global optimum.

Q6] What is a job scheduling algorithm?

Ans: A job scheduling algorithm is a method used in computing to determine the order and timing of job
execution on a CPU or in a processing system. These algorithms aim to optimize various performance metrics
such as efficiency, resource utilization, and response time.

Q7] What is an optimal binary search tree? How is the dynamic programming approach used to build an OBST for the following table?

i            1    2    3    4
Keys         10   20   30   40
Frequency    4    2    6    3

Ans: Optimal Binary Search Tree (OBST)

An Optimal Binary Search Tree (OBST) is a binary search tree that minimizes the expected search cost based on
the frequency of access for each key. The goal is to arrange the keys in such a way that the total cost of
searching for all keys is minimized, taking into account their respective frequencies.

Dynamic Programming Approach to Build OBST

To construct an OBST using dynamic programming, follow these steps:

1. Define the Problem:

• Let n be the number of keys.

• Let keys = [10, 20, 30, 40] and frequencies = [4, 2, 6, 3].

2. Create Tables:

• Create a table cost[i][j] to store the minimum cost of the subtree containing keys from index i
to j.

• Create a table root[i][j] to store the root index of the optimal subtree for keys from index i to j.

3. Calculate Cumulative Frequencies:

• Compute cumulative frequencies to facilitate quick calculations of costs:

sum[i][j] = freq[i] + freq[i+1] + ... + freq[j]

4. Dynamic Programming Calculation:

• For each length l from 1 to n:


• For each starting index i from 0 to n−l:

• Set j=i+l−1.

• Initialize cost[i][j] to infinity.

• For each possible root r from i to j:

• Calculate the cost if key r is chosen as the root:

cost[i][j] = min(cost[i][j], cost[i][r−1] + cost[r+1][j] + sum[i][j])

• Update root[i][j] with the index of the chosen root.

5. Constructing the OBST:

• The optimal binary search tree can be constructed using the root table by recursively building
subtrees based on the roots stored in root[i][j].
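
A minimal Python sketch of this computation for the table above (prefix sums supply sum[i][j] in O(1); indices are 0-based here):

def optimal_bst(freq):
    # cost[i][j]: minimum weighted search cost for keys i..j;
    # root[i][j]: index of the root of that optimal subtree.
    n = len(freq)
    INF = float('inf')
    cost = [[0] * n for _ in range(n)]
    root = [[0] * n for _ in range(n)]
    pre = [0] * (n + 1)                     # prefix sums of the frequencies
    for i, f in enumerate(freq):
        pre[i + 1] = pre[i] + f
    for i in range(n):                      # single-key subtrees
        cost[i][i] = freq[i]
        root[i][i] = i
    for length in range(2, n + 1):          # longer key ranges
        for i in range(n - length + 1):
            j = i + length - 1
            cost[i][j] = INF
            fsum = pre[j + 1] - pre[i]      # sum[i][j]
            for r in range(i, j + 1):       # try each key as the root
                left = cost[i][r - 1] if r > i else 0
                right = cost[r + 1][j] if r < j else 0
                c = left + right + fsum
                if c < cost[i][j]:
                    cost[i][j] = c
                    root[i][j] = r
    return cost, root

keys = [10, 20, 30, 40]
freq = [4, 2, 6, 3]
cost, root = optimal_bst(freq)
print(cost[0][3], keys[root[0][3]])  # 26 30: minimum cost 26, key 30 at the root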
