DAA Unit 1
Q1] Write a control abstraction for greedy method. Comment on the time complexity of this abstraction?
Ans: The control abstraction for the greedy method consists of a structured approach to solving optimization
problems through a series of locally optimal choices. Here are the key points regarding this abstraction and its
time complexity:
1. Selection:
• At each step, select the best available option from the remaining choices based on a specific
criterion (e.g., minimum cost, maximum value).
2. Feasibility Check:
• Verify if the selected option can be included in the current solution without violating any
constraints. This is done using a feasibility function.
3. Solution Update:
• If the selected option is feasible, add it to the current solution set; otherwise, discard it. Repeat
until all options are considered or a complete solution is achieved.
4. Pseudocode Representation:
Algorithm Greedy(a, n)
// a[1..n] contains the n inputs
Solution := ∅
for i := 1 to n do
    s := Select(a)
    if Feasible(Solution, s) then
        Solution := Union(Solution, s)
    // otherwise discard s
end for
return Solution
Time Complexity:
• The loop executes n times, so the overall time is O(n · (t_select + t_feasible + t_union)), where the three terms are the costs of selection, the feasibility check, and updating the solution set (the union operation).
• If Select requires the inputs to be sorted first, the abstraction takes O(n log n) overall.
• The feasibility check can vary in complexity based on problem constraints, potentially leading to higher overall complexity.
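The abstraction above can be sketched in Python. The generic greedy driver mirrors the Select/Feasible/Union steps; the coin-change instance at the bottom is purely illustrative (greedy happens to be optimal for this canonical coin system):

```python
# A minimal sketch of the greedy control abstraction.
# select, feasible, and union are supplied per problem; the names are illustrative.

def greedy(candidates, select, feasible, union):
    """Generic greedy method: repeatedly pick the locally best candidate,
    keep it only if the partial solution stays feasible."""
    solution = []
    remaining = list(candidates)
    while remaining:
        s = select(remaining)          # locally optimal choice
        remaining.remove(s)
        if feasible(solution, s):      # constraint check
            solution = union(solution, s)
        # infeasible candidates are simply discarded
    return solution

# Example instance: make change for 63 with coins {25, 10, 5, 1}.
target = 63
coins = [c for c in (25, 10, 5, 1) for _ in range(10)]  # ample supply of each coin
result = greedy(
    coins,
    select=max,                                       # largest coin first
    feasible=lambda sol, c: sum(sol) + c <= target,   # never overshoot the target
    union=lambda sol, c: sol + [c],
)
print(sum(result), result)  # 63 [25, 25, 10, 1, 1, 1]
```

Note that the driver itself never looks ahead: each choice is committed (or discarded) immediately, which is exactly why the method only works when locally optimal choices lead to a global optimum.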
Q2] Comment on the statement “Problem which does not satisfy the principle of optimality cannot be
solved by dynamic programming”
Ans: Here’s a concise commentary on the statement "Problems that do not satisfy the principle of optimality
cannot be solved by dynamic programming":
1. Principle of Optimality: States that optimal solutions can be constructed from optimal subproblem
solutions.
2. Optimal Substructure: DP requires problems to have this property; otherwise, it cannot be applied.
3. Examples: The longest simple path problem does not satisfy the principle of optimality — subpaths of a longest simple path need not themselves be longest — so it cannot be solved by dynamic programming, whereas the shortest path problem does satisfy it.
4. Efficiency: DP is efficient for problems with overlapping subproblems and optimal substructure; other problems may need exhaustive methods.
5. Conclusion: The statement underscores that DP relies on the principle of optimality; without it, DP is unsuitable.
Q3] Write a control abstraction for dynamic programming strategy. Comment on the time complexity of
this abstraction?
Ans: The control abstraction for the dynamic programming strategy consists of the following steps:
1. Problem Decomposition:
• Break the problem into smaller overlapping subproblems, each of which is solved only once.
2. Optimal Substructure:
• Ensure that the optimal solution to the overall problem can be constructed from optimal
solutions of its subproblems.
3. Memoization or Tabulation:
• Memoization (Top-Down): Solve recursively and cache each subproblem's result so it is never recomputed.
• Tabulation (Bottom-Up): Build a table iteratively to store solutions of subproblems, starting from
the simplest cases and working up.
4. Solution Construction:
• Combine the stored results of subproblems to form the final solution to the original problem.
Pseudocode Representation:
Algorithm DynamicProgramming(P)
    if solved(P) then
        return lookup(P)          // reuse the stored result
    else
        Ans := Solve(P)
        store(P, Ans)             // memoize for later reuse
        return Ans

Function Solve(P)
    if P is a base case then
        return solution(P)
    else
        // decompose P into subproblems P1, P2, ..., Pn
        Ans1 := DynamicProgramming(P1)
        Ans2 := DynamicProgramming(P2)
        ...
        Ansn := DynamicProgramming(Pn)
        return combine(Ans1, Ans2, ..., Ansn)
Time Complexity:
• General Complexity: The time complexity of dynamic programming typically reduces from exponential
(in naive recursive solutions) to polynomial, often O(n²) or O(n·m), depending on the
number of subproblems and their interdependencies.
• Space Complexity: Space complexity may also increase due to storage requirements for memoization
or tabulation, often leading to O(n) or O(n·m) based on the problem size.
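The top-down (memoized) form of this abstraction can be sketched in Python; Fibonacci is used purely as an illustrative problem with overlapping subproblems:

```python
from functools import lru_cache

# Naive recursion recomputes the same subproblems exponentially many times.
def fib_naive(n):
    return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

# Memoization plays the role of solved()/lookup()/store() in the abstraction:
# each subproblem is solved once and its result is cached, giving O(n) time.
@lru_cache(maxsize=None)
def fib_memo(n):
    return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(40))  # 102334155
```

Here `fib_naive(40)` makes over a hundred million calls, while `fib_memo(40)` solves each of the 41 distinct subproblems exactly once — the exponential-to-polynomial reduction described above.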
Q4] Explain the job sequencing with deadlines problem and the greedy method used to solve it.
Ans: The greedy algorithm for job sequencing with deadlines proceeds as follows:
• Arrange all jobs in decreasing order of profit. This ensures that the most profitable jobs are
considered first.
• Identify the maximum deadline among all jobs to establish the time slots available for
scheduling.
• Create an array (or list) to keep track of free time slots. Each slot corresponds to a unit of time
up to the maximum deadline.
• Check for an available time slot starting from the job's deadline down to 0.
• If a free slot is found, assign the job to that slot and mark it as occupied.
• After processing all jobs, output the sequence of jobs that have been successfully scheduled
within their deadlines.
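The steps above can be sketched in Python (the job instance at the bottom is illustrative, not taken from the question):

```python
def job_sequencing(jobs):
    """Greedy job sequencing with deadlines.
    jobs: list of (name, deadline, profit) tuples.
    Returns the scheduled job names in slot order and the total profit."""
    # Step 1: sort jobs in decreasing order of profit.
    jobs = sorted(jobs, key=lambda j: j[2], reverse=True)
    # Step 2: the maximum deadline fixes the number of time slots.
    max_deadline = max(j[1] for j in jobs)
    # Step 3: one slot per unit of time, all initially free.
    slots = [None] * max_deadline
    for name, deadline, profit in jobs:
        # Step 4: look for a free slot from the job's deadline downwards.
        for t in range(deadline - 1, -1, -1):
            if slots[t] is None:
                slots[t] = (name, profit)  # Step 5: assign and mark occupied
                break
    # Step 6: output the jobs that were successfully scheduled.
    scheduled = [s for s in slots if s is not None]
    return [name for name, _ in scheduled], sum(p for _, p in scheduled)

jobs = [("J1", 2, 100), ("J2", 1, 19), ("J3", 2, 27), ("J4", 1, 25), ("J5", 3, 15)]
print(job_sequencing(jobs))  # (['J3', 'J1', 'J5'], 142)
```

Searching downward from the deadline (rather than upward from slot 0) leaves earlier slots free for jobs with tighter deadlines, which is what makes the greedy choice safe.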
Q5] What is a greedy algorithm?
Ans: A greedy algorithm is a problem-solving approach that makes a series of locally optimal choices at each
step with the hope of finding a global optimum.
Q6] What is a job scheduling algorithm?
Ans: A job scheduling algorithm is a method used in computing to determine the order and timing of job
execution on a CPU or in a processing system. These algorithms aim to optimize various performance metrics
such as efficiency, resource utilization, and response time.
Q7] What is an optimal binary search tree? How is the dynamic programming approach used to construct one for the following data?

Index:      1   2   3   4
Keys:       10  20  30  40
Frequency:  4   2   6   3
Ans: An Optimal Binary Search Tree (OBST) is a binary search tree that minimizes the expected search cost based on
the frequency of access for each key. The goal is to arrange the keys in such a way that the total cost of
searching for all keys is minimized, taking into account their respective frequencies.
1. Define Subproblems:
• Every subtree of a BST covers a contiguous range of keys i..j (in sorted order), so an optimal
tree for the whole key set can be built from optimal trees for these ranges.
2. Create Tables:
• Create a table cost[i][j] to store the minimum cost of the subtree containing keys from index i
to j.
• Create a table root[i][j] to store the root index of the optimal subtree for keys from index i to j.
• Precompute the frequency sums: sum[i][j] = freq[i] + freq[i+1] + ... + freq[j].
3. Fill Tables by Chain Length:
• For each chain length l from 1 to n and each starting index i, set j = i + l − 1.
• Try every key r in [i, j] as the root and keep the cheapest choice:
cost[i][j] = min over r in [i, j] of ( cost[i][r−1] + cost[r+1][j] + sum[i][j] ),
taking cost[i][r−1] = 0 when r = i and cost[r+1][j] = 0 when r = j, and recording the best r in root[i][j].
4. Construct the Tree:
• The optimal binary search tree can be constructed using the root table by recursively building
subtrees based on the roots stored in root[i][j].
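The tabulation above can be sketched in Python and run on the keys and frequencies from the question:

```python
def optimal_bst_cost(freq):
    """Minimum expected search cost of an optimal BST,
    where freq[i] is the access frequency of the i-th key (keys in sorted order)."""
    n = len(freq)
    # cost[i][j]: minimum cost of a BST over keys i..j; empty ranges cost 0.
    cost = [[0] * (n + 1) for _ in range(n + 1)]
    # Prefix sums so that sum[i][j] = freq[i] + ... + freq[j] is O(1).
    prefix = [0]
    for f in freq:
        prefix.append(prefix[-1] + f)
    for length in range(1, n + 1):          # chain length l
        for i in range(n - length + 1):
            j = i + length - 1              # j = i + l - 1
            freq_sum = prefix[j + 1] - prefix[i]
            best = float("inf")
            for r in range(i, j + 1):       # try each key as the root
                left = cost[i][r - 1] if r > i else 0
                right = cost[r + 1][j] if r < j else 0
                best = min(best, left + right + freq_sum)
            cost[i][j] = best
    return cost[0][n - 1]

# Keys 10, 20, 30, 40 with frequencies 4, 2, 6, 3 (from the question):
print(optimal_bst_cost([4, 2, 6, 3]))  # 26
```

For this instance the minimum expected cost is 26, obtained with key 30 (the highest-frequency key) at the root. The three nested loops make the running time O(n³) with O(n²) table space.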