Time Complexity
Time complexity measures how the computation time of an algorithm increases with the size of the
input. It helps predict the behavior of the algorithm for very large inputs.
Big-O Notation:
- O(1): Constant time (e.g., array access)
- O(log n): Logarithmic time (e.g., binary search)
- O(n): Linear time (e.g., linear search)
- O(n log n): Log-linear time (e.g., merge sort)
- O(n^2): Quadratic time (e.g., bubble sort)
- O(2^n): Exponential time (e.g., subset problems)
Best, Average, and Worst Cases:
- Best Case: Minimum steps.
- Average Case: Expected steps over all inputs.
- Worst Case: Maximum steps.
Comparative Growth:
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(2^n) < O(n!).
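The growth classes above can be made concrete by counting comparisons. A minimal Python sketch (function names are illustrative) contrasting O(n) linear search with O(log n) binary search on the same sorted input:

```python
def linear_search_steps(arr, target):
    """Return (index, comparisons). Worst case O(n): scans every element."""
    steps = 0
    for i, v in enumerate(arr):
        steps += 1
        if v == target:
            return i, steps
    return -1, steps

def binary_search_steps(arr, target):
    """Return (index, comparisons). Worst case O(log n): halves the range each step."""
    lo, hi, steps = 0, len(arr) - 1, 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

data = list(range(1024))                      # n = 1024
_, lin_steps = linear_search_steps(data, 1023)  # ~n comparisons
_, bin_steps = binary_search_steps(data, 1023)  # ~log2(n) comparisons
```

Searching for the last element of 1024 items takes 1024 comparisons linearly but only about 11 with binary search, matching the O(n) vs O(log n) ordering.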
Divide and Conquer
Divide and Conquer solves a problem by:
1. Dividing into smaller subproblems.
2. Solving each recursively.
3. Combining the solutions.
Examples:
- Merge Sort
- Quick Sort
- Binary Search
Advantages:
- Often reduces asymptotic complexity (e.g., sorting from O(n^2) to O(n log n)).
- Suitable for parallelism.
Recurrence:
T(n) = 2T(n/2) + O(n) => O(n log n) (by the Master Theorem).
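Merge sort is the canonical instance of this recurrence: two recursive calls on halves (2T(n/2)) plus a linear merge (O(n)). A minimal sketch:

```python
def merge_sort(arr):
    """O(n log n): divide into halves, sort each recursively, merge in O(n)."""
    if len(arr) <= 1:          # base case: already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # solve subproblem of size n/2
    right = merge_sort(arr[mid:])   # solve subproblem of size n/2
    # Combine step: O(n) merge of two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]
```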
Dynamic Programming
Dynamic Programming solves problems by:
- Breaking into subproblems.
- Storing results to avoid recomputation.
Key Concepts:
- Overlapping subproblems.
- Optimal substructure.
Methods:
- Top-Down (Memoization)
- Bottom-Up (Tabulation)
Examples:
- Fibonacci Sequence
- Knapsack Problem
- Matrix Chain Multiplication
Time complexity is reduced significantly compared to naive recursion (e.g., Fibonacci drops from exponential O(2^n) to O(n)).
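Both methods can be sketched on Fibonacci, where naive recursion is exponential but either DP approach is O(n):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    """Top-Down (Memoization): cache results of overlapping subproblems."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

def fib_tab(n):
    """Bottom-Up (Tabulation): fill a table from the base cases upward."""
    if n < 2:
        return n
    table = [0, 1]
    for i in range(2, n + 1):
        table.append(table[i - 1] + table[i - 2])
    return table[n]
```

Memoization keeps the natural recursive structure; tabulation avoids recursion depth limits and often has smaller constant factors.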
Greedy Strategy
Greedy Strategy builds a solution by choosing the best local option at each step.
Characteristics:
- Local optimization at each step.
- No backtracking.
Examples:
- Huffman Encoding
- Prim's and Kruskal's MST
- Fractional Knapsack
When it Works:
- Greedy-choice property.
- Optimal substructure present.
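Fractional Knapsack is the clearest case where the greedy choice (best value-to-weight ratio first) is provably optimal. A minimal sketch, with items given as (value, weight) pairs:

```python
def fractional_knapsack(items, capacity):
    """Greedy: take items in decreasing value/weight ratio; fractions allowed.
    items: list of (value, weight) pairs. Returns the maximum total value."""
    total = 0.0
    # Greedy choice: best local ratio first, no backtracking
    for value, weight in sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True):
        if capacity <= 0:
            break
        take = min(weight, capacity)          # whole item, or a fraction of it
        total += value * (take / weight)
        capacity -= take
    return total
```

Note the contrast with 0/1 Knapsack, where items cannot be split: there the greedy-choice property fails and DP or Branch and Bound is needed.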
Revision of Data Structures and Matrix Chain Multiplication
Common Data Structures:
- Array: Contiguous linear storage; O(1) access by index.
- Linked List: Node-based sequential storage; O(1) insertion/deletion at a known node.
- Stack: LIFO behavior.
- Queue: FIFO behavior.
- Trees: Hierarchical structure.
Traversals:
- Arrays, Lists: Linear.
- Trees: Inorder, Preorder, Postorder, Level-order.
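The tree traversals above can be sketched in Python (the Node class is illustrative; postorder follows the same pattern as preorder with the root visited last):

```python
from collections import deque

class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def inorder(node):
    """Left subtree, root, right subtree (sorted order for a BST)."""
    if node is None:
        return []
    return inorder(node.left) + [node.val] + inorder(node.right)

def preorder(node):
    """Root, left subtree, right subtree."""
    if node is None:
        return []
    return [node.val] + preorder(node.left) + preorder(node.right)

def level_order(node):
    """Visit nodes level by level using a FIFO queue."""
    order, queue = [], deque([node] if node else [])
    while queue:
        cur = queue.popleft()
        order.append(cur.val)
        if cur.left:
            queue.append(cur.left)
        if cur.right:
            queue.append(cur.right)
    return order
```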
Matrix Chain Multiplication:
- Minimize number of multiplications.
- DP Approach with O(n^3) time.
- Useful in database query optimization.
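The O(n^3) DP can be sketched as follows, where dims encodes the chain (matrix i has shape dims[i] x dims[i+1]):

```python
def matrix_chain_order(dims):
    """Minimum scalar multiplications to evaluate the chain.
    O(n^3) time, O(n^2) space for n = len(dims) - 1 matrices."""
    n = len(dims) - 1
    # cost[i][j] = min multiplications for the product of matrices i..j
    cost = [[0] * n for _ in range(n)]
    for length in range(2, n + 1):           # chain length
        for i in range(n - length + 1):
            j = i + length - 1
            # Try every split point k between i and j
            cost[i][j] = min(
                cost[i][k] + cost[k + 1][j] + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j)
            )
    return cost[0][n - 1]
```

For dims = [10, 30, 5, 60], computing (AB)C costs 10*30*5 + 10*5*60 = 4500 multiplications versus 27000 for A(BC), and the DP finds the cheaper parenthesization.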
Backtracking
Backtracking builds a solution incrementally and abandons a partial solution as soon as it cannot be extended to a valid one.
Steps:
1. Choose
2. Explore
3. Unchoose (Backtrack)
Examples:
- N-Queens
- Sudoku Solver
- Subset Sum Problem
Backtracking can still take exponential time in the worst case, since pruning may eliminate few branches.
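The choose / explore / unchoose pattern can be sketched on N-Queens, counting the number of valid placements:

```python
def solve_n_queens(n):
    """Count N-Queens solutions via choose / explore / unchoose."""
    solutions = 0
    cols, diag1, diag2 = set(), set(), set()  # occupied columns and diagonals

    def place(row):
        nonlocal solutions
        if row == n:                  # all rows filled: a complete solution
            solutions += 1
            return
        for col in range(n):
            if col in cols or (row - col) in diag1 or (row + col) in diag2:
                continue              # conflict: abandon this partial solution
            # Choose
            cols.add(col); diag1.add(row - col); diag2.add(row + col)
            place(row + 1)            # Explore
            # Unchoose (Backtrack)
            cols.discard(col); diag1.discard(row - col); diag2.discard(row + col)

    place(0)
    return solutions
```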
Branch and Bound
Branch and Bound improves backtracking by calculating bounds to avoid exploring paths that
cannot yield better solutions.
Steps:
- Branch into subproblems.
- Calculate bounds.
- Prune paths with bad bounds.
Examples:
- Traveling Salesman Problem (TSP)
- 0/1 Knapsack Problem
Compared to backtracking, pruning is more aggressive: a bound can rule out an entire subtree before it is explored.
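A minimal sketch for 0/1 Knapsack, using the fractional (greedy) relaxation as the optimistic bound; items are (value, weight) pairs:

```python
def knapsack_bb(items, capacity):
    """0/1 Knapsack via Branch and Bound.
    Bound = value of the greedy fractional relaxation from this node on."""
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    best = 0

    def bound(i, value, room):
        # Optimistic estimate: fill the remaining room fractionally
        for v, w in items[i:]:
            if w <= room:
                value += v
                room -= w
            else:
                return value + v * room / w
        return value

    def branch(i, value, room):
        nonlocal best
        if i == len(items):
            best = max(best, value)
            return
        if bound(i, value, room) <= best:
            return                                # prune: cannot beat best
        v, w = items[i]
        if w <= room:
            branch(i + 1, value + v, room - w)    # branch: include item i
        branch(i + 1, value, room)                # branch: exclude item i

    branch(0, 0, capacity)
    return best
```

With the same items used for Fractional Knapsack, the 0/1 optimum is 220 (items of value 100 and 120), strictly below the fractional bound of 240.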
NP-Hard and NP-Complete
Definitions:
- NP: Decision problems whose candidate solutions can be verified in polynomial time.
- NP-Hard: At least as hard as every problem in NP (not necessarily in NP itself).
- NP-Complete: Problems that are both in NP and NP-Hard.
Examples:
- SAT (Satisfiability Problem)
- Hamiltonian Path
- 0/1 Knapsack (decision version)
Important Concepts:
- P vs NP
- Reducibility among problems.
- Cook-Levin Theorem, which proves SAT is NP-Complete.
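The defining property of NP (polynomial-time verification) can be sketched for SAT: finding a satisfying assignment is the hard part, but checking a candidate assignment is linear in the formula size. Here a CNF clause is a list of ints, where k means variable k and -k its negation:

```python
def verify_sat(clauses, assignment):
    """Verify a candidate assignment for a CNF formula.
    Runs in O(total literals) -- this is why SAT is in NP."""
    return all(
        any(assignment[abs(lit)] == (lit > 0) for lit in clause)
        for clause in clauses
    )

# (x1 OR NOT x2) AND (x2 OR x3)
formula = [[1, -2], [2, 3]]
```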