Introduction to Algorithms
An algorithm is a finite sequence of well-defined instructions used to solve a problem or perform a
computation. Algorithms are essential in computer science for tasks like searching, sorting, and numerical
computations.
Characteristics of an Algorithm:
1. Input: Zero or more inputs are externally supplied.
2. Output: At least one output is produced.
3. Definiteness: Each instruction is clearly and unambiguously defined.
4. Finiteness: The algorithm terminates after a finite number of steps.
5. Effectiveness: All operations must be basic enough to be performed exactly and in a finite amount
of time.
Example: Algorithm to add two numbers:
1. Start
2. Read two numbers A and B
3. Sum = A + B
4. Display Sum
5. End
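The steps above translate directly into C; a minimal sketch (the variable names A, B and Sum follow the pseudocode):

#include <stdio.h>

int main(void) {
    int A, B;
    /* Step 2: read two numbers A and B */
    if (scanf("%d %d", &A, &B) != 2) {
        return 1;                /* stop on malformed input */
    }
    int Sum = A + B;             /* Step 3: compute the sum */
    printf("Sum = %d\n", Sum);   /* Step 4: display the sum */
    return 0;                    /* Step 5: end */
}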
Algorithm Analysis: Space & Time Complexity
• Time Complexity: The total time taken by an algorithm to run, as a function of the input size.
• Space Complexity: The total memory space required by the algorithm for its execution.
Example:
for (int i = 0; i < n; i++) {
    printf("%d ", i);   /* constant work per iteration, repeated n times */
}
Time Complexity: O(n)
Space Complexity: O(1) (only the loop counter i is stored)
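For contrast, a version that stores every value needs memory proportional to the input; a minimal sketch (the function squares and buffer buf are illustrative):

#include <stdlib.h>

void squares(int n) {
    int *buf = malloc((size_t)n * sizeof *buf);   /* O(n) extra space */
    if (buf == NULL) return;
    for (int i = 0; i < n; i++) {
        buf[i] = i * i;                           /* still O(n) time */
    }
    /* ... use buf ... */
    free(buf);
}

Time Complexity: O(n), Space Complexity: O(n)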
Growth of Functions: Big-O, Big-Ω, Big-Θ
• Big-O (O): Asymptotic upper bound on the growth rate; commonly used to describe worst-case running time.
• Big-Ω (Omega): Asymptotic lower bound; commonly used to describe best-case running time.
• Big-Θ (Theta): Tight bound; the function grows at the same rate as the bound, up to constant factors.
Example: For the function f(n) = 3n^2 + 2n + 1:
• f(n) = O(n^2)
• f(n) = Ω(n^2)
• f(n) = Θ(n^2)
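To see why only the leading term matters, note that for n ≥ 1 we have 3n^2 + 2n + 1 ≤ 3n^2 + 2n^2 + n^2 = 6n^2, so f(n) ≤ c·n^2 with c = 6 and n0 = 1, which is exactly the Big-O condition. Since f(n) ≥ 3n^2 for all n ≥ 0, the Ω bound also holds, and the two together give Θ(n^2).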
Performance Measurement
• Execution Time: Measure the running time directly with a timer (see the sketch after this list).
• Counting Primitive Operations: Count the number of basic steps as a function of the input size.
• Profiling Tools: Use tools such as gprof or Valgrind.
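A minimal sketch of the timer approach using clock() from <time.h>; the loop being timed and the input size n are placeholders:

#include <stdio.h>
#include <time.h>

int main(void) {
    long n = 100000000L;       /* input size for this measurement */
    volatile long sum = 0;     /* volatile keeps the loop from being optimized away */

    clock_t start = clock();
    for (long i = 0; i < n; i++) {
        sum += i;              /* the work being measured */
    }
    clock_t end = clock();

    printf("elapsed CPU time: %f s\n", (double)(end - start) / CLOCKS_PER_SEC);
    return 0;
}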
Example: Comparing Bubble Sort and Merge Sort:
• Bubble Sort: O(n^2)
• Merge Sort: O(n log n)
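The O(n^2) bound for Bubble Sort comes from its two nested passes over the array; a minimal sketch:

void bubble_sort(int a[], int n) {
    for (int i = 0; i < n - 1; i++) {            /* n - 1 passes over the array */
        for (int j = 0; j < n - 1 - i; j++) {    /* adjacent comparisons in each pass */
            if (a[j] > a[j + 1]) {
                int tmp = a[j];                  /* swap out-of-order neighbours */
                a[j] = a[j + 1];
                a[j + 1] = tmp;
            }
        }
    }
}

The nested loops perform roughly n^2/2 comparisons in total, which is where the quadratic bound comes from; Merge Sort avoids this by splitting the input, as captured by the recurrence discussed below.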
Asymptotic Notations
1. Big-O Notation (O): Upper bound
2. Big-Omega Notation (Ω): Lower bound
3. Big-Theta Notation (Θ): Tight bound (both upper and lower)
Examples:
• O(n^2): Bubble Sort
• Θ(n log n): Merge Sort
• Ω(1): Linear Search best case (the target is at the first position; see the sketch below)
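A linear search sketch makes the best and worst cases concrete: one comparison when the key sits at index 0, and n comparisons when it is absent.

int linear_search(const int a[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (a[i] == key) {
            return i;    /* best case: key found at index 0 after one comparison */
        }
    }
    return -1;           /* worst case: key absent, n comparisons */
}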
Introduction to Recurrence Relations
A recurrence relation is an equation that recursively defines a sequence. Recurrences arise naturally in the
analysis of divide-and-conquer algorithms.
Example: Merge Sort: T(n) = 2T(n/2) + O(n)
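The recurrence mirrors the structure of the code: two recursive calls on halves of the array (the 2T(n/2) term) plus a linear-time merge (the O(n) term). A minimal sketch, where the caller supplies a scratch array tmp of the same length as a:

#include <string.h>

/* Merge the sorted ranges a[lo..mid) and a[mid..hi) using tmp as scratch space. */
static void merge(int a[], int tmp[], int lo, int mid, int hi) {
    int i = lo, j = mid, k = lo;
    while (i < mid && j < hi)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i < mid) tmp[k++] = a[i++];
    while (j < hi)  tmp[k++] = a[j++];
    memcpy(a + lo, tmp + lo, (size_t)(hi - lo) * sizeof a[0]);   /* O(n) work */
}

/* Sort a[lo..hi): T(n) = 2T(n/2) + O(n). */
void merge_sort(int a[], int tmp[], int lo, int hi) {
    if (hi - lo < 2) return;          /* base case: 0 or 1 elements */
    int mid = lo + (hi - lo) / 2;
    merge_sort(a, tmp, lo, mid);      /* T(n/2) */
    merge_sort(a, tmp, mid, hi);      /* T(n/2) */
    merge(a, tmp, lo, mid, hi);       /* + O(n) */
}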
Substitution Method
Used to guess a solution and prove it by induction.
Steps:
1. Guess the form of the solution.
2. Use induction to prove.
Example: T(n) = 2T(n/2) + n
Guess: T(n) = O(n log n)
Proof: by mathematical induction (a worked step follows).
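One worked induction step, assuming T(k) ≤ c·k·log k for all k < n (logs base 2):
T(n) = 2T(n/2) + n
     ≤ 2·c·(n/2)·log(n/2) + n
     = c·n·(log n - 1) + n
     = c·n·log n - c·n + n
     ≤ c·n·log n   for any c ≥ 1,
which confirms the guess T(n) = O(n log n).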
Recursion Tree Method
Draw a tree of recursive calls and sum the cost at each level.
Example: T(n) = 2T(n/2) + n
The tree has about log n levels and the total cost at each level is n, so the total is O(n log n).
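Level-by-level tally for T(n) = 2T(n/2) + n:
Level 0: 1 subproblem of size n, cost n
Level 1: 2 subproblems of size n/2, cost 2 · (n/2) = n
Level 2: 4 subproblems of size n/4, cost 4 · (n/4) = n
...
Level log n: n base cases of constant cost
Each of the roughly log n levels contributes n, so the total is about n log n, i.e. O(n log n).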
Master Theorem
For recurrences of the form T(n) = aT(n/b) + f(n), with a ≥ 1 and b > 1:
• If f(n) = O(n^c) where c < log_b(a), then T(n) = Θ(n^log_b(a))
• If f(n) = Θ(n^c) where c = log_b(a), then T(n) = Θ(n^c log n)
• If f(n) = Ω(n^c) where c > log_b(a) and satisfies regularity, then T(n) = Θ(f(n))
Example: T(n) = 2T(n/2) + n → a = 2, b = 2, f(n) = n
log_b(a) = log_2(2) = 1 → f(n) = Θ(n^1) → Case 2 → T(n) = Θ(n log n)
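Two further instances, added here only to illustrate the remaining cases:
• T(n) = 4T(n/2) + n: log_2(4) = 2 and f(n) = n = O(n^1) with 1 < 2 → Case 1 → T(n) = Θ(n^2)
• T(n) = 2T(n/2) + n^2: log_2(2) = 1 and f(n) = n^2 = Ω(n^2) with 2 > 1; regularity holds because 2·(n/2)^2 = n^2/2 ≤ (1/2)·n^2 → Case 3 → T(n) = Θ(n^2)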