Design and Analysis of Algorithms
Diagram: One-dimensional array with indexed slots. An array is a contiguous block of memory storing
elements of the same type. You can access any element by its index in O(1) time. Common operations:
- Traverse: Visit all elements sequentially, O(n) time.
- Search – Linear: Check each element of an unsorted array; worst-case O(n) [13].
- Search – Binary: On a sorted array, repeatedly halve the search range; O(log n) [14] (see the sketches after this list).
- Insertion/Deletion: Inserting or removing at an arbitrary position forces every element after that point to shift, so it is O(n); e.g. deleting an element costs O(n) [15]. Inserting at the end (if space is available) can be O(1).
- 2D Arrays: Think of a matrix (rows × columns) stored in row-major order; e.g. a 3×3 matrix is a 2D array [16]. Common for grid or table data (row-major indexing is also sketched below).
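A minimal binary-search sketch in Python (the function name binary_search and the sample list are illustrative, not from the notes):

    def binary_search(arr, target):
        """Return the index of target in the sorted list arr, or -1 if absent."""
        lo, hi = 0, len(arr) - 1
        while lo <= hi:
            mid = (lo + hi) // 2          # split the search range in half
            if arr[mid] == target:
                return mid
            elif arr[mid] < target:
                lo = mid + 1              # discard the left half
            else:
                hi = mid - 1              # discard the right half
        return -1

    print(binary_search([2, 5, 8, 12, 16, 23], 16))   # prints 4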
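And a minimal sketch of row-major indexing for the 3×3 example, assuming 0-based indices (the values and the helper at are illustrative):

    # Row-major order: a 3x3 matrix flattened into one contiguous block.
    rows, cols = 3, 3
    flat = [10, 11, 12,
            20, 21, 22,
            30, 31, 32]

    def at(r, c):
        """Element (r, c) sits at offset r*cols + c in row-major storage."""
        return flat[r * cols + c]

    print(at(1, 2))   # prints 22 (row 1, column 2)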
4. Stacks (LIFO)
Diagram: Stack operations (push and pop on top). A stack is Last-In-First-Out (LIFO). Only the top element is
accessible. Operations:
- Push(x): Add element x on top (O(1)).
- Pop(): Remove the top element (O(1)).
- Peek/Top(): Read top element without removing it (O(1)).
- Applications: Expression evaluation and conversion. In postfix (Reverse Polish) notation, operators come after operands; the format is “operand operand operator” [17]. For example, to evaluate postfix 2 3 4 * +: push 2, 3, 4; pop 3 and 4 for ‘*’ and push the result 12; then pop 2 and 12 for ‘+’, yielding 14 (a worked sketch follows this list). Stacks also implement function-call history (the call stack) and undo functionality.
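A minimal postfix-evaluation sketch using a Python list as the stack, assuming space-separated integer tokens and the four basic operators (the name eval_postfix is illustrative):

    def eval_postfix(expr):
        """Evaluate a space-separated postfix (RPN) expression of integers."""
        stack = []
        ops = {'+': lambda a, b: a + b,
               '-': lambda a, b: a - b,
               '*': lambda a, b: a * b,
               '/': lambda a, b: a / b}
        for token in expr.split():
            if token in ops:
                b = stack.pop()            # right operand (pushed last)
                a = stack.pop()            # left operand
                stack.append(ops[token](a, b))
            else:
                stack.append(int(token))   # operand: push onto the stack
        return stack.pop()

    print(eval_postfix("2 3 4 * +"))       # prints 14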
6. Queues (FIFO)
Diagram: Queue (first-in-first-out) showing enqueue at rear, dequeue at front. A queue is First-In-First-Out (FIFO).
Operations:
- Enqueue(x): Insert element x at the rear/tail (O(1)).
- Dequeue(): Remove element from the front/head (O(1)).
- Traverse: Visit elements front to rear (O(n)).
- Usage: CPU process scheduling (the ready queue) and breadth-first search on graphs (a BFS sketch follows this list). A queue is FIFO: the earliest enqueued element is the first to be dequeued [20][21].
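A minimal BFS sketch using collections.deque as the queue (the adjacency-list graph and the name bfs_order are illustrative):

    from collections import deque

    def bfs_order(graph, start):
        """Return vertices in breadth-first order; graph is an adjacency-list dict."""
        visited = {start}
        queue = deque([start])             # enqueue the start vertex
        order = []
        while queue:
            v = queue.popleft()            # dequeue from the front (FIFO)
            order.append(v)
            for w in graph.get(v, []):
                if w not in visited:
                    visited.add(w)
                    queue.append(w)        # enqueue unseen neighbours at the rear
        return order

    print(bfs_order({'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': []}, 'A'))
    # prints ['A', 'B', 'C', 'D']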
Diagram: A binary tree.
- Binary Tree: Each node has at most two children (left/right) [22]. A complete binary tree has all levels full except possibly the last, which is filled left-to-right [22].
- Binary Search Tree (BST): A binary tree where left-subtree values < node < right-subtree values [23]. Allows fast search/insertion/deletion in average O(log n). Inserting a key follows the BST property (go left or right from the root until reaching a leaf). Deletion has three cases (leaf, one child, two children) that require adjusting the tree. (An insertion and traversal sketch follows this list.)
- Tree Traversals: In-order (Left→Node→Right), Pre-order (Node→Left→Right), Post-order (Left→Right→Node). In-order traversal of a BST yields sorted order [24]. (Pre-order is used for copying a tree; post-order can delete a tree or produce postfix expressions.)
- B-tree: A self-balancing multi-way search tree used in databases and file systems [25]. It generalizes the BST by allowing nodes with many children, keeping the height low. Search/insert/delete run in O(log n) [25]. All leaves stay at the same level.
- AVL Tree: A self-balancing BST where, for every node, the heights of the left and right subtrees differ by at most 1 [26]. Ensures O(log n) search/insert/delete by performing rotations to rebalance after inserts and deletions (a rotation sketch also follows this list).
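A minimal BST sketch with insertion and in-order traversal, assuming integer keys and no balancing (the names Node, insert and inorder are illustrative):

    class Node:
        def __init__(self, key):
            self.key, self.left, self.right = key, None, None

    def insert(root, key):
        """Insert key into the BST rooted at root; return the (possibly new) root."""
        if root is None:
            return Node(key)
        if key < root.key:
            root.left = insert(root.left, key)      # smaller keys go left
        else:
            root.right = insert(root.right, key)    # larger (or equal) keys go right
        return root

    def inorder(root):
        """Left -> Node -> Right: yields the keys of a BST in sorted order."""
        return inorder(root.left) + [root.key] + inorder(root.right) if root else []

    root = None
    for k in [8, 3, 10, 1, 6]:
        root = insert(root, k)
    print(inorder(root))    # prints [1, 3, 6, 8, 10]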
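And a minimal rotation sketch for the AVL left-left case (the names AVLNode, height and rotate_right are illustrative; a full AVL insert would also track balance factors):

    class AVLNode:
        def __init__(self, key, left=None, right=None):
            self.key, self.left, self.right = key, left, right

    def height(node):
        """Number of nodes on the longest root-to-leaf path; empty tree = 0."""
        return 0 if node is None else 1 + max(height(node.left), height(node.right))

    def rotate_right(y):
        """Right rotation: fixes a left-heavy (left-left) imbalance at y."""
        x = y.left
        y.left = x.right        # x's right subtree moves under y
        x.right = y             # y becomes x's right child
        return x                # x is the new subtree root

    # Inserting 3, 2, 1 in that order leaves the root left-heavy (difference 2);
    # a single right rotation restores |height(left) - height(right)| <= 1.
    root = AVLNode(3, left=AVLNode(2, left=AVLNode(1)))
    print(height(root.left) - height(root.right))             # prints 2 (unbalanced)
    root = rotate_right(root)
    print(root.key, height(root.left) - height(root.right))   # prints 2 0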
9. Asymptotic Rules
• Order-of-Growth: Focus on how runtime scales with input size. Big-O notation describes an upper bound on the growth rate [10]; Big-Θ gives tight bounds.
• Ignoring Constants: Drop constants and lower-order terms; e.g. 3n² + 10n + 7 is Θ(n²) [11].
• Combining Costs: For sequential code, add complexities (e.g. two loops: O(n) + O(n) = O(n)). For nested loops, multiply (e.g. a double loop gives O(n²) [28]). The Master Theorem or a recursion tree solves divide-and-conquer recurrences. (A loop-counting sketch follows this list.)
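A minimal sketch that contrasts consecutive and nested loops by counting a basic operation (the function names are illustrative; the step counts are the point):

    def sequential(n):
        """Two separate loops: n + n = 2n basic steps, i.e. O(n)."""
        steps = 0
        for _ in range(n):
            steps += 1
        for _ in range(n):
            steps += 1
        return steps

    def nested(n):
        """A loop inside a loop: n * n basic steps, i.e. O(n^2)."""
        steps = 0
        for _ in range(n):
            for _ in range(n):
                steps += 1
        return steps

    print(sequential(1000), nested(1000))   # prints 2000 1000000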
12. Calculating Time Complexity
• Loop Analysis: Count how many times the key operations run. A single loop from 1..n does O(n) operations. Nested loops multiply (e.g. two nested loops give ~n² operations [28]); consecutive loops add (e.g. two separate O(n) loops give O(2n) = O(n)) [35].
• Recurrences: For recursive algorithms, set up a recurrence (e.g. T(n) = T(n/2) + O(1) ⇒ O(log n)) or apply the Master Theorem for divide-and-conquer (a merge sort sketch with its recurrence follows this list).
• Combine Costs: Sum or multiply as above, then apply big-O simplifications (drop constants and lower-order terms).
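A minimal merge sort sketch to show how a recurrence is set up: the two recursive calls plus the linear merge give T(n) = 2T(n/2) + O(n), which the Master Theorem resolves to O(n log n) (the name merge_sort and the sample list are illustrative):

    def merge_sort(a):
        """Sort list a; recurrence T(n) = 2*T(n/2) + O(n), i.e. O(n log n)."""
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left = merge_sort(a[:mid])     # T(n/2)
        right = merge_sort(a[mid:])    # T(n/2)
        merged, i, j = [], 0, 0        # merge step: O(n)
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]

    print(merge_sort([5, 2, 9, 1, 7]))   # prints [1, 2, 5, 7, 9]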
[1] Algorithm - Wikipedia. https://en.wikipedia.org/wiki/Algorithm
[25] B-tree - Wikipedia. https://en.wikipedia.org/wiki/B-tree
[27] Heap Data Structure | GeeksforGeeks. https://www.geeksforgeeks.org/heap-data-structure/
[31][32] Difference between 0/1 Knapsack problem and Fractional Knapsack problem | GeeksforGeeks. https://www.geeksforgeeks.org/difference-between-0-1-knapsack-problem-and-fractional-knapsack-problem/
[40] The Algorithms Behind The Working Of Google Maps | by Sachin Singh | Medium. https://medium.com/@sachin28/the-algorithms-behind-the-working-of-google-maps-73c379bcc9b9