DSA-Full Presentation
• INTRODUCTION TO DATA STRUCTURES AND ALGORITHMS
• APPENDIX
BASIC DATA STRUCTURES - ARRAYS
Definition and Importance:
• Array: a collection of elements stored at contiguous memory locations and accessed by index.
Operations (illustrated in the sketch below):
• Access: O(1)
• Insertion/Deletion at the end: O(1)
• Insertion/Deletion at arbitrary positions: O(n)
array = [1, 2, 3, 4, 5]
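A short illustrative sketch (not from the slides) of these operations on a Python list, with the cost of each call noted:

# the list from the slide above
array = [1, 2, 3, 4, 5]
print(array[2])     # access by index: O(1)
array.append(6)     # insertion at the end: amortised O(1)
array.insert(0, 0)  # insertion at an arbitrary position: O(n), later elements shift right
array.pop(2)        # deletion at an arbitrary position: O(n)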
BASIC DATA STRUCTURES - LINKED LISTS
• Singly Linked List: Each node has a single pointer to the next node.
• Doubly Linked List: Each node has pointers to both the next and previous nodes.
Operations (see the sketch after this list):
• Traversal: Visiting each node starting from the head.
• Insertion: Adding a node at the beginning, end, or a specific position.
• Deletion: Removing a node from the beginning, end, or a specific position.
• Searching: Finding a node with a specific value.
• Access: O(n)
• Insertion/Deletion at the head: O(1)
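A minimal singly linked list sketch consistent with the operations listed above; the Node and SinglyLinkedList names are illustrative, not from the slides:

class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

class SinglyLinkedList:
    def __init__(self):
        self.head = None

    def insert_at_head(self, value):  # O(1)
        node = Node(value)
        node.next = self.head
        self.head = node

    def search(self, value):          # O(n): traverse from the head
        current = self.head
        while current:
            if current.value == value:
                return True
            current = current.next
        return False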
BASIC DATA STRUCTURES - QUEUES
• First in, first out (FIFO): the first element added to the queue is the first one to be removed.
• Implementation: commonly a linked list or a double-ended array (see the sketch below).
• Operations: enqueue (add at the rear), dequeue (remove from the front), peek (inspect the front).
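A minimal FIFO sketch using collections.deque; the choice of deque is an assumption, not specified in the slides:

from collections import deque

queue = deque()
queue.append('a')       # enqueue: O(1)
queue.append('b')
print(queue.popleft())  # dequeue: O(1), prints 'a' (first in, first out)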
BASIC DATA STRUCTURES - TREES
Binary Trees and Binary Search Trees:
• Binary Tree: A tree where each node has at most two children.
• Binary Search Tree: A binary tree where every node satisfies the BST property (left child < parent < right child).
Tree Traversal:
• In-order, pre-order, and post-order (depth-first), plus level-order (breadth-first); see the sketch below.
Balanced Trees:
• Trees such as AVL and red-black trees keep their height O(log n), so search, insertion, and deletion stay O(log n).
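A minimal binary search tree sketch showing the BST property and an in-order traversal; the TreeNode, bst_insert, and in_order names are illustrative, not from the slides:

class TreeNode:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def bst_insert(root, value):
    # smaller values go left, larger or equal values go right
    if root is None:
        return TreeNode(value)
    if value < root.value:
        root.left = bst_insert(root.left, value)
    else:
        root.right = bst_insert(root.right, value)
    return root

def in_order(root):
    # in-order traversal of a BST visits the values in sorted order
    if root:
        in_order(root.left)
        print(root.value)
        in_order(root.right)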
• Representation:
o Adjacency list: A list where each element represents a node and its neighbors.
o Adjacency matrix: A matrix where matrix[i][j] represents the edge between nodes i and j.
• Graph traversal (sketched below):
o Breadth-first search (BFS): visits all nodes at the current depth before moving on to nodes at the next depth.
o Depth-first search (DFS): explores as far as possible along each branch before backtracking.
• Applications: shortest paths, connectivity, and network analysis.
• Time complexity:
o BFS, DFS: O(V + E), where V is the number of vertices and E is the number of edges.
PYTHON CODE FOR GRAPH TRAVERSAL
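A minimal sketch of both traversals over an adjacency-list graph; the Graph class and its add_edge method are assumptions for illustration, not code from the slides:

from collections import deque

class Graph:
    def __init__(self):
        self.adj = {}

    def add_edge(self, u, v):
        # undirected edge between u and v
        self.adj.setdefault(u, []).append(v)
        self.adj.setdefault(v, []).append(u)

def bfs(g, start):
    visited = set()
    queue = deque([start])
    while queue:
        vertex = queue.popleft()
        if vertex not in visited:
            visited.add(vertex)
            print(vertex)
            queue.extend(g.adj.get(vertex, []))

def dfs(g, vertex, visited=None):
    if visited is None:
        visited = set()
    visited.add(vertex)
    print(vertex)
    for neighbour in g.adj.get(vertex, []):
        if neighbour not in visited:
            dfs(g, neighbour, visited)

# example usage
g = Graph()
g.add_edge(0, 1)
g.add_edge(1, 2)
g.add_edge(2, 3)
print("BFS:")
bfs(g, 0)
print("DFS:")
dfs(g, 0)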
• Trie (prefix tree):
o Applications:
§ Auto-complete features: suggesting words as you type (see the sketch below).
• Bloom filter:
o Definition: A space-efficient probabilistic data structure used to test whether an element is a member of a set.
o Applications:
§ Fast membership tests where occasional false positives are acceptable, e.g. caches and spell checkers.
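A minimal trie sketch for the auto-complete use case above; the TrieNode and Trie names are illustrative, not from the slides:

class TrieNode:
    def __init__(self):
        self.children = {}
        self.is_word = False

class Trie:
    def __init__(self):
        self.root = TrieNode()

    def insert(self, word):
        node = self.root
        for ch in word:
            node = node.children.setdefault(ch, TrieNode())
        node.is_word = True

    def starts_with(self, prefix):
        # returns True if any stored word begins with the given prefix
        node = self.root
        for ch in prefix:
            if ch not in node.children:
                return False
            node = node.children[ch]
        return True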
ALGORITHMS - SORTING ALGORITHMS
o Bubble sort:
§ Time complexity: O(n^2) in the worst and average cases, O(n) in the best case (already sorted).
o Selection sort:
§ Description: divides the input list into two parts: a sorted sublist built up from left to right and an unsorted sublist of the remaining items.
o Insertion sort:
§ Description: builds the final sorted array one item at a time (see the sketch after this list).
§ Time complexity: O(n^2) in the worst and average cases, O(n) in the best case (already sorted).
o Merge sort:
§ Description: a divide-and-conquer algorithm that splits the array in half, sorts each half, and merges the sorted halves.
§ Time complexity: O(n log n) in all cases.
o Quick sort:
§ Description: a divide-and-conquer algorithm that selects a 'pivot' element and partitions the array around the pivot.
§ Time complexity: O(n log n) in the average case, O(n^2) in the worst case (poor pivot selection).
o Heap sort:
§ Description: builds a max-heap from the input and repeatedly extracts the maximum element.
§ Time complexity: O(n log n).
o Counting sort:
§ Description: exploits the fact that the input elements are integers in a certain range.
o Radix sort:
§ Time complexity: O(d(n + k)), where d is the number of digits and k is the range of input.
o Bucket sort:
§ Time complexity: O(n + k) in the average case, O(n^2) in the worst case.
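A minimal insertion sort sketch matching the description above; illustrative only, not code from the slides:

def insertion_sort(arr):
    # build the sorted portion one item at a time
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # shift larger elements one position to the right
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key
    return arr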
Depth-first search (DFS):
• Description: starts at the root node and explores as far as possible along each branch before backtracking.
• Time complexity: O(V + E).
Shortest path algorithms:
• Dijkstra's algorithm:
o Description: finds the shortest paths from a single source vertex to all other vertices in a graph with non-negative edge weights (see the sketch after this list).
o Time complexity: O((V + E) log V).
• Bellman-Ford algorithm:
o Description: finds the shortest paths from a single source vertex to all other vertices in a graph and can handle negative edge weights.
o Time complexity: O(VE).
• Floyd-Warshall algorithm:
o Description: finds the shortest paths between all pairs of vertices in a graph.
o Time complexity: O(V^3).
Minimum spanning tree algorithms:
• Kruskal's algorithm:
o Description: finds the minimum spanning tree by adding edges in ascending order of their weights.
o Time complexity: O(E log E).
• Prim's algorithm:
o Description: finds the minimum spanning tree by starting with an arbitrary vertex and adding the minimum-weight edge that connects a vertex inside the tree with a vertex outside the tree.
o Time complexity: O((V + E) log V).
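dijkstra is called in the example usage further below, but its code is not shown in this extract; a minimal sketch, assuming the same adjacency-dict graph format used by the other graph functions:

import heapq

def dijkstra(graph, start):
    distances = {vertex: float('infinity') for vertex in graph}
    distances[start] = 0
    heap = [(0, start)]  # min-heap of (distance, vertex)
    while heap:
        dist, u = heapq.heappop(heap)
        if dist > distances[u]:
            continue  # stale heap entry
        for v, w in graph[u].items():
            if dist + w < distances[v]:
                distances[v] = dist + w
                heapq.heappush(heap, (distances[v], v))
    return distances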
TIME & SPACE COMPLEXITY FOR GRAPH ALGORITHMS
Algorithm            Time complexity      Space complexity
BFS / DFS            O(V + E)             O(V)
Dijkstra             O((V + E) log V)     O(V + E)
Bellman-Ford         O(VE)                O(V)
Floyd-Warshall       O(V^3)               O(V^2)
Kruskal              O(E log E)           O(V + E)
Prim                 O((V + E) log V)     O(V + E)
PYTHON CODE FOR GRAPH ALGORITHM (2)
def bellman_ford(graph, start):
    # relax every edge |V| - 1 times
    distances = {vertex: float('infinity') for vertex in graph}
    distances[start] = 0
    for _ in range(len(graph) - 1):
        for u in graph:
            for v, w in graph[u].items():
                if distances[u] + w < distances[v]:
                    distances[v] = distances[u] + w
    return distances
def floyd_warshall(graph):
    vertices = list(graph)
    index = {v: i for i, v in enumerate(vertices)}
    n = len(vertices)
    distances = [[float('infinity') for _ in range(n)] for _ in range(n)]
    for u in vertices:
        distances[index[u]][index[u]] = 0
        for v, w in graph[u].items():
            distances[index[u]][index[v]] = w
    # try every vertex k as an intermediate point on the path from i to j
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if distances[i][k] + distances[k][j] < distances[i][j]:
                    distances[i][j] = distances[i][k] + distances[k][j]
    return distances
PYTHON CODE FOR GRAPH ALGORITHM (3-4)
def kruskal(graph):
    # graph is a list of (u, v, weight) edges
    def find_parent(parent, i):
        if parent[i] == i:
            return i
        return find_parent(parent, parent[i])

    def union(parent, rank, x, y):
        xroot = find_parent(parent, x)
        yroot = find_parent(parent, y)
        if rank[xroot] < rank[yroot]:
            parent[xroot] = yroot
        elif rank[xroot] > rank[yroot]:
            parent[yroot] = xroot
        else:
            parent[yroot] = xroot
            rank[xroot] += 1

    result = []  # edges of the minimum spanning tree
    i = 0        # index into the sorted edge list
    e = 0        # number of edges accepted so far
    graph = sorted(graph, key=lambda item: item[2])
    parent = {}
    rank = {}
    # every vertex starts as its own set
    for u, v, _ in graph:
        parent[u] = u
        rank[u] = 0
        parent[v] = v
        rank[v] = 0
    vertices = len(parent)
    # a spanning tree of a connected graph has (vertices - 1) edges
    while e < vertices - 1 and i < len(graph):
        u, v, w = graph[i]
        i += 1
        x = find_parent(parent, u)
        y = find_parent(parent, v)
        if x != y:
            e += 1
            result.append([u, v, w])
            union(parent, rank, x, y)
    return result
PYTHON CODE FOR GRAPH ALGORITHM (5)
import heapq

def prim(graph, start):
    mst = []
    visited = set()
    edges = [(0, start)]  # min-heap of (weight, vertex) pairs
    while edges:
        weight, u = heapq.heappop(edges)
        if u in visited:
            continue
        visited.add(u)
        mst.append((u, weight))  # vertex and the weight of the edge that reached it
        for v, w in graph[u].items():
            if v not in visited:
                heapq.heappush(edges, (w, v))
    return mst
PYTHON CODE FOR GRAPH ALGORITHM (6)
# example usage
graph = {
    'a': {'b': 1, 'c': 4},
    'b': {'a': 1, 'c': 2, 'd': 5},
    'c': {'a': 4, 'b': 2, 'd': 1},
    'd': {'b': 5, 'c': 1}
}
print("dijkstra's algorithm:", dijkstra(graph, 'a'))
print("bellman-ford algorithm:", bellman_ford(graph, 'a'))
print("floyd-warshall algorithm:", floyd_warshall(graph))
graph = [
    ('a', 'b', 1),
    ('a', 'c', 4),
    ('b', 'c', 2),
    ('b', 'd', 5),
    ('c', 'd', 1)
]
print("kruskal's algorithm:", kruskal(graph))
PYTHON CODE FOR GRAPH ALGORITHM (7)
graph = {
    'a': {'b': 1, 'c': 4},
    'b': {'a': 1, 'c': 2, 'd': 5},
    'c': {'a': 4, 'b': 2, 'd': 1},
    'd': {'b': 5, 'c': 1}
}
print("prim's algorithm:", prim(graph, 'a'))
ALGORITHMS - DYNAMIC PROGRAMMING
Memoization vs. Tabulation:
• Memoization: top-down; caches the results of recursive calls so each subproblem is solved once (see the sketch below).
• Tabulation: bottom-up; iteratively fills a table of subproblem results.
Key Problems:
• Knapsack Problem
• Longest Common Subsequence
• Coin Change Problem
• Fibonacci Sequence
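A short sketch contrasting the two approaches on the Fibonacci sequence; the tabulated version appears in the code slides below, so only a memoized variant is sketched here (fib_memo is an illustrative name, not from the slides):

from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    # top-down: recursive calls are cached, so each subproblem is computed once
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)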
PYTHON CODE FOR DYNAMIC PROGRAMMING
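knapsack and longest_common_subsequence are called in the example usage below, but their definitions are not shown in this extract; minimal tabulation sketches consistent with those calls:

def knapsack(values, weights, capacity):
    n = len(values)
    # dp[i][c] = best value using the first i items with capacity c
    dp = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for c in range(capacity + 1):
            dp[i][c] = dp[i - 1][c]
            if weights[i - 1] <= c:
                dp[i][c] = max(dp[i][c], dp[i - 1][c - weights[i - 1]] + values[i - 1])
    return dp[n][capacity]

def longest_common_subsequence(s1, s2):
    m, n = len(s1), len(s2)
    # dp[i][j] = length of the LCS of s1[:i] and s2[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if s1[i - 1] == s2[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]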
PYTHON CODE FOR DYNAMIC PROGRAMMING
def coin_change(coins, amount):
    # dp[i] = minimum number of coins needed to make amount i
    dp = [float('inf')] * (amount + 1)
    dp[0] = 0
    for coin in coins:
        for i in range(coin, amount + 1):
            dp[i] = min(dp[i], dp[i - coin] + 1)
    return dp[amount] if dp[amount] != float('inf') else -1

def fibonacci(n):
    if n == 0:
        return 0
    # dp[i] = i-th Fibonacci number, filled bottom-up
    dp = [0] * (n + 1)
    dp[1] = 1
    for i in range(2, n + 1):
        dp[i] = dp[i - 1] + dp[i - 2]
    return dp[n]
PYTHON CODE FOR DYNAMIC PROGRAMMING
# example usage
values = [60, 100, 120]
weights = [10, 20, 30]
capacity = 50
print("knapsack problem:", knapsack(values, weights, capacity))
s1 = "abcb"
s2 = "bdcab"
print("longest common subsequence:", longest_common_subsequence(s1, s2))
coins = [1, 2, 5]
amount = 11
print("coin change problem:", coin_change(coins, amount))
n = 10
print("fibonacci sequence:", fibonacci(n))
ALGORITHMS - GREEDY ALGORITHMS
• Principle:
o Description: make the locally optimal choice at each stage with the hope of finding a global optimum.
• Applications:
o Fractional knapsack problem (sketched below), activity selection, Huffman coding.
PYTHON CODE FOR GREEDY ALGORITHM
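fractional_knapsack is called in the example usage below but is not defined in this extract; a minimal greedy sketch consistent with the call fractional_knapsack(values, weights, capacity):

def fractional_knapsack(values, weights, capacity):
    # greedy choice: take items in decreasing order of value per unit weight
    items = sorted(zip(values, weights), key=lambda item: item[0] / item[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)   # take the whole item, or a fraction of it
        total += value * (take / weight)
        capacity -= take
    return total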
# example usage
values = [60, 100, 120]
weights = [10, 20, 30]
capacity = 50
print("fractional knapsack problem:", fractional_knapsack(values, weights,
capacity))
ALGORITHMS - BACKTRACKING
• Principle:
o Description: incrementally builds candidates to the solutions, and abandons a candidate ("backtracks") as soon as it is determined that the candidate cannot possibly lead to a valid solution.
o Applications:
§ N-queens problem: placing N queens on an N×N chessboard such that no two queens attack each other.
§ Sudoku solver: filling a 9×9 grid with digits so that each column, each row, and each of the nine 3×3 subgrids that compose the grid contain all of the digits from 1 to 9.
PYTHON CODE FOR BACKTRACKING ALGORITHM
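solve_n_queens below relies on an is_safe helper that is not shown in this extract; a standard version, assuming queens are placed one column at a time from left to right:

def is_safe(board, row, col):
    n = len(board)
    # check the row to the left of this column
    for j in range(col):
        if board[row][j] == 1:
            return False
    # check the upper-left diagonal
    for i, j in zip(range(row - 1, -1, -1), range(col - 1, -1, -1)):
        if board[i][j] == 1:
            return False
    # check the lower-left diagonal
    for i, j in zip(range(row + 1, n), range(col - 1, -1, -1)):
        if board[i][j] == 1:
            return False
    return True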
def solve_n_queens(board, col):
    # all columns filled: a complete placement has been found
    if col >= len(board):
        return True
    for i in range(len(board)):
        if is_safe(board, i, col):
            board[i][col] = 1
            if solve_n_queens(board, col + 1):
                return True
            board[i][col] = 0  # backtrack
    return False

def print_solution(board):
    for row in board:
        print(" ".join(str(col) for col in row))

# example usage
n = 4
board = [[0] * n for _ in range(n)]
if not solve_n_queens(board, 0):
    print("solution does not exist")
else:
    print_solution(board)
ALGORITHMS - DIVIDE AND CONQUER
• Principle:
o Description: divide the problem into smaller subproblems, solve each subproblem recursively, and combine their solutions.
• Applications:
o Merge sort, quick sort, and binary search (merge_sort and quick_sort are sketched below).
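merge_sort and quick_sort are called in the example usage below but are not defined in this extract; minimal sketches consistent with those calls (merge_sort sorts in place, quick_sort returns a new list):

def merge_sort(arr):
    # split, recursively sort each half, then merge back into arr
    if len(arr) <= 1:
        return
    mid = len(arr) // 2
    left, right = arr[:mid], arr[mid:]
    merge_sort(left)
    merge_sort(right)
    i = j = k = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            arr[k] = left[i]
            i += 1
        else:
            arr[k] = right[j]
            j += 1
        k += 1
    arr[k:] = left[i:] + right[j:]  # copy whatever remains in either half

def quick_sort(arr):
    # partition around a pivot and recurse on the two sides
    if len(arr) <= 1:
        return arr
    pivot = arr[len(arr) // 2]
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)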
# example usage
arr = [3, 6, 2, 7, 1]
merge_sort(arr)
print("merge sort:", arr)
arr = [3, 6, 2, 7, 1]
print("quick sort:", quick_sort(arr))
• PSEUDOCODE:
o Description: A high-level description of an algorithm that uses a combination of natural language and programming language-like constructs.
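binary_search is called in the example usage below but is not defined in this extract; a minimal iterative sketch that returns the index of the target, or -1 if it is absent:

def binary_search(arr, target):
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (low + high) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            low = mid + 1   # target is in the right half
        else:
            high = mid - 1  # target is in the left half
    return -1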
# example usage
arr = [1, 3, 5, 7, 9]
print("binary search:", binary_search(arr, 7))
• Linked list: a linear collection of data elements whose order is not given by their physical placement in memory.
• Stack: a linear data structure that follows the last in, first out (LIFO) principle.
• Queue: a linear data structure that follows the first in, first out (FIFO) principle.
• Hash table: a data structure that implements an associative array abstract data type, a structure that can map keys to values.
• Graph: a data structure that consists of a set of vertices (or nodes) and a set of edges that connect pairs of nodes.
• Heap: a specialized tree-based data structure that satisfies the heap property.
• Bloom filter: a space-efficient probabilistic data structure used to test whether an element is a member of a set.
• Dynamic programming: a method for solving complex problems by breaking them down into simpler subproblems.
• Greedy algorithms: algorithms that make the locally optimal choice at each stage with the hope of finding a global optimum.
• Backtracking: a general algorithm for finding all solutions to some computational problems.
Additional Resources
• Online courses:
o Coursera
o edX
o Udemy
• YouTube channels:
o MIT OpenCourseWare
o Computerphile
o AlgoRythmics
• Interactive platforms:
o LeetCode
o HackerRank
o Codeforces
• Books:
o "Introduction to Algorithms" by Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest, and Clifford Stein.
o "Algorithms" by Robert Sedgewick and Kevin Wayne.