
1 Introduction to DSA in Python

What are Data Structures and Algorithms (DSA)?

DSA is the combination of:
• Data Structures – ways to store and organize data (e.g., arrays, linked lists, trees).
• Algorithms – step-by-step procedures for solving problems efficiently (e.g., sorting, searching).

Importance of DSA
• Problem-Solving: Helps break down problems into efficient steps.
• Competitive Programming: Optimized solutions give an edge in coding contests.
• System Design: Used in software development for scalable and optimized solutions.

Python for DSA

Python's standard library provides modules for efficient DSA implementations (a short sketch of each follows the list):
• collections (deque, Counter, OrderedDict)
• heapq (min-heap utilities for priority queues; a max-heap can be simulated by negating values)
• bisect (binary search utilities)
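
A minimal usage sketch of these three modules (the values here are illustrative):

from collections import deque, Counter
import heapq
import bisect

# deque: O(1) appends and pops at both ends
dq = deque([1, 2, 3])
dq.appendleft(0)                    # deque([0, 1, 2, 3])
dq.pop()                            # removes 3

# Counter: frequency table in one call
print(Counter("banana"))            # Counter({'a': 3, 'n': 2, 'b': 1})

# heapq: list-based min-heap for priority queues
heap = [5, 1, 4]
heapq.heapify(heap)
heapq.heappush(heap, 0)
print(heapq.heappop(heap))          # 0 (smallest element first)

# bisect: binary search position in a sorted list
sorted_arr = [1, 3, 5, 7]
print(bisect.bisect_left(sorted_arr, 5))  # 2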


#1. Variables and Data Types

x = 10           # Integer
y = 3.14         # Float
name = "DSA"     # String
arr = [1, 2, 3]  # List
print(x, y, name, arr)

#2. Functions and Recursion

def factorial(n):
    if n == 0:
        return 1
    else:
        return n * factorial(n - 1)

print(factorial(5))

#3. Object-Oriented Programming (OOP)

class Node:
    def __init__(self, value):
        self.value = value
        self.next = None  # Pointer to the next node

node1 = Node(5)
print(node1.value)  # Output: 5

#4. Exception Handling

try:
    num = int(input("Enter a number: "))
    print(10 / num)
except ZeroDivisionError:
    print("Cannot divide by zero!")
except ValueError:
    print("Invalid input!")

Sample output from the four snippets above (entering 0 at the prompt):

10 3.14 DSA [1, 2, 3]
120
5
Enter a number: 0
Cannot divide by zero!

---

2 Complexity Analysis
Complexity analysis helps determine how efficiently an algorithm performs in terms of
time (speed) and space (memory usage).

2.1 Time Complexity


What is Time Complexity?
Time complexity describes how an algorithm's running time grows with the input size (N).

Asymptotic Notation (Used for Time Complexity Analysis)


Asymptotic Notation describes the growth of an algorithm’s runtime as the input size
increases.

1. Big O (O) - Worst Case Complexity
• Gives an upper bound: the maximum time an algorithm can take.
• Example: searching for an element in an unsorted list is O(N).

2. Big Theta (Θ) - Average Case Complexity
• Gives a tight bound: the expected complexity when an algorithm behaves the same way for most inputs.
• Example: QuickSort's average complexity is Θ(N log N).

3. Big Omega (Ω) - Best Case Complexity
• Gives a lower bound: the minimum time an algorithm can take.
• Example: Binary Search on a sorted array finds the target on the first probe in the best case, Ω(1).

(Strictly speaking, O, Θ, and Ω are upper, tight, and lower bounds on a function's growth; pairing them with worst, average, and best cases, as above, is a common convention.)
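
To make the cases concrete, here is a linear search annotated with its best- and worst-case bounds (a small illustrative sketch, not one of the original examples):

def linear_search_index(arr, target):
    for i, value in enumerate(arr):
        if value == target:
            return i   # Best case: first element matches -> Ω(1)
    return -1          # Worst case: every element checked -> O(N)

arr = [7, 2, 9, 4]
print(linear_search_index(arr, 7))   # 0 -> found on the first comparison
print(linear_search_index(arr, 99))  # -1 -> all N elements scanned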

Common Time Complexities and Their Meanings

• O(1) – Constant Time: does not depend on input size. Example: accessing an element in an array.
• O(log N) – Logarithmic Time: reduces the problem size at each step. Example: Binary Search.
• O(N) – Linear Time: grows proportionally with input size. Example: iterating over an array.
• O(N log N) – Linearithmic Time: common in sorting algorithms. Example: Merge Sort, QuickSort (average case).
• O(N²) – Quadratic Time: nested loops. Example: Bubble Sort, Selection Sort.
• O(2ⁿ) – Exponential Time: work roughly doubles with each additional input element. Example: naive recursive Fibonacci.
• O(N!) – Factorial Time: grows extremely fast. Example: brute-force Travelling Salesman Problem.

How to Analyze Time Complexity of an Algorithm?


• Count the number of operations inside loops, recursions, or functions.
• Ignore constant factors and lower-order terms (e.g., O(2N) → O(N)).
• Choose the dominant term (e.g., O(N + log N) simplifies to O(N)).
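
A worked example of these rules (an illustrative function): the single loop costs O(N), the nested loops cost O(N²), and the dominant term wins, so O(N + N²) simplifies to O(N²).

def analyze_me(arr):
    n = len(arr)
    total = 0
    for x in arr:                  # Runs N times -> O(N)
        total += x
    for i in range(n):             # Nested loops: N * N steps -> O(N^2)
        for j in range(n):
            total += arr[i] * arr[j]
    return total                   # Overall: O(N + N^2) -> O(N^2)

print(analyze_me([1, 2, 3]))  # Output: 42 (sum 6 + sum of all pairwise products 36)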

Examples of Time Complexities in Python

1️⃣ O(1) - Constant Time

def get_first_element(arr):
    return arr[0]  # Always takes constant time

arr = [10, 20, 30, 40]
print(get_first_element(arr))  # Output: 10

2️⃣ O(log N) - Logarithmic Time (Binary Search Example)


import bisect

arr = [1, 3, 5, 7, 9]
index = bisect.bisect_left(arr, 5) # Finds position of 5
print(index) # Output: 2
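
bisect hides the halving logic, so here is a hand-written binary search (a sketch, with a step counter added) that makes the O(log N) behavior visible:

def binary_search(arr, target):
    lo, hi = 0, len(arr) - 1
    steps = 0
    while lo <= hi:
        steps += 1
        mid = (lo + hi) // 2       # Halve the search range each step
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, steps

arr = list(range(1, 1025))         # 1024 sorted elements
print(binary_search(arr, 1))       # (0, 10) -> about log2(1024) = 10 steps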

3️⃣ O(N) - Linear Time (Looping through a list)

def linear_search(arr, target):
    for i in arr:
        if i == target:
            return True
    return False

print(linear_search([1, 2, 3, 4, 5], 3))  # Output: True

4️⃣ O(N log N) - Linearithmic Time (Merge Sort Example)

def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    return merge(left, right)

def merge(left, right):
    sorted_list = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] < right[j]:
            sorted_list.append(left[i])
            i += 1
        else:
            sorted_list.append(right[j])
            j += 1
    sorted_list.extend(left[i:])   # Append any leftovers from either half
    sorted_list.extend(right[j:])
    return sorted_list

arr = [3, 1, 4, 1, 5, 9]
print(merge_sort(arr))  # Output: [1, 1, 3, 4, 5, 9]

5️⃣ O(N²) - Quadratic Time (Bubble Sort Example)

def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]  # Swap adjacent elements

arr = [5, 3, 8, 1]
bubble_sort(arr)
print(arr)  # Output: [1, 3, 5, 8]
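
6️⃣ O(2ⁿ) - Exponential Time (Recursive Fibonacci Example)

This example is a sketch added to match the table's O(2ⁿ) row; each call spawns two more calls, so the work roughly doubles per level of recursion.

def fib(n):
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)  # Two recursive calls per level -> O(2^N)

print(fib(10))  # Output: 55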

2.2 Space Complexity


What is Space Complexity?
Space complexity measures how much memory an algorithm uses, including:
• Input storage
• Auxiliary variables
• Recursive call stack

Memory Usage by Variables, Lists, and Recursion

📌 Constant Space O(1) Example


def add_numbers(a, b):
    return a + b  # Uses only a fixed amount of memory

print(add_numbers(3, 5))  # Output: 8

📌 Linear Space O(N) Example (Using a List)


def store_numbers(n):
    arr = []
    for i in range(n):
        arr.append(i)  # The list grows with n -> O(N) extra space
    return arr

print(store_numbers(5))  # Output: [0, 1, 2, 3, 4]

📌 Recursive Function with O(N) Space Complexity


def factorial(n):
    if n == 0:
        return 1
    return n * factorial(n - 1)  # Each call adds a frame to the call stack

print(factorial(5))  # Output: 120

Trade-Offs Between Time and Space Complexity

• Iterative over Recursive – reduces space complexity. Example: convert recursive Fibonacci to iterative.
• Use In-Place Sorting – reduces space complexity. Example: QuickSort (in-place) vs. Merge Sort (extra space).
• Precompute Values (Memoization) – reduces time complexity. Example: caching Fibonacci results.

The first and third strategies are sketched below.
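
A minimal sketch of the iterative and memoized approaches (illustrative implementations):

from functools import lru_cache

# Iterative Fibonacci: O(N) time, O(1) space (no growing call stack)
def fib_iterative(n):
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# Memoized Fibonacci: caching results cuts O(2^N) time down to O(N)
@lru_cache(maxsize=None)
def fib_memo(n):
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_iterative(30))  # Output: 832040
print(fib_memo(30))       # Output: 832040 (each subproblem computed once)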

Final Notes
🔹 Time Complexity measures how execution time grows with input size.
🔹 Space Complexity measures memory usage.
🔹 Optimizing both leads to efficient algorithms in DSA. 🚀
