Implementing Recursive and Non-Recursive Algorithms and Studying the Order of
Growth from O(log₂ n) to O(n!)
Understanding growth rates from logarithmic to factorial is crucial in algorithm design and
analysis. Here is how we can implement algorithms whose running times exhibit each of these
growth rates:
1. Logarithmic Growth: O(log₂ n)
Example: Binary Search
Recursive Version:
def recursive_binary_search(arr, low, high, target):
    if high >= low:
        mid = (high + low) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] > target:
            return recursive_binary_search(arr, low, mid - 1, target)
        else:
            return recursive_binary_search(arr, mid + 1, high, target)
    else:
        return -1
Iterative Version:
def iterative_binary_search(arr, target):
    low, high = 0, len(arr) - 1
    while low <= high:
        mid = (high + low) // 2
        if arr[mid] == target:
            return mid
        elif arr[mid] > target:
            high = mid - 1
        else:
            low = mid + 1
    return -1
Growth Rate: O(log₂ n) – The algorithm halves the search space at each step.
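As a quick sanity check of that claim, the illustrative sketch below (the counting wrapper and sample data are assumptions, not part of the algorithm itself) reports how many times the iterative version halves the search range; for 1,000 elements it never needs more than about log₂ 1000 ≈ 10 steps.
def binary_search_with_count(arr, target):
    # Same loop as iterative_binary_search, but also counts the halving steps.
    low, high, steps = 0, len(arr) - 1, 0
    while low <= high:
        steps += 1
        mid = (high + low) // 2
        if arr[mid] == target:
            return mid, steps
        elif arr[mid] > target:
            high = mid - 1
        else:
            low = mid + 1
    return -1, steps

sorted_data = list(range(1000))
print(binary_search_with_count(sorted_data, 737))   # (737, 8): found after 8 halving steps, well under ~10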
2. Linear Growth: O(n)
Example: Linear Search
Recursive Version:
def recursive_linear_search(arr, target, index=0):
    if index == len(arr):
        return -1
    if arr[index] == target:
        return index
    return recursive_linear_search(arr, target, index + 1)
Iterative Version:
def iterative_linear_search(arr, target):
    for i in range(len(arr)):
        if arr[i] == target:
            return i
    return -1
Growth Rate: O(n) – The algorithm checks each element sequentially.
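A small usage sketch (the sample list and targets are arbitrary) makes the linear bound concrete: a target at the end of the list, or one that is absent, costs n comparisons.
def linear_search_with_count(arr, target):
    # Illustrative variant of iterative_linear_search that also reports how many elements were examined.
    for count, value in enumerate(arr, start=1):
        if value == target:
            return count - 1, count
    return -1, len(arr)

data = [4, 8, 15, 16, 23, 42]
print(linear_search_with_count(data, 42))   # (5, 6): found at index 5 after examining all 6 elements
print(linear_search_with_count(data, 99))   # (-1, 6): a missing target still costs n comparisons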
3. Log-Linear Growth: O(n log₂ n)
Example: Merge Sort
Recursive Version:
def merge_sort(arr):
    if len(arr) > 1:
        mid = len(arr) // 2
        L = arr[:mid]
        R = arr[mid:]
        # Sort each half recursively.
        merge_sort(L)
        merge_sort(R)
        # Merge the two sorted halves back into arr.
        i = j = k = 0
        while i < len(L) and j < len(R):
            if L[i] < R[j]:
                arr[k] = L[i]
                i += 1
            else:
                arr[k] = R[j]
                j += 1
            k += 1
        # Copy any remaining elements of L or R.
        while i < len(L):
            arr[k] = L[i]
            i += 1
            k += 1
        while j < len(R):
            arr[k] = R[j]
            j += 1
            k += 1
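Iterative Version (a non-recursive sketch): merge sort is usually presented recursively; one possible bottom-up formulation, which merges sorted runs of width 1, 2, 4, … and keeps the same O(n log₂ n) growth, is:
def merge_sort_iterative(arr):
    n = len(arr)
    width = 1
    while width < n:
        # Merge adjacent runs arr[start:start+width] and arr[start+width:start+2*width].
        for start in range(0, n, 2 * width):
            left = arr[start:start + width]
            right = arr[start + width:start + 2 * width]
            merged, i, j = [], 0, 0
            while i < len(left) and j < len(right):
                if left[i] < right[j]:
                    merged.append(left[i])
                    i += 1
                else:
                    merged.append(right[j])
                    j += 1
            merged.extend(left[i:])
            merged.extend(right[j:])
            arr[start:start + 2 * width] = merged
        width *= 2
    return arr

print(merge_sort_iterative([5, 2, 9, 1, 7]))   # [1, 2, 5, 7, 9]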
Growth Rate: O(n log₂ n) – The array is split in half across log₂ n levels, and a linear amount of
merging work is done at each level.
4. Quadratic Growth: O(n²)
Example: Bubble Sort
Recursive Version:
def recursive_bubble_sort(arr, n=None):
    if n is None:
        n = len(arr)
    if n <= 1:  # nothing left to sort (also covers an empty list)
        return
    # One pass bubbles the largest of the first n elements into position n - 1.
    for i in range(n - 1):
        if arr[i] > arr[i + 1]:
            arr[i], arr[i + 1] = arr[i + 1], arr[i]
    recursive_bubble_sort(arr, n - 1)
Iterative Version:
def iterative_bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
Growth Rate: O(n²) – Nested passes compare adjacent pairs of elements, roughly n(n − 1)/2
comparisons in total, leading to quadratic time.
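One way to see the quadratic behavior concretely is to count comparisons; the illustrative helper below mirrors the iterative version and always performs n(n − 1)/2 comparisons on a list of n elements.
def bubble_sort_comparisons(arr):
    # Same passes as iterative_bubble_sort, but also counts the comparisons made.
    n, comparisons = len(arr), 0
    for i in range(n):
        for j in range(0, n - i - 1):
            comparisons += 1
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return comparisons

print(bubble_sort_comparisons(list(range(10))))    # 45, i.e. 10*9/2
print(bubble_sort_comparisons(list(range(100))))   # 4950, i.e. 100*99/2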
5. Cubic Growth: O(n³)
Example: Triply Nested Loop (e.g., Matrix Multiplication)
Recursive Version:
def matrix_mult_recursive(A, B):
    # Divide-and-conquer multiplication of n x n matrices, assuming n is a power of 2 (e.g., 2x2).
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    # Partition A and B into quadrants, multiply the quadrants recursively, then reassemble.
    h = n // 2
    quad = lambda M, r, c: [row[c*h:(c+1)*h] for row in M[r*h:(r+1)*h]]
    add = lambda X, Y: [[X[i][j] + Y[i][j] for j in range(h)] for i in range(h)]
    C11 = add(matrix_mult_recursive(quad(A, 0, 0), quad(B, 0, 0)), matrix_mult_recursive(quad(A, 0, 1), quad(B, 1, 0)))
    C12 = add(matrix_mult_recursive(quad(A, 0, 0), quad(B, 0, 1)), matrix_mult_recursive(quad(A, 0, 1), quad(B, 1, 1)))
    C21 = add(matrix_mult_recursive(quad(A, 1, 0), quad(B, 0, 0)), matrix_mult_recursive(quad(A, 1, 1), quad(B, 1, 0)))
    C22 = add(matrix_mult_recursive(quad(A, 1, 0), quad(B, 0, 1)), matrix_mult_recursive(quad(A, 1, 1), quad(B, 1, 1)))
    return [C11[i] + C12[i] for i in range(h)] + [C21[i] + C22[i] for i in range(h)]
Iterative Version:
def matrix_mult_iterative(A, B):
    n = len(A)
    C = [[0] * n for _ in range(n)]
    # Standard triple loop: C[i][j] is the dot product of row i of A and column j of B.
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C
Growth Rate: O(n³) – Three nested loops over n indices lead to cubic time complexity.
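A quick sanity check (the 2×2 matrices below are arbitrary) is to multiply the same pair with both versions and confirm they agree; given the divide-and-conquer sketch above, both should print the same product.
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matrix_mult_iterative(A, B))   # [[19, 22], [43, 50]]
print(matrix_mult_recursive(A, B))   # [[19, 22], [43, 50]]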
6. Factorial Growth: O(n!)
Example: Generating Permutations
Recursive Version:
def generate_permutations(arr, n=None):
    if n is None:
        n = len(arr)
    if n == 1:
        print(arr)
        return
    for i in range(n):
        # Swap element i into the last free position, permute the rest, then swap back.
        arr[i], arr[n - 1] = arr[n - 1], arr[i]
        generate_permutations(arr, n - 1)
        arr[i], arr[n - 1] = arr[n - 1], arr[i]
Growth Rate: O(n!) – The number of permutations of n items is n!, leading to factorial time
complexity.
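One way to confirm the factorial count is to collect the permutations instead of printing them; the wrapper below is illustrative only, but it uses the same swap/recurse/swap-back scheme and produces exactly 4! = 24 results for four elements.
def collect_permutations(arr, n=None, out=None):
    # Same scheme as generate_permutations, but collects copies of arr instead of printing.
    if out is None:
        out = []
    if n is None:
        n = len(arr)
    if n == 1:
        out.append(arr[:])
        return out
    for i in range(n):
        arr[i], arr[n - 1] = arr[n - 1], arr[i]
        collect_permutations(arr, n - 1, out)
        arr[i], arr[n - 1] = arr[n - 1], arr[i]
    return out

perms = collect_permutations([1, 2, 3, 4])
print(len(perms))   # 24 == 4!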
Conclusion
These implementations give a clear picture of how running times can grow from logarithmic to
factorial. Understanding these growth rates is crucial for choosing the right algorithm for a
specific problem, since algorithms with faster-growing running times quickly become infeasible for
large inputs. A simple way to observe this in practice is to time the implementations on increasing
input sizes, as sketched below.
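A minimal timing sketch along these lines (the input sizes are arbitrary choices; merge_sort and iterative_bubble_sort refer to the implementations above) illustrates the idea:
import random
import time

def time_it(func, *args):
    # Wall-clock time of a single call, in seconds.
    start = time.perf_counter()
    func(*args)
    return time.perf_counter() - start

for n in (1000, 2000, 4000):
    data = [random.random() for _ in range(n)]
    t_merge = time_it(merge_sort, data[:])
    t_bubble = time_it(iterative_bubble_sort, data[:])
    # Doubling n should roughly double the O(n log2 n) time and quadruple the O(n^2) time.
    print(f"n={n}: merge_sort={t_merge:.4f}s, bubble_sort={t_bubble:.4f}s")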