Commit bc18bfe (1 parent 7d98fe8): 5 files changed, +589 −0
# Divide and Conquer Algorithms

Divide and Conquer is a paradigm for solving problems that involves breaking a problem into smaller sub-problems, solving the sub-problems recursively, and then combining their solutions to solve the original problem.
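As a minimal illustration of the pattern (an extra example, separate from merge sort below), finding the maximum of a list follows the same divide, conquer, and combine steps:

```python
def max_element(arr):
    # Base case: a single element is its own maximum
    if len(arr) == 1:
        return arr[0]
    # Divide: split in half; Conquer: recurse; Combine: take the larger result
    mid = len(arr) // 2
    return max(max_element(arr[:mid]), max_element(arr[mid:]))

print(max_element([3, 8, 1, 9, 4]))  # 9
```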
## Merge Sort

Merge Sort is a popular sorting algorithm that follows the divide and conquer strategy. It divides the input array into two halves, recursively sorts each half, and then merges them.

**Algorithm Overview:**
- **Divide:** Split the unsorted list into two sublists of roughly equal size.
- **Conquer:** Recursively sort each sublist.
- **Combine:** Merge the sorted sublists back into one sorted list.

```python
def merge_sort(arr):
    if len(arr) > 1:
        # Divide: split the array into two halves
        mid = len(arr) // 2
        left_half = arr[:mid]
        right_half = arr[mid:]

        # Conquer: sort each half recursively
        merge_sort(left_half)
        merge_sort(right_half)

        # Combine: merge the sorted halves back into arr
        i = j = k = 0

        while i < len(left_half) and j < len(right_half):
            if left_half[i] < right_half[j]:
                arr[k] = left_half[i]
                i += 1
            else:
                arr[k] = right_half[j]
                j += 1
            k += 1

        # Copy any remaining elements
        while i < len(left_half):
            arr[k] = left_half[i]
            i += 1
            k += 1

        while j < len(right_half):
            arr[k] = right_half[j]
            j += 1
            k += 1

arr = [12, 11, 13, 5, 6, 7]
merge_sort(arr)
print("Sorted array:", arr)
```
## Complexity Analysis
- **Time Complexity:** O(n log n) in all cases
- **Space Complexity:** O(n) additional space for the merge operation

---
# Dynamic Programming

Dynamic programming is a method for solving complex problems by breaking them down into simpler subproblems and solving each subproblem only once. It stores the solutions to subproblems to avoid redundant computations, making it particularly useful for optimization problems where the solution can be obtained by combining solutions to smaller subproblems.
## Real-Life Examples of Dynamic Programming
- **Fibonacci Sequence:** Computing the nth Fibonacci number efficiently.
- **Shortest Path:** Finding the shortest path in a graph from a source to a destination.
- **String Edit Distance:** Calculating the minimum number of operations required to transform one string into another.
- **Knapsack Problem:** Maximizing the value of items in a knapsack without exceeding its weight capacity.
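Of the examples above, string edit distance is not worked through in the sections that follow, so here is a short hedged sketch of the standard tabulation (function name `edit_distance` is illustrative):

```python
def edit_distance(a, b):
    m, n = len(a), len(b)
    # dp[i][j] = minimum operations to turn a[:i] into b[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # delete all of a[:i]
    for j in range(n + 1):
        dp[0][j] = j  # insert all of b[:j]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if a[i - 1] == b[j - 1]:
                dp[i][j] = dp[i - 1][j - 1]
            else:
                dp[i][j] = 1 + min(dp[i - 1][j],      # delete
                                   dp[i][j - 1],      # insert
                                   dp[i - 1][j - 1])  # substitute
    return dp[m][n]

print(edit_distance("kitten", "sitting"))  # 3
```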
# Some Common Dynamic Programming Techniques

# 1. Fibonacci Sequence

The Fibonacci sequence is a classic example used to illustrate dynamic programming. It is a series of numbers where each number is the sum of the two preceding ones, usually starting with 0 and 1.

**Algorithm Overview:**
- **Base Cases:** The first two numbers in the Fibonacci sequence are defined as 0 and 1.
- **Memoization:** Store the results of previously computed Fibonacci numbers to avoid redundant computations.
- **Recurrence Relation:** Compute each Fibonacci number by adding the two preceding numbers.

## Fibonacci Sequence Code in Python (Top-Down Approach with Memoization)

```python
def fibonacci(n, memo=None):
    # Use a fresh memo per top-level call; a mutable default argument
    # would be shared across every call to the function
    if memo is None:
        memo = {}
    if n in memo:
        return memo[n]
    if n <= 1:
        return n
    memo[n] = fibonacci(n - 1, memo) + fibonacci(n - 2, memo)
    return memo[n]

n = 10
print(f"The {n}th Fibonacci number is: {fibonacci(n)}.")
```
## Fibonacci Sequence Code in Python (Bottom-Up Approach)

```python
def fibonacci(n):
    # Build the table from the base cases upward
    fib = [0, 1]
    for i in range(2, n + 1):
        fib.append(fib[i - 1] + fib[i - 2])
    return fib[n]

n = 10
print(f"The {n}th Fibonacci number is: {fibonacci(n)}.")
```
## Complexity Analysis
- **Time Complexity**: O(n) for both approaches
- **Space Complexity**: O(n) for the top-down approach (due to memoization); the bottom-up code above is also O(n) because it stores the full list, but it drops to O(1) if only the last two values are kept
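A bottom-up variant that keeps only the last two values, reaching the O(1) space bound (function name is illustrative):

```python
def fibonacci_constant_space(n):
    # Keep only the last two values instead of the whole table
    prev, curr = 0, 1
    for _ in range(n):
        prev, curr = curr, prev + curr
    return prev

print(fibonacci_constant_space(10))  # 55
```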
<br>
<hr>
<br>
# 2. Longest Common Subsequence

The longest common subsequence (LCS) problem is to find the longest subsequence common to two sequences. A subsequence is a sequence that appears in the same relative order but is not necessarily contiguous.

**Algorithm Overview:**
- **Base Cases:** If one of the sequences is empty, the LCS is empty.
- **Memoization:** Store the results of previously computed LCS lengths to avoid redundant computations.
- **Recurrence Relation:** Compute the LCS length by comparing characters of the sequences and making decisions based on whether they match.

## Longest Common Subsequence Code in Python (Top-Down Approach with Memoization)

```python
def longest_common_subsequence(X, Y, m, n, memo=None):
    # Use a fresh memo per top-level call; a shared mutable default
    # would return stale results when called with different strings
    if memo is None:
        memo = {}
    if (m, n) in memo:
        return memo[(m, n)]
    if m == 0 or n == 0:
        return 0
    if X[m - 1] == Y[n - 1]:
        memo[(m, n)] = 1 + longest_common_subsequence(X, Y, m - 1, n - 1, memo)
    else:
        memo[(m, n)] = max(longest_common_subsequence(X, Y, m, n - 1, memo),
                           longest_common_subsequence(X, Y, m - 1, n, memo))
    return memo[(m, n)]

X = "AGGTAB"
Y = "GXTXAYB"
print("Length of Longest Common Subsequence:", longest_common_subsequence(X, Y, len(X), len(Y)))
```
## Complexity Analysis
- **Time Complexity**: O(m * n) for the top-down approach, where m and n are the lengths of the input sequences
- **Space Complexity**: O(m * n) for the memoization table
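The same recurrence can also be filled in bottom-up with a table, which avoids recursion depth limits on long inputs; a minimal sketch (function name `lcs_bottom_up` is illustrative):

```python
def lcs_bottom_up(X, Y):
    m, n = len(X), len(Y)
    # dp[i][j] = LCS length of X[:i] and Y[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if X[i - 1] == Y[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n]

print(lcs_bottom_up("AGGTAB", "GXTXAYB"))  # 4
```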
<br>
<hr>
<br>
# 3. 0-1 Knapsack Problem

The 0-1 knapsack problem is a classic optimization problem where the goal is to maximize the total value of items selected while keeping the total weight within a specified limit.

**Algorithm Overview:**
- **Base Cases:** If the capacity of the knapsack is 0 or there are no items to select, the total value is 0.
- **Memoization:** Store the results of previously computed subproblems to avoid redundant computations.
- **Recurrence Relation:** Compute the maximum value by considering whether to include the current item or not.

## 0-1 Knapsack Problem Code in Python (Top-Down Approach with Memoization)

```python
def knapsack(weights, values, capacity, n, memo=None):
    # Use a fresh memo per top-level call so results from
    # previous inputs don't leak into later calls
    if memo is None:
        memo = {}
    if (capacity, n) in memo:
        return memo[(capacity, n)]
    if n == 0 or capacity == 0:
        return 0
    if weights[n - 1] > capacity:
        # Item n-1 doesn't fit: skip it
        memo[(capacity, n)] = knapsack(weights, values, capacity, n - 1, memo)
    else:
        # Take the better of including or excluding item n-1
        memo[(capacity, n)] = max(values[n - 1] + knapsack(weights, values, capacity - weights[n - 1], n - 1, memo),
                                  knapsack(weights, values, capacity, n - 1, memo))
    return memo[(capacity, n)]

weights = [10, 20, 30]
values = [60, 100, 120]
capacity = 50
n = len(weights)
print("Maximum value that can be obtained:", knapsack(weights, values, capacity, n))
```
## Complexity Analysis
- **Time Complexity**: O(n * W) for the top-down approach, where n is the number of items and W is the capacity of the knapsack
- **Space Complexity**: O(n * W) for the memoization table
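A bottom-up alternative needs only a one-dimensional table of size W + 1; a hedged sketch (function name is illustrative):

```python
def knapsack_bottom_up(weights, values, capacity):
    n = len(weights)
    # dp[w] = best value achievable with capacity w using the items seen so far
    dp = [0] * (capacity + 1)
    for i in range(n):
        # Iterate capacities downward so each item is used at most once (the 0-1 constraint)
        for w in range(capacity, weights[i] - 1, -1):
            dp[w] = max(dp[w], values[i] + dp[w - weights[i]])
    return dp[capacity]

print(knapsack_bottom_up([10, 20, 30], [60, 100, 120], 50))  # 220
```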
<br>
<hr>
<br>
# Greedy Algorithms

Greedy algorithms are simple, intuitive algorithms that make a sequence of choices in the hope of finding a global optimum. They are called "greedy" because at each step they choose the most advantageous option without considering future consequences. Despite their simplicity, greedy algorithms are powerful tools for solving optimization problems, especially when the problem exhibits the greedy-choice property.
## Real-Life Examples of Greedy Algorithms
- **Coin Change:** Finding the minimum number of coins to make a certain amount of change.
- **Job Scheduling:** Assigning tasks to machines to minimize completion time.
- **Huffman Coding:** Constructing an optimal prefix-free binary code for data compression.
- **Fractional Knapsack:** Selecting items to maximize the value within a weight limit.
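Of the examples above, the fractional knapsack has a particularly short greedy solution: sort items by value-to-weight ratio and take them greedily, splitting the last item if needed. A minimal sketch (function name is illustrative):

```python
def fractional_knapsack(weights, values, capacity):
    # Sort items by value-to-weight ratio, best first
    items = sorted(zip(weights, values), key=lambda wv: wv[1] / wv[0], reverse=True)
    total = 0.0
    for weight, value in items:
        if capacity >= weight:
            # Take the whole item
            capacity -= weight
            total += value
        else:
            # Take the fraction that fits, then stop
            total += value * (capacity / weight)
            break
    return total

print(fractional_knapsack([10, 20, 30], [60, 100, 120], 50))  # 240.0
```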
# Some Common Greedy Algorithms

# 1. Coin Change Problem

The coin change problem is a classic example of a greedy algorithm. Given a set of coin denominations and a target amount, the objective is to find the minimum number of coins required to make up that amount. Note that the greedy strategy is guaranteed to be optimal only for canonical coin systems such as 1, 5, 10, 25; for arbitrary denominations it can return a suboptimal count.

**Algorithm Overview:**
- **Greedy Strategy:** At each step, the algorithm selects the largest denomination coin that is less than or equal to the remaining amount.
- **Repeat Until Amount is Zero:** The process continues until the remaining amount becomes zero.
## Coin Change Code in Python

```python
def coin_change(coins, amount):
    # Try denominations from largest to smallest
    coins.sort(reverse=True)
    num_coins = 0
    for coin in coins:
        num_coins += amount // coin
        amount %= coin
    if amount == 0:
        return num_coins
    else:
        return -1

coins = [1, 5, 10, 25]
amount = 63
result = coin_change(coins, amount)
if result != -1:
    print(f"Minimum number of coins required: {result}.")
else:
    print("It is not possible to make the amount with the given denominations.")
```
## Complexity Analysis
- **Time Complexity**: O(n log n) for sorting (if not pre-sorted), O(n) for iteration
- **Space Complexity**: O(1)
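Greedy can fail on non-canonical coin systems: for coins [1, 3, 4] and amount 6, largest-first gives 4 + 1 + 1 (three coins) while the optimum is 3 + 3 (two coins). A dynamic-programming version handles arbitrary denominations; a minimal sketch (function name is illustrative):

```python
def coin_change_dp(coins, amount):
    # dp[a] = minimum coins needed to make amount a (inf if unreachable)
    INF = float("inf")
    dp = [0] + [INF] * amount
    for a in range(1, amount + 1):
        for coin in coins:
            if coin <= a and dp[a - coin] + 1 < dp[a]:
                dp[a] = dp[a - coin] + 1
    return dp[amount] if dp[amount] != INF else -1

print(coin_change_dp([1, 3, 4], 6))  # 2 (3 + 3), where greedy would use 4 + 1 + 1
```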
<br>
<hr>
<br>
# 2. Activity Selection Problem

The activity selection problem involves selecting the maximum number of mutually compatible activities that can be performed by a single person or machine, assuming that a person can only work on one activity at a time.

**Algorithm Overview:**
- **Greedy Strategy:** Sort the activities based on their finish times.
- **Selecting Activities:** Iterate through the sorted activities, selecting each activity if it doesn't conflict with the previously selected ones.
## Activity Selection Code in Python

```python
def activity_selection(start, finish):
    # Assumes activities are already sorted by finish time
    n = len(start)
    activities = [0]  # The first activity is always selected
    i = 0
    for j in range(1, n):
        # Select activity j if it starts after the last selected one finishes
        if start[j] >= finish[i]:
            activities.append(j)
            i = j
    return activities

start = [1, 3, 0, 5, 8, 5]
finish = [2, 4, 6, 7, 9, 9]
selected_activities = activity_selection(start, finish)
print("Selected activities:", selected_activities)
```
## Complexity Analysis
- **Time Complexity**: O(n log n) for sorting (if not pre-sorted), O(n) for iteration
- **Space Complexity**: O(1)
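The code above relies on the input already being sorted by finish time. If it is not, a small wrapper (a hypothetical helper, `activity_selection_unsorted`) can sort the activity indices first and then apply the same greedy scan:

```python
def activity_selection_unsorted(start, finish):
    # Sort activity indices by finish time, then scan greedily
    order = sorted(range(len(start)), key=lambda i: finish[i])
    selected = [order[0]]
    last_finish = finish[order[0]]
    for i in order[1:]:
        if start[i] >= last_finish:
            selected.append(i)
            last_finish = finish[i]
    return selected

print(activity_selection_unsorted([5, 1, 3], [7, 2, 4]))  # [1, 2, 0]
```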
<br>
<hr>
<br>
# 3. Huffman Coding

Huffman coding is a method of lossless data compression that efficiently represents characters or symbols in a file. It uses variable-length codes to represent characters, with shorter codes assigned to more frequent characters.

**Algorithm Overview:**
- **Frequency Analysis:** Determine the frequency of each character in the input data.
- **Building the Huffman Tree:** Construct a binary tree where each leaf node represents a character and the path to the leaf node determines its code.
- **Assigning Codes:** Traverse the Huffman tree to assign codes to each character, with shorter codes for more frequent characters.
## Huffman Coding Code in Python

```python
from heapq import heappush, heappop, heapify
from collections import defaultdict

def huffman_coding(data):
    # Count character frequencies
    frequency = defaultdict(int)
    for char in data:
        frequency[char] += 1

    # Each heap entry is [weight, [symbol, code], ...]
    heap = [[weight, [symbol, ""]] for symbol, weight in frequency.items()]
    heapify(heap)

    # Repeatedly merge the two lowest-weight nodes,
    # prefixing '0' to codes in one subtree and '1' in the other
    while len(heap) > 1:
        lo = heappop(heap)
        hi = heappop(heap)
        for pair in lo[1:]:
            pair[1] = '0' + pair[1]
        for pair in hi[1:]:
            pair[1] = '1' + pair[1]
        heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])

    # Return (symbol, code) pairs, shortest codes first
    return sorted(heappop(heap)[1:], key=lambda p: (len(p[-1]), p))

data = "Huffman coding is a greedy algorithm"
encoded_data = huffman_coding(data)
print("Huffman Codes:")
for symbol, code in encoded_data:
    print(f"{symbol}: {code}")
```
## Complexity Analysis
- **Time Complexity**: O(n log n) for heap operations, where n is the number of unique characters
- **Space Complexity**: O(n) for the heap
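The code table is only half the story; decoding scans the bit stream and emits a symbol whenever the accumulated buffer matches a code, and prefix-freeness makes this parse unambiguous. A self-contained sketch using a hypothetical code table of the kind the function above might produce:

```python
# Hypothetical prefix-free code table (illustrative, not output of a specific run)
codes = {"a": "0", "b": "10", "r": "110", "c": "1110", "d": "1111"}

# Encode by concatenating per-character codes
encoded = "".join(codes[ch] for ch in "abracadabra")

# Invert the table; prefix-freeness guarantees a unique parse
decode = {code: symbol for symbol, code in codes.items()}
decoded, buf = "", ""
for bit in encoded:
    buf += bit
    if buf in decode:
        decoded += decode[buf]
        buf = ""
print(decoded)  # abracadabra
```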
<br>
<hr>
<br>
