In computer science, a sorting algorithm takes a collection of items and arranges them in a specific order. This order is usually determined by comparing the items using a defined rule.
## Real-Life Examples of Sorting
- Sorting a deck of cards
- Sorting names in alphabetical order
- Sorting a list of items, etc.

Some common sorting techniques:

## 1. Bubble Sort
Bubble sort is a basic sorting technique that iteratively steps through a list, comparing neighboring elements. If elements are out of order, it swaps them. While easy to understand, bubble sort becomes inefficient for large datasets due to its slow execution time.
- **Repeating Until Sorted:** The algorithm continues making passes through the list until no more swaps are needed, indicating the entire list is sorted.

Let's sort the list `[5, 3, 8, 1, 2]` using bubble sort.
- Comparing neighbors: `[1, 2, 3, 5, 8]`
- No swapping needed, the list is already sorted.

### Complexity Analysis
- **Worst Case:** `O(n^2)` comparisons and swaps. This happens when the list is in reverse order, requiring the maximum number of swaps.
- **Best Case:** `O(n)` comparisons and no swaps. This occurs when the list is already sorted and the algorithm, using an early-exit check, stops after a single pass.
- **Average Case:** `O(n^2)` comparisons and swaps. This is the expected count over all possible input orderings.
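The passes described above can be sketched as a short Python function. This is an illustrative sketch, not code from this file; the function name and the `swapped` early-exit flag are my additions, and the flag is what gives the `O(n)` best case on an already-sorted list.

```python
def bubble_sort(items):
    """Return a sorted copy of items using bubble sort."""
    arr = list(items)  # work on a copy, leave the input unchanged
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # After pass i, the i largest elements have bubbled to the end,
        # so the inner loop can stop i positions early.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:  # a full pass with no swaps means the list is sorted
            break
    return arr
```

Sorting the example list from above, `bubble_sort([5, 3, 8, 1, 2])` returns `[1, 2, 3, 5, 8]`.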
## 2. Selection Sort
Selection sort is a simple sorting algorithm that divides the input list into two parts: a sorted sublist and an unsorted sublist. The algorithm repeatedly finds the smallest (or largest, depending on sorting order) element from the unsorted sublist and moves it to the sorted sublist. It's not efficient for large datasets but performs better than bubble sort due to fewer swaps.
- **Expanding the Sorted Sublist:** As elements are moved to the sorted sublist, it expands until all elements are sorted.
- **Repeating Until Sorted:** The process continues until the entire list is sorted.

### Example with Visualization
Let's sort the list `[5, 3, 8, 1, 2]` using selection sort.
- **Worst Case:** `O(n^2)` comparisons and `O(n)` swaps. This occurs when the list is in reverse order, requiring the maximum number of comparisons.
- **Best Case:** `O(n^2)` comparisons and `O(n)` swaps. Even when the list is already sorted, the algorithm must still scan all remaining elements to find each minimum.
- **Average Case:** `O(n^2)` comparisons and `O(n)` swaps. This is the expected count over all possible input orderings.
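The find-minimum-and-swap loop can be sketched in Python as follows (an illustrative sketch under the assumptions above; the function name is mine, not from this file). Note there is at most one swap per pass, which is why selection sort swaps less than bubble sort.

```python
def selection_sort(items):
    """Return a sorted copy of items using selection sort."""
    arr = list(items)  # work on a copy
    n = len(arr)
    for i in range(n - 1):
        # Find the index of the smallest element in the unsorted suffix arr[i:].
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # A single swap per pass extends the sorted prefix by one element.
        if min_idx != i:
            arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr
```

For the example list, `selection_sort([5, 3, 8, 1, 2])` returns `[1, 2, 3, 5, 8]`.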
## 3. Quick Sort
Quick sort is a popular divide-and-conquer sorting algorithm known for its efficiency on average. It works by selecting a 'pivot' element from the array and partitioning the other elements into two sub-arrays according to whether they are less than or greater than the pivot. The sub-arrays are then recursively sorted.
**Algorithm Overview:**
- **Recursion:** Apply the above steps recursively to the sub-arrays formed by partitioning until the base case is reached.
- **Base Case:** If a sub-array's size becomes 0 or 1, it is already sorted.

### Example with Visualization
Let's sort the list `[5, 3, 8, 1, 2]` using quick sort.
1. **Initial Array:** `[5, 3, 8, 1, 2]`
2. **Choose Pivot:** Let's choose the last element, `2`, as the pivot.
3. **Partitioning:**
   - We partition the array around the pivot `2`: all elements less than `2` are placed to its left, and all elements greater than `2` to its right.
   - After partitioning, the array becomes `[1, 2, 5, 3, 8]`. The pivot element, `2`, is now in its correct sorted position.
4. **Recursion:**
   - Now, we recursively sort the sub-arrays `[1]` and `[5, 3, 8]`.
   - For the sub-array `[5, 3, 8]`, we choose `8` as the pivot and partition it, which places `8` in its correct sorted position.
   - Recursively sorting the remaining sub-array `[5, 3]` gives `[3, 5, 8]`.
5. **Concatenation:**
   - Concatenating the sorted sub-arrays `[1]`, `[2]`, and `[3, 5, 8]`, we get the final sorted array `[1, 2, 3, 5, 8]`.
- **Worst Case:** The worst-case time complexity of quick sort is `O(n^2)`. This occurs when pivot selection consistently produces unbalanced partitions, such as always choosing the smallest or largest element as the pivot.
- **Best Case:** The best-case time complexity is `O(n log n)`. This happens when pivot selection leads to well-balanced partitions, roughly halving the array in each recursive call.
- **Average Case:** The average-case time complexity is `O(n log n)`, the expected cost when pivots produce reasonably balanced partitions across recursive calls.
- **Space Complexity:** Quick sort uses `O(log n)` space for the recursion stack when partitions are balanced.
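A minimal in-place sketch of the scheme above, assuming Python and a Lomuto-style partition with the last element as pivot, as in the walkthrough; the function signature and names are illustrative, not from this file.

```python
def quick_sort(arr, lo=0, hi=None):
    """Sort arr in place between indices lo and hi (inclusive) and return it."""
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:  # base case: sub-array of size 0 or 1 is already sorted
        return arr
    pivot = arr[hi]  # choose the last element as the pivot
    i = lo           # boundary of the "less than pivot" region
    for j in range(lo, hi):
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]  # place the pivot in its final position
    quick_sort(arr, lo, i - 1)   # recursively sort elements left of the pivot
    quick_sort(arr, i + 1, hi)   # recursively sort elements right of the pivot
    return arr
```

For the example list, `quick_sort([5, 3, 8, 1, 2])` yields `[1, 2, 3, 5, 8]`; the intermediate arrays may differ slightly from the walkthrough because partitioning fixes only the pivot's position, not the order of the other elements.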
## 4. Merge Sort
Merge sort is a divide-and-conquer algorithm that recursively divides the input list into smaller sublists until each sublist contains only one element. Then, it repeatedly merges adjacent sublists while maintaining the sorted order until there is only one sublist remaining, which represents the sorted list.
**Algorithm Overview:**
- **Divide:** Recursively split the input list into smaller sublists until each sublist contains only one element.
- **Merge:** Repeatedly merge adjacent sublists while maintaining sorted order until only one sublist remains, which represents the sorted list.

### Example with Visualization
Let's sort the list `[38, 27, 43, 3, 9, 82, 10]` using merge sort.
- **Time Complexity:** `O(n log n)` for all cases. Merge sort always divides the list into halves until each sublist contains one element, then merges them back together, giving `O(n log n)` time.
- **Space Complexity:** `O(n)` auxiliary space, used for the temporary sublists created during merging.
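The divide-and-merge steps can be sketched in Python as two small functions (an illustrative sketch; the helper name `merge` and the slicing-based split are my choices, not from this file).

```python
def merge_sort(items):
    """Return a sorted copy of items using recursive merge sort."""
    if len(items) <= 1:          # base case: one element is already sorted
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])    # divide: sort the left half
    right = merge_sort(items[mid:])   # divide: sort the right half
    return merge(left, right)         # conquer: merge the sorted halves

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:       # <= keeps the sort stable
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # append whichever half has elements left
    merged.extend(right[j:])
    return merged
```

For the example list, `merge_sort([38, 27, 43, 3, 9, 82, 10])` returns `[3, 9, 10, 27, 38, 43, 82]`.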