# Time and Space Complexity
We can solve a problem using one or more algorithms. It's essential to learn how to compare the performance of different algorithms and select the best one for a specific task.

Therefore, we need a method for comparing the solutions so we can judge which one is more optimal.

The method must be:

- Independent of the system and its settings on which the algorithm runs.
- Directly related to the number of inputs.

### 1. Best-Case Scenario:

The best-case scenario refers to the situation where an algorithm performs optimally, achieving the lowest possible time or space complexity. It represents the most favorable conditions under which an algorithm operates.

#### Characteristics:

- Represents the minimum time or space required by an algorithm to solve a problem.
- Occurs when the input data is structured in such a way that the algorithm can exploit its strengths fully.
- Often used to analyze the lower bound of an algorithm's performance.

#### Example:

Consider the `linear search algorithm` where we're searching for a `target element` in an array. The best-case scenario occurs when the target element is found `at the very beginning of the array`. In this case, the algorithm would only need to make one comparison, resulting in a time complexity of `O(1)`.
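
To see this concretely, here is a small illustrative sketch (the `count_comparisons` helper is hypothetical, added only for demonstration): when the target sits at index 0, the loop stops after a single comparison.

```python
def count_comparisons(arr, target):
    # Linear search that reports how many comparisons it made.
    comparisons = 0
    for value in arr:
        comparisons += 1
        if value == target:
            break
    return comparisons

arr = [7, 3, 5, 9, 1]
print(count_comparisons(arr, 7))  # best case: target at index 0 -> 1 comparison
```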
### 2. Worst-Case Scenario:

The worst-case scenario refers to the situation where an algorithm performs at its poorest, achieving the highest possible time or space complexity. It represents the most unfavorable conditions under which an algorithm operates.

#### Characteristics:

- Represents the maximum time or space required by an algorithm to solve a problem.
- Occurs when the input data is structured in such a way that the algorithm encounters the most challenging conditions.
- Often used to analyze the upper bound of an algorithm's performance.

#### Example:

Continuing with the `linear search algorithm`, the worst-case scenario occurs when the `target element` is either not present in the array or located `at the very end`. In this case, the algorithm would need to iterate through the entire array, resulting in a time complexity of `O(n)`, where `n` is the size of the array.
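
As a quick illustrative sketch (not part of the original example), searching for a value that is absent forces the loop to examine every element:

```python
arr = [7, 3, 5, 9, 1]
target = 42  # not present in the array

comparisons = 0
for value in arr:
    comparisons += 1
    if value == target:
        break

print(comparisons)  # 5 comparisons for an array of size n = 5 -> O(n)
```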
### 3. Average-Case Scenario:

The average-case scenario refers to the expected performance of an algorithm over all possible inputs, typically calculated as the arithmetic mean of the time or space complexity.

#### Characteristics:

- Represents the typical performance of an algorithm across a range of input data.
- Takes into account the distribution of inputs and their likelihood of occurrence.
- Provides a more realistic measure of an algorithm's performance compared to the best-case or worst-case scenarios.

#### Example:

For the `linear search algorithm`, the average-case scenario considers the probability distribution of the target element's position within the array. If the `target element is equally likely to be found at any position in the array`, the average-case time complexity would be `O(n/2)` (which simplifies to `O(n)`), as the algorithm would, on average, need to search halfway through the array.
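
As a rough empirical check (the helper below is illustrative, not from the original text), averaging the number of comparisons over every possible target position comes out to about (n + 1) / 2:

```python
def comparisons_to_find(arr, target):
    # Number of comparisons a linear search makes before finding target.
    for count, value in enumerate(arr, start=1):
        if value == target:
            return count
    return len(arr)

arr = list(range(1, 101))  # 100 distinct elements
average = sum(comparisons_to_find(arr, t) for t in arr) / len(arr)
print(average)  # 50.5, roughly (n + 1) / 2, which is still O(n)
```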
## Space Complexity

The memory space that a program uses as it runs is referred to as space complexity. Because the exact amount of memory depends on the machine, we express space complexity using Big O notation rather than typical memory units like MB or GB.

#### Examples of Space Complexity

1. `Constant Space Complexity (O(1))`: Algorithms that operate on a fixed-size array or use a constant number of variables have O(1) space complexity.
2. `Linear Space Complexity (O(n))`: Algorithms that store each element of the input array in a separate variable or data structure have O(n) space complexity.
3. `Quadratic Space Complexity (O(n^2))`: Algorithms that create a two-dimensional array or matrix with dimensions based on the input size have O(n^2) space complexity.
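
The short sketches below (illustrative function names, not from the original text) show one instance of each case:

```python
def constant_space(arr):
    # O(1): a fixed number of variables, no matter how large arr is.
    total = 0
    for value in arr:
        total += value
    return total

def linear_space(arr):
    # O(n): builds a new list with one entry per input element.
    return [value * 2 for value in arr]

def quadratic_space(n):
    # O(n^2): builds an n x n matrix.
    return [[0] * n for _ in range(n)]
```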
#### Analyzing Space Complexity

To analyze space complexity:

- Identify the variables, data structures, and recursive calls used by the algorithm.

## Examples to Calculate Time and Space Complexity

#### 1. Print all elements of given array

Consider that each line takes one unit of time to run. So, simply iterating over an array to print all its elements takes `O(n)` time, where `n` is the size of the array.

Code:

```python
arr = [1, 2, 3, 4]   #1
for x in arr:        #2
    print(x)         #3
```

Here, the 1st statement executes only once, so it takes one unit of time to run. The for loop, consisting of the 2nd and 3rd statements, executes 4 times, once per element; for an array of size `n`, it runs `n` times.

Also, as the code doesn't take any additional space beyond the input `arr`, its space complexity is `O(1)`, i.e., constant.

#### 2. Linear Search
Linear search is a simple algorithm for finding an element in an array by sequentially checking each element until a match is found or the end of the array is reached. Here's an example of calculating the time and space complexity of linear search:

```python
def linear_search(arr, target):
    for i in range(len(arr)):   # check each element in turn
        if arr[i] == target:
            return i            # index of the first match
    return -1                   # target is not present

arr = [1, 3, 5, 7, 9]
target = 5
print(linear_search(arr, target))  # prints 2
```

**Time Complexity Analysis**

The for loop iterates through the entire array, which takes `O(n)` time in the worst case, where `n` is the size of the array.
Inside the loop, each operation takes constant time (`O(1)`).
Therefore, the time complexity of linear search is `O(n)`.

**Space Complexity Analysis**

The space complexity of linear search is `O(1)` since it only uses a constant amount of additional space for variables regardless of the input size.

#### 3. Binary Search
Binary search is an efficient algorithm for finding an element in a sorted array by repeatedly dividing the search interval in half. Here's an example of calculating the time and space complexity of binary search:
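
One possible iterative implementation, consistent with the analysis that follows (it initializes `left` and `right` pointers and halves the search interval in a `while` loop), is sketched below:

```python
def binary_search(arr, target):
    left, right = 0, len(arr) - 1        # O(1) initialization
    while left <= right:                 # runs O(log n) times
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid                   # found: return the index
        elif arr[mid] < target:
            left = mid + 1               # discard the left half
        else:
            right = mid - 1              # discard the right half
    return -1                            # target is not in the array

arr = [1, 3, 5, 7, 9]
target = 5
print(binary_search(arr, target))  # prints 2
```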

**Time Complexity Analysis**

The initialization of `left` and `right` takes constant time (`O(1)`).
The while loop runs for `log(n)` iterations in the worst case, where `n` is the size of the array.
Inside the loop, each operation takes constant time (`O(1)`).
Therefore, the time complexity of binary search is `O(log n)`.

**Space Complexity Analysis**

The space complexity of binary search is `O(1)` since it only uses a constant amount of additional space for variables regardless of the input size.

#### 4. Fibonacci Sequence
Let's consider an example of a function that generates Fibonacci numbers up to a given index and stores them in a list. In this case, the space complexity will not be constant because the size of the list grows with the Fibonacci sequence.

```python
def fibonacci_sequence(n):
    # Build a list containing the first n Fibonacci numbers.
    fib = []
    a, b = 0, 1
    while len(fib) < n:
        fib.append(a)
        a, b = b, a + b
    return fib

n = 10
fib_sequence = fibonacci_sequence(n)
print(fib_sequence)
```

**Time Complexity Analysis**

The while loop iterates until the length of the Fibonacci sequence list reaches `n`, so it takes `O(n)` iterations in the worst case. Inside the loop, each operation takes constant time (`O(1)`).

**Space Complexity Analysis**

The space complexity of this function is not constant because it creates and stores a list of Fibonacci numbers.
As `n` grows, the size of the list also grows, so the space complexity is `O(n)`, where `n` is the index of the last Fibonacci number generated.