Commit db796bf

Update time-space-complexity.md
1 parent 976eb8c commit db796bf

1 file changed: +51 −35 lines


contrib/ds-algorithms/time-space-complexity.md

@@ -1,7 +1,10 @@
# Time and Space Complexity

We can solve a problem using one or more algorithms. It's essential to learn how to compare the performance of different algorithms and select the best one for a specific task.
Therefore, we need a standard method for comparing solutions so we can judge which one is more optimal.

The method must be:

- Independent of the system (and its settings) on which the algorithm runs.
- Directly related to the number of inputs.
@@ -35,9 +38,7 @@ $$
Graphical representation of Big Oh:

![BigOh Notation Graph](images/Time-And-Space-Complexity-BigOh.png)

### 2. Big Omega (Ω) Notation

@@ -51,9 +52,7 @@ $$

Graphical representation of Big Omega:

![BigOmega Notation Graph](images/Time-And-Space-Complexity-BigOmega.png)

### 3. Big Theta (Θ) Notation

@@ -67,61 +66,64 @@ $$

Graphical representation of Big Theta:

![Big Theta Notation Graph](images/Time-And-Space-Complexity-BigTheta.png)

## Best Case, Worst Case and Average Case

### 1. Best-Case Scenario:

The best-case scenario refers to the situation where an algorithm performs optimally, achieving the lowest possible time or space complexity. It represents the most favorable conditions under which an algorithm operates.

#### Characteristics:

- Represents the minimum time or space required by an algorithm to solve a problem.
- Occurs when the input data is structured in such a way that the algorithm can exploit its strengths fully.
- Often used to analyze the lower bound of an algorithm's performance.

#### Example:

Consider the `linear search algorithm`, where we're searching for a `target element` in an array. The best-case scenario occurs when the target element is found `at the very beginning of the array`. In this case, the algorithm needs to make only one comparison, resulting in a time complexity of `O(1)`.

### 2. Worst-Case Scenario:

The worst-case scenario refers to the situation where an algorithm performs at its poorest, reaching the highest possible time or space complexity. It represents the most unfavorable conditions under which an algorithm operates.

#### Characteristics:

- Represents the maximum time or space required by an algorithm to solve a problem.
- Occurs when the input data is structured in such a way that the algorithm encounters the most challenging conditions.
- Often used to analyze the upper bound of an algorithm's performance.

#### Example:

Continuing with the `linear search algorithm`, the worst-case scenario occurs when the `target element` is either not present in the array or located `at the very end`. In this case, the algorithm needs to iterate through the entire array, resulting in a time complexity of `O(n)`, where `n` is the size of the array.

### 3. Average-Case Scenario:

The average-case scenario refers to the expected performance of an algorithm over all possible inputs, typically calculated as the arithmetic mean of the time or space complexity.

#### Characteristics:

- Represents the typical performance of an algorithm across a range of input data.
- Takes into account the distribution of inputs and their likelihood of occurrence.
- Provides a more realistic measure of an algorithm's performance than the best-case or worst-case scenarios.

#### Example:

For the `linear search algorithm`, the average-case scenario considers the probability distribution of the target element's position within the array. If the `target element is equally likely to be found at any position in the array`, the algorithm would, on average, need to search halfway through the array, making about `n/2` comparisons — which still simplifies to `O(n)`, since constant factors are dropped in Big O notation.
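The three scenarios above can be illustrated by counting comparisons directly (a small sketch; `linear_search_count` is a hypothetical helper introduced here for illustration, not part of the original text):

```python
def linear_search_count(arr, target):
    """Linear search that also reports how many comparisons it made."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == target:
            return i, comparisons   # (index, comparisons)
    return -1, comparisons          # target not present

arr = [1, 3, 5, 7, 9]

print(linear_search_count(arr, 1))   # best case: target first -> (0, 1)
print(linear_search_count(arr, 9))   # worst case: target last -> (4, 5)
print(linear_search_count(arr, 4))   # absent: scans whole array -> (-1, 5)
```

Averaged over all five possible target positions, the comparison count is (1 + 2 + 3 + 4 + 5) / 5 = 3, i.e. roughly half the array length.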

## Space Complexity

The memory space that a piece of code uses while it runs is referred to as its space complexity. Space complexity also depends on the machine, so rather than using typical memory units such as MB or GB, we express it using Big O notation.

#### Examples of Space Complexity

1. `Constant Space Complexity (O(1))`: Algorithms that operate on a fixed-size array or use a constant number of variables have O(1) space complexity.
2. `Linear Space Complexity (O(n))`: Algorithms that store each element of the input array in a separate variable or data structure have O(n) space complexity.
3. `Quadratic Space Complexity (O(n^2))`: Algorithms that create a two-dimensional array or matrix with dimensions based on the input size have O(n^2) space complexity.
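The three cases above can be sketched as follows (the function names here are illustrative, introduced only for this example):

```python
def constant_space(arr):
    # O(1) space: only a fixed number of variables, regardless of input size
    total = 0
    for x in arr:
        total += x
    return total

def linear_space(arr):
    # O(n) space: builds a new list with one entry per input element
    return [x * 2 for x in arr]

def quadratic_space(n):
    # O(n^2) space: an n-by-n matrix sized by the input
    return [[0] * n for _ in range(n)]

print(constant_space([1, 2, 3]))       # 6
print(linear_space([1, 2, 3]))         # [2, 4, 6]
print(len(quadratic_space(3)))         # 3 (a 3x3 matrix: 9 cells in total)
```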

#### Analyzing Space Complexity

To analyze space complexity:

- Identify the variables, data structures, and recursive calls used by the algorithm.
@@ -130,18 +132,23 @@ To analyze space complexity:

## Examples to calculate time and space complexity

#### 1. Print all elements of given array

Assume each line takes one unit of time to run. Then simply iterating over an array to print all of its elements takes `O(n)` time, where `n` is the size of the array.

Code:

```python
arr = [1, 2, 3, 4]   #1
for x in arr:        #2
    print(x)         #3
```

Here, the 1st statement executes only once, so it takes one unit of time. The for loop, consisting of the 2nd and 3rd statements, executes 4 times.
Also, since the code doesn't take any additional space apart from the input `arr`, its space complexity is `O(1)` (constant).
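The unit-counting argument can be checked with a small tally (an illustrative sketch; the `units` counter is introduced here, not part of the original example):

```python
arr = [1, 2, 3, 4]   # statement 1: runs once

units = 1            # one unit for statement 1
for x in arr:        # statement 2: runs once per element
    print(x)         # statement 3: runs once per element
    units += 2       # statements 2 and 3 together cost two units per iteration

print(units)         # 1 + 2 * len(arr) = 9 units for n = 4
```

The total, 1 + 2n units, grows linearly with the input size, which is why the time complexity is `O(n)`.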

#### 2. Linear Search

Linear search is a simple algorithm for finding an element in an array by sequentially checking each element until a match is found or the end of the array is reached. Here's an example of calculating the time and space complexity of linear search:

```python
def linear_search(arr, target):
    # Check each element in turn until the target is found
    for i in range(len(arr)):
        if arr[i] == target:
            return i    # index of the target
    return -1           # target not present

arr = [1, 3, 5, 7, 9]
target = 5
print(linear_search(arr, target))
```

**Time Complexity Analysis**

The for loop iterates through the entire array, which takes `O(n)` time in the worst case, where `n` is the size of the array.
Inside the loop, each operation takes constant time, `O(1)`.
Therefore, the time complexity of linear search is `O(n)`.

**Space Complexity Analysis**

The space complexity of linear search is `O(1)`, since it only uses a constant amount of additional space for variables, regardless of the input size.

#### 3. Binary Search

Binary search is an efficient algorithm for finding an element in a sorted array by repeatedly dividing the search interval in half. Here's an example of calculating the time and space complexity of binary search:

```python
def binary_search(arr, target):
    left = 0
    right = len(arr) - 1
    while left <= right:
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid         # index of the target
        elif arr[mid] < target:
            left = mid + 1     # discard the left half
        else:
            right = mid - 1    # discard the right half
    return -1                  # target not present

arr = [1, 3, 5, 7, 9]
target = 5
print(binary_search(arr, target))
```

**Time Complexity Analysis**

The initialization of `left` and `right` takes constant time, `O(1)`.
The while loop runs for `log(n)` iterations in the worst case, where `n` is the size of the array.
Inside the loop, each operation takes constant time, `O(1)`.
Therefore, the time complexity of binary search is `O(log n)`.

**Space Complexity Analysis**

The space complexity of binary search is `O(1)`, since it only uses a constant amount of additional space for variables, regardless of the input size.
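The `O(log n)` growth can be seen concretely by counting loop iterations (a sketch; `binary_search_steps` is a hypothetical variant introduced here for illustration):

```python
def binary_search_steps(arr, target):
    """Binary search that also counts how many loop iterations it performs."""
    left, right = 0, len(arr) - 1
    steps = 0
    while left <= right:
        steps += 1
        mid = (left + right) // 2
        if arr[mid] == target:
            return mid, steps   # (index, iterations)
        elif arr[mid] < target:
            left = mid + 1
        else:
            right = mid - 1
    return -1, steps

arr = list(range(1, 1025))          # 1024 sorted elements
print(binary_search_steps(arr, 1))  # (0, 10): found after log2(1024) = 10 halvings
```

Doubling the array size adds only one more iteration, which is the hallmark of logarithmic time.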

#### 4. Fibonacci Sequence

Let's consider an example of a function that generates Fibonacci numbers up to a given index and stores them in a list. In this case, the space complexity is not constant, because the size of the list grows with the Fibonacci sequence.

```python
def fibonacci_sequence(n):
    # Build the sequence until it contains n numbers
    sequence = []
    a, b = 0, 1
    while len(sequence) < n:
        sequence.append(a)
        a, b = b, a + b
    return sequence

n = 10
fib_sequence = fibonacci_sequence(n)
print(fib_sequence)
```

**Time Complexity Analysis**

The while loop iterates until the length of the Fibonacci sequence list reaches `n`, so it takes `O(n)` iterations in the worst case. Inside the loop, each operation takes constant time, `O(1)`.

**Space Complexity Analysis**

The space complexity of this function is not constant, because it creates and stores a list of Fibonacci numbers.
As `n` grows, the size of the list also grows, so the space complexity is `O(n)`, where `n` is the index of the last Fibonacci number generated.
