Five Time Complexities in Programming
Time complexity describes how an algorithm's running time grows as the input size increases. It's
commonly expressed using big O notation, which gives an upper bound on the growth rate of the
algorithm's running time.
Here are five common time complexities in programming:
1. Constant Time (O(1))
Meaning: The algorithm's running time remains the same regardless of the input size.
Example: Accessing an element in an array by its index.
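As a minimal Python sketch (the function name and sample list are illustrative), an index lookup costs the same whether the list holds four items or four million:

```python
def get_element(items, index):
    # A single index lookup takes the same time regardless of how long the list is.
    return items[index]

print(get_element([10, 20, 30, 40], 2))  # 30
```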
2. Logarithmic Time (O(log n))
Meaning: The running time increases logarithmically with the input size.
Example: Binary search on a sorted array, where the search space is halved in each iteration.
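A minimal sketch of binary search in Python (assuming the input list is already sorted; names are illustrative):

```python
def binary_search(sorted_items, target):
    # Each iteration discards half of the remaining search space,
    # so the loop runs about log2(n) times for n items.
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1
        else:
            high = mid - 1
    return -1  # target not found

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # 3
```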
3. Linear Time (O(n))
Meaning: The running time increases linearly with the input size.
Example: Iterating over an array once.
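A small sketch of a single pass over a list, here finding the largest value (the function name is illustrative):

```python
def find_max(items):
    # Visits every element exactly once, so the work grows linearly with len(items).
    largest = items[0]
    for value in items[1:]:
        if value > largest:
            largest = value
    return largest

print(find_max([4, 1, 9, 2]))  # 9
```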
4. Quadratic Time (O(n^2))
Meaning: The running time increases quadratically with the input size.
Example: Two nested loops that each iterate over the input, e.g. comparing every pair of elements.
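A sketch of nested loops that compare every pair of elements, here checking for duplicates (names are illustrative); the total work is roughly n * n comparisons:

```python
def has_duplicates(items):
    # The outer loop runs n times and the inner loop up to n times,
    # so the number of comparisons grows quadratically with the input size.
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                return True
    return False

print(has_duplicates([3, 1, 4, 1, 5]))  # True
```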
5. Exponential Time (O(2^n))
Meaning: The running time increases exponentially with the input size.
Example: Brute-force algorithms that try every possible combination of the input, such as checking all subsets.
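A brute-force sketch that tries every combination, here the classic subset-sum question (names and sample data are illustrative); each element can either be included or excluded, giving 2^n possibilities for n items:

```python
def subset_sum_exists(items, target):
    # Brute force: recursively try including or excluding each item,
    # exploring all 2^n subsets in the worst case.
    if target == 0:
        return True
    if not items:
        return False
    first, rest = items[0], items[1:]
    return (subset_sum_exists(rest, target - first)  # include the first item
            or subset_sum_exists(rest, target))      # exclude the first item

print(subset_sum_exists([3, 34, 4, 12, 5, 2], 9))  # True (4 + 5)
```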