Recursive Function

Time complexity

Uploaded by

joriwalarakib10

CSE246

Algorithms

Asymptotic Notation
What is Asymptotic Notation?
• A way to describe the behavior of an algorithm’s
time or space complexity as the input size (n)
grows.
• Common notations:
• O(n): Upper bound (worst case)
• Ω(n): Lower bound (best case)
• Θ(n): Tight bound (both best and worst cases are the
same)

2
Big O Notation (Upper Bound):

• Big O describes the worst-case performance of an algorithm.
• It tells us how the runtime (or space usage)
grows as the input size increases.
• Example: If an algorithm takes O(n²) time, it
means that as input size increases, the time
required grows quadratically.

3
Calculating the Number of Steps:

1.Constant Time (O(1)):


x = 10;
y = x + 5;
•These operations take a constant time, regardless of input size.
So, they are O(1).
2.Linear Time (O(n)):
for i = 1 to n:
print(i);
•The loop runs n times, so the total number of steps grows
linearly with the size of n. This is O(n).
3.Quadratic Time (O(n²)):
for i = 1 to n:
for j = 1 to n:
print(i, j);
•Two nested loops, each running n times, result in n * n = n²
steps. This is O(n²).
4
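The three cases above can be checked by counting steps directly. A small Python sketch (the counter functions are illustrative, not part of the slides):

```python
def constant_steps():
    # O(1): two assignments, independent of any input size
    x = 10
    y = x + 5
    return 2  # fixed step count

def linear_steps(n):
    # O(n): the loop body executes once per i in 1..n
    steps = 0
    for i in range(1, n + 1):
        steps += 1  # stands in for print(i)
    return steps

def quadratic_steps(n):
    # O(n^2): the inner body executes n * n times
    steps = 0
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            steps += 1  # stands in for print(i, j)
    return steps

print(constant_steps(), linear_steps(10), quadratic_steps(10))  # 2 10 100
```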
Small Functions and Step
Calculation:

•f(n) = c (constant function):


•Example: x = 5; → O(1) (1 step, regardless of input size).
•f(n) = n (linear function):
•Example: for i = 1 to n → O(n) (loop runs n times).
•f(n) = n² (quadratic function):
•Example: Nested loops → O(n²) (each loop runs n times).
•f(n) = log n (logarithmic function):
•Example: Binary search → O(log n) (halves the search space at each step).

5
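Binary search, named above as the standard O(log n) example, can be sketched in Python as follows (a minimal illustrative version):

```python
def binary_search(a, target):
    """Return the index of target in sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2   # each pass halves the remaining range: O(log n)
        if a[mid] == target:
            return mid
        elif a[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

print(binary_search(list(range(1024)), 777))  # 777
```

For n = 1024 the loop runs at most about log₂(1024) = 10 times.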
Simplifying Time Complexities

•Drop constants: O(2n) becomes O(n).


•Keep the dominant term: O(n² + n) simplifies to
O(n²) because n² grows faster than n as n
increases.

6
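The dominant-term rule can be seen numerically: the ratio (n² + n) / n² tends to 1 as n grows, so the lower-order n term is eventually negligible:

```python
# The lower-order term n contributes less and less to n^2 + n as n grows.
for n in (10, 100, 10_000):
    ratio = (n**2 + n) / n**2
    print(n, ratio)
# the ratio tends to 1, which is why O(n^2 + n) simplifies to O(n^2)
```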
Typical Running Time Functions
• 1 (constant running time):
– Instructions are executed once or only a few times
• log N (logarithmic):
– A big problem is solved by repeatedly cutting the original problem
down by a constant fraction at each step
• N (linear):
– A small amount of processing is done on each input element
• N log N:
– A problem is solved by dividing it into smaller subproblems, solving
them independently, and combining the solutions

7
Typical Running Time Functions
• N² (quadratic):
– Typical for algorithms that process all pairs of data items (doubly
nested loops)

• N³ (cubic):
– Processing of triples of data items (triply nested loops)

• N^k (polynomial)
• 2^N (exponential):
– Few exponential algorithms are appropriate for practical use

8
Growth of Functions

[Figure: comparative growth of common running-time functions]

9
Complexity Graphs

[Figure: plot of log(n)]

Complexity Graphs

[Figure: plots of log(n), n, and n log(n)]

Complexity Graphs

[Figure: plots of log(n), n log(n), n², n³, and n^10]

Complexity Graphs (log scale)

[Figure: log-scale plots of n, n log(n), n², n³, n^10, 1.1^n, and 2^n]
Algorithm Complexity
• Worst Case Complexity:
– the function defined by the maximum number of steps
taken on any instance of size n
• Best Case Complexity:
– the function defined by the minimum number of steps
taken on any instance of size n
• Average Case Complexity:
– the function defined by the average number of steps
taken on any instance of size n
Best, Worst, and Average Case Complexity

[Figure: number of steps vs. N (input size); the worst-case curve lies
above the average-case curve, which lies above the best-case curve]
Example Problem (Step Calculation):

for i = 1 to n:
for j = 1 to n:
print(i, j);
•Outer loop runs n times.
• The inner loop runs n times for each iteration of the outer loop.
•Total steps: n * n = n² → O(n²).

16
Big-O

• What does it mean?


– If f(n) = O(n²), then:
• f(n) can be larger than n² sometimes, but…
• We can choose some constant c and some value n0 such
that for every value of n larger than n0 : f(n) < cn²
• That is, for values larger than n0, f(n) is never more than a
constant multiple of n²
• Or, in other words, f(n) does not grow more than a constant
factor faster than n².

17
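The definition can be exercised on a concrete function. Here f(n) = 3n² + 10n with witnesses c = 4 and n0 = 10, both chosen for this sketch (they are not from the slide):

```python
def f(n):
    return 3 * n**2 + 10 * n

c, n0 = 4, 10  # illustrative witnesses for f(n) = O(n^2)

# For every n larger than n0 (checked over a finite range), f(n) < c * n^2:
# 3n^2 + 10n < 4n^2 exactly when 10n < n^2, i.e. when n > 10.
assert all(f(n) < c * n**2 for n in range(n0 + 1, 1000))
print("f(n) < 4n^2 holds for all checked n > 10")
```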
Visualization of O(g(n))

[Figure: for n ≥ n0, f(n) stays below cg(n)]

18
Examples
– 2n² = O(n³):
2n² ≤ cn³ ⇒ 2 ≤ cn ⇒ c = 1 and n0 = 2
– n² = O(n²):
n² ≤ cn² ⇒ c ≥ 1 ⇒ c = 1 and n0 = 1
– 1000n² + 1000n = O(n²):
1000n² + 1000n ≤ 1001n² for all n ≥ 1000 ⇒ c = 1001 and n0 = 1000
– n = O(n²):
n ≤ cn² ⇒ 1 ≤ cn ⇒ c = 1 and n0 = 1
19
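O-witnesses like these can be sanity-checked by brute force over a finite range (a check, not a proof; the constants below are ones that make each inequality hold):

```python
# Each tuple is (f, g, c, n0), witnessing f(n) = O(g(n)):
# f(n) <= c * g(n) for all n >= n0 on the checked range.
checks = [
    (lambda n: 2 * n**2,           lambda n: n**3, 1,    2),     # 2n^2 = O(n^3)
    (lambda n: n**2,               lambda n: n**2, 1,    1),     # n^2  = O(n^2)
    (lambda n: 1000*n**2 + 1000*n, lambda n: n**2, 1001, 1000),  # = O(n^2)
    (lambda n: n,                  lambda n: n**2, 1,    1),     # n    = O(n^2)
]
for f, g, c, n0 in checks:
    assert all(f(n) <= c * g(n) for n in range(n0, n0 + 2000))
print("all witnesses verified on a finite range")
```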
Big-O

20
More Big-O
• Prove that 20n² + 2n + 5 = O(n²)
• Let c = 21 and n0 = 4
• 21n² > 20n² + 2n + 5 for all n > 4
⇔ n² > 2n + 5 for all n > 4
TRUE
21
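The inequality behind the proof can be checked numerically over a finite range:

```python
# c = 21, n0 = 4: for all n > 4, 21n^2 exceeds 20n^2 + 2n + 5,
# equivalently n^2 > 2n + 5, so 20n^2 + 2n + 5 = O(n^2).
assert all(21 * n**2 > 20 * n**2 + 2 * n + 5 for n in range(5, 10_000))
assert not (21 * 3**2 > 20 * 3**2 + 2 * 3 + 5)  # fails at n = 3
print("holds for all checked n > 4")
```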
Visualization of Ω(g(n))

[Figure: for n ≥ n0, f(n) stays above cg(n)]

22
Visualization of Θ(g(n))

[Figure: for n ≥ n0, f(n) stays between c1g(n) and c2g(n)]

23
Example 3
• Show that
• Let c = 2 and n0 = 5

24
More examples
• f(n) = 3n + 10
• f(n) = 5n² + 2n + 8
• f(n) = 4log(n) + n
• f(n) = n³ + 50n² + 100
• f(n) = n + nlog(n)

25
• f(n) = 3n + 10 → O(n)
• f(n) = 5n² + 2n + 8 → O(n²)
• f(n) = 4log(n) + n → O(n)
• f(n) = n³ + 50n² + 100 → O(n³)
• f(n) = n + n log(n) → O(n log n)

26
Example
• Code:
• a = b;

• Complexity: O(1) — a single assignment, independent of input size.

27
Example
• Code:
• sum = 0;
• for (i=1; i <=n; i++)
• sum += n;

• Complexity: O(n) — the constant-time loop body runs n times.

28
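A runnable Python translation of the loop, with its final value shown:

```python
def compute(n):
    # One loop of n iterations with a constant-time body: O(n).
    total = 0
    for i in range(1, n + 1):
        total += n   # translation of: sum += n;
    return total

print(compute(10))  # 100, after exactly 10 constant-time additions
```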
Example
• Code:
• sum = 0;
• for (j=1; j<=n; j++)
• for (i=1; i<=j; i++)
• sum++;
• for (k=0; k<n; k++)
• A[k] = k;
• Complexity: n(n+1)/2 steps for the nested loops plus n steps for the
last loop → O(n²).

29
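A runnable Python sketch: the nested loop body executes 1 + 2 + … + n = n(n+1)/2 times and the second loop adds n more steps, so the total is O(n²):

```python
def run(n):
    steps = 0
    for j in range(1, n + 1):
        for i in range(1, j + 1):   # j inner passes: 1 + 2 + ... + n in total
            steps += 1              # stands in for sum++
    A = [0] * n
    for k in range(n):              # a further n steps: O(n)
        A[k] = k
    return steps, A

steps, A = run(10)
print(steps, A[:3])  # 55 [0, 1, 2]  (55 = 10 * 11 / 2)
```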
Example
• Code:
• sum1 = 0;
• for (i=1; i<=n; i++)
• for (j=1; j<=n; j++)
• sum1++;
• Complexity: O(n²) — the inner body runs n · n times.

30
Example
• Code:
• sum2 = 0;
• for (i=1; i<=n; i++)
• for (j=1; j<=i; j++)
• sum2++;
• Complexity: 1 + 2 + … + n = n(n+1)/2 steps → O(n²).

31
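The previous two examples can be contrasted directly: the full n × n loop performs n² increments while the triangular loop performs n(n+1)/2, and both grow as n² (a Python sketch):

```python
def full_square(n):
    # j ranges over 1..n for every i: n * n increments
    s = 0
    for i in range(1, n + 1):
        for j in range(1, n + 1):
            s += 1
    return s

def triangular(n):
    # j ranges over 1..i: 1 + 2 + ... + n = n(n+1)/2 increments
    s = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            s += 1
    return s

n = 100
print(full_square(n), triangular(n))  # 10000 5050: both are Theta(n^2)
```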
Example
• Code:
• sum1 = 0;
• for (k=1; k<=n; k*=2)
• for (j=1; j<=n; j++)
• sum1++;
• Complexity: O(n log n) — about log₂ n outer passes of n steps each.

32
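Here k takes the values 1, 2, 4, …, about log₂ n of them, and the inner loop always runs n times, giving n(⌊log₂ n⌋ + 1) increments, i.e. Θ(n log n). A Python sketch:

```python
def count(n):
    s = 0
    k = 1
    while k <= n:                  # k = 1, 2, 4, ...: floor(log2 n) + 1 passes
        for j in range(1, n + 1):  # always n inner steps per pass
            s += 1                 # stands in for sum1++
        k *= 2
    return s

print(count(16))  # 80: (log2(16) + 1) = 5 passes of 16 steps each
```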
Example
• Code:
• sum2 = 0;
• for (k=1; k<=n; k*=2)
• for (j=1; j<=k; j++)
• sum2++;
• Complexity: 1 + 2 + 4 + … ≤ 2n steps → O(n), despite the nesting.

33
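Here the inner loop runs only k times, so the total is the geometric series 1 + 2 + 4 + … ≤ 2n, and the whole fragment is O(n) despite the nesting. A Python sketch:

```python
def count(n):
    s = 0
    k = 1
    while k <= n:                  # k = 1, 2, 4, ...
        for j in range(1, k + 1):  # only k inner steps this pass
            s += 1                 # stands in for sum2++
        k *= 2
    return s

# The geometric series 1 + 2 + ... + 2^floor(log2 n) is always < 2n.
for n in (10, 100, 1000):
    print(n, count(n), count(n) < 2 * n)
```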
