Algo - Lecture 3 - Asymptotic Analysis


ALGORITHM ANALYSIS - ASYMPTOTIC ANALYSIS

Tanjina Helaly
Algorithms
MEASURE TIME TO RUN AN ALGORITHM
 One naïve way is to implement the algorithm and run it on a machine.
 The measured time depends on
 the speed of the computer,
 the programming language,
 the compiler that translates the program to machine code,
 the program itself,
 and many other factors.
 So, you may get different times for the same algorithm, as the sketch below illustrates.
 Hence, raw running time is not a good tool for comparing algorithms.
 To overcome these issues, we need to model time complexity.
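
A minimal sketch of this point, not part of the original slides: timing the same function on the same input several times already gives different numbers, and the spread would be far larger across machines, languages, and compilers. The function sum_of_squares is a hypothetical stand-in for any algorithm.

```python
import time

def sum_of_squares(n):
    # A simple stand-in algorithm; any algorithm would do here.
    total = 0
    for i in range(n):
        total += i * i
    return total

for trial in range(5):
    start = time.perf_counter()
    sum_of_squares(1_000_000)
    elapsed = time.perf_counter() - start
    print(f"trial {trial}: {elapsed:.4f} s")  # varies run to run, machine to machine
```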
TIME COMPLEXITY
 Developing a formula that predicts how fast an algorithm is, based on the size of the input
 Used to compare algorithms
 A measure of the efficiency/goodness of an algorithm

 3 types of complexity
 Best case
 Lower bound.
 The minimum number of steps/operations needed to execute an algorithm.
 Measures the minimum time required to run the algorithm.
 Not a good measure of an algorithm's performance.

TIME COMPLEXITY
 Worst case
 Upper bound
 The maximum number of operations/time required to execute
 The main focus of analysis
 Reduces risk, as it gives the highest possible execution time of the algorithm
 Average case
 The amount of some computational resource (typically time) used by the algorithm, averaged over all possible inputs.
 Difficult to determine
 Typically follows the same curve as the worst case

TIME COMPLEXITY
 The best, worst, and average case time complexities for any given algorithm are numerical functions over the size of possible problem instances.
 However, it is very difficult to work precisely with these functions:
 they depend on the specific input size,
 they are not smooth curves,
 they require too much detail.
 So, we need more simplification or abstraction.
ASYMPTOTIC ANALYSIS
 We ignore low-level details such as
 initialization cost,
 the implementation of specific operations.
 Rather, we focus on
 how the time changes if the input doubles/triples,
 or how many more operations we need for that change (see the sketch below).
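
A minimal sketch of the doubling question, assuming we count one "basic operation" per iteration of a hypothetical quadratic double loop; the interesting output is the ratio between counts, not the counts themselves.

```python
def count_pairwise_ops(n):
    # One basic operation per pair (i, j): a stand-in quadratic algorithm.
    ops = 0
    for i in range(n):
        for j in range(n):
            ops += 1
    return ops

prev = count_pairwise_ops(100)
for n in [200, 400, 800]:
    ops = count_pairwise_ops(n)
    print(f"n={n}: {ops} ops, {ops / prev:.1f}x the count at n={n // 2}")
    prev = ops
# Each doubling of n multiplies the count by ~4, the signature of n^2 growth.
```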
A SAMPLE COMPLEXITY EQUATION
 f(n) = 2n² + 10n + 3
 How is each term affected by a change in n?
 For n = 1,000
 2n² = 2,000,000
 10n = 10,000
 Ratio: 2n²/10n = 200, i.e. 10n is 0.5% of 2n²
 For n = 1,000,000
 2n² = 2,000,000,000,000
 10n = 10,000,000
 Ratio: 2n²/10n = 200,000, i.e. 10n is 0.0005% of 2n²
 So, as n grows, the lower-order terms become insignificant (checked numerically below).
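
The same arithmetic, re-run in a few lines of Python; the printed percentages match the ones above.

```python
# Evaluate each term of f(n) = 2n^2 + 10n + 3 for both values of n.
for n in [1_000, 1_000_000]:
    quadratic = 2 * n * n
    linear = 10 * n
    print(f"n = {n}")
    print(f"  2n^2 = {quadratic}")
    print(f"  10n  = {linear}  ({100 * linear / quadratic:.4f}% of 2n^2)")
```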
COMPARISON OF GROWTH RATE
[Figure: growth-rate curves of the common complexity functions]
A SAMPLE COMPLEXITY EQUATION
 Do a similar analysis for the equations below.
 f(n) = 2n² + 10n + 3
 f(n) = 5n² + 6n + 35
 Will you get a different result?
 No.
 As n → ∞, all lower-order terms become so insignificant that we can simply ignore them.

 Order of growth:
 How the time grows with the input size
 The leading term/highest-order term in the equation
MORE TO THINK
 What type of equations are the following two?
 f(n) = 2n²
 f(n) = 5n²
 Does the coefficient impact the characteristics of the curve?
 Can we ignore the coefficient?
 YES.
 Why:
 Constant factors are less significant than the rate of growth in determining computational efficiency for large inputs.
MORE TO THINK
 So, in terms of algorithm analysis, we can think
 f(n) = 2n² + 10n + 3 ~ n², or Θ(n²)
 f(n) = 5n² + 6n + 35 ~ n², or Θ(n²)
ASYMPTOTIC ANALYSIS
 Classifying functions into different categories.
 Formally, given functions f and g of a variable n, we define a binary relation
 f ∼ g (as n → ∞)
 f and g grow the same way as their input grows.
 In asymptotic analysis,
 we evaluate the performance of an algorithm in terms of the input size,
 we do not measure the actual running time,
 we calculate how the time (or space) taken by an algorithm increases with the input size.
ASYMPTOTIC NOTATION
 There are 3 asymptotic notations, as follows (a numeric sanity check of the definitions appears after this list):
 Big Theta
 f(n) = Θ(g(n)) means c1·g(n) is an upper bound on f(n) and c2·g(n) is a lower bound on f(n), for all n ≥ n0. Thus there exist constants c1 and c2 such that f(n) ≤ c1·g(n) and f(n) ≥ c2·g(n). This means that g(n) provides a nice, tight bound on f(n).
 Big Oh
 f(n) = O(g(n)) means c·g(n) is an upper bound on f(n). Thus there exists some constant c such that f(n) is always ≤ c·g(n), for large enough n (i.e., n ≥ n0 for some constant n0).
 Big Omega
 f(n) = Ω(g(n)) means c·g(n) is a lower bound on f(n). Thus there exists some constant c such that f(n) is always ≥ c·g(n), for all n ≥ n0.
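
A minimal numeric sanity check of these definitions, assuming candidate constants are supplied by hand. Sampling finitely many n can refute a claimed bound but never prove it; the helper names are hypothetical, not standard library functions.

```python
def holds_big_oh(f, g, c, n0, n_max=10_000):
    # Test f(n) <= c*g(n) for every integer n in [n0, n_max).
    return all(f(n) <= c * g(n) for n in range(n0, n_max))

def holds_big_omega(f, g, c, n0, n_max=10_000):
    # Test f(n) >= c*g(n) for every integer n in [n0, n_max).
    return all(f(n) >= c * g(n) for n in range(n0, n_max))

def holds_big_theta(f, g, c1, c2, n0, n_max=10_000):
    # Test c1*g(n) <= f(n) <= c2*g(n) for every integer n in [n0, n_max).
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max))

f = lambda n: 2 * n * n + 10 * n + 3
g = lambda n: n * n
print(holds_big_oh(f, g, c=15, n0=1))            # True
print(holds_big_omega(f, g, c=2, n0=1))          # True
print(holds_big_theta(f, g, c1=2, c2=15, n0=1))  # True
```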
BIG THETA - Θ NOTATION
 Θ(g(n)) = {f(n): there exist positive constants c1, c2 and n0 such that 0 ≤ c1g(n) ≤ f(n) ≤ c2g(n) for all n ≥ n0}

 The Theta notation bounds a function from above and below, so it defines exact asymptotic behavior.
 Thus g(n) provides a nice, tight bound on f(n).
 Note:
 An asymptote is a line that approaches a curve but never touches it.
BIG THETA - Θ NOTATION

 An easy way to get the Theta notation is to
 drop the lower-order terms,
 ignore the leading constant.
 So, an³ + bn + c = Θ(n³)
HOW DO WE SHOW THAT?
 f(n) = 2n² + 10n + 3 = Θ(n²)
 To prove this we need to find c1, c2 and n0 such that
 c1n² ≤ 2n² + 10n + 3 ≤ c2n² for n ≥ n0

 Obviously
 2n² ≤ 2n² + 10n + 3 for any n
 c1 = 2

 To calculate c2, let's raise each term to the highest power:
 2n² + 10n + 3 ≤ 2n² + 10n² + 3n² = 15n² for n ≥ 1
 c2 = 15, with n0 = 1
ANOTHER EXAMPLE
 f(n) = 5n² + 6n + 35 = Θ(n²)
 To prove this we need to find c1, c2 and n0 such that
 c1n² ≤ 5n² + 6n + 35 ≤ c2n² for n ≥ n0

 Obviously
 5n² ≤ 5n² + 6n + 35 for any n > 0
 c1 = 5

 To calculate c2, let's raise each term to the highest power:
 5n² + 6n + 35 ≤ 5n² + 6n² + 35n² = 46n² for n ≥ 1
 c2 = 46, with n0 = 1 (a quick spot check follows below)
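
A spot check of the constants just derived, over finitely many n only; a sanity check, not a proof.

```python
# Verify c1 = 5, c2 = 46, n0 = 1 for f(n) = 5n^2 + 6n + 35 on a finite range.
for n in range(1, 100_000):
    f = 5 * n * n + 6 * n + 35
    assert 5 * n * n <= f <= 46 * n * n, n
print("5n^2 <= f(n) <= 46n^2 held for every tested n >= 1")
```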
MORE EXAMPLES
 f(n) = 3n³ + 5n + 6 = Θ(n³)
 f(n) = n log n + 10n = Θ(n log n)
 f(n) = 2 + 1/n = Θ(1)
BIG O NOTATION
 The Big O notation defines an upper bound of an algorithm; it bounds a function only from above.
O(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ cg(n) for all n ≥ n0}
BIG OMEGA - Ω NOTATION
 Just as Big O notation provides an asymptotic upper bound on a function, Ω notation provides an asymptotic lower bound.
Ω(g(n)) = {f(n): there exist positive constants c and n0 such that 0 ≤ cg(n) ≤ f(n) for all n ≥ n0}

 Like the best case, the Omega notation is the least used of the three.
RELATIONSHIP AMONG THE NOTATIONS
 For any two functions f(n) and g(n),
 we have f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).
 Θ(g(n)) = O(g(n)) ∩ Ω(g(n)).
DIFFERENT FUNCTIONS
 Constant functions, f(n) = c
 Logarithmic functions, f(n) = log n
 Linear functions, f(n) = n
 Superlinear functions, f(n) = n lg n
 Quadratic functions, f(n) = n²
 Cubic functions, f(n) = n³
 Exponential functions, f(n) = cⁿ
 Factorial functions, f(n) = n!

 For large n: n! ≥ 2ⁿ ≥ n³ ≥ n² ≥ n log n ≥ n ≥ log n ≥ c (tabulated below)
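
To make the ordering concrete, a small table of values; log2 stands in for lg, and n is kept small because 2ⁿ and n! explode quickly.

```python
import math

print(f"{'n':>3} {'log n':>6} {'n log n':>8} {'n^2':>6} {'n^3':>7} {'2^n':>11} {'n!':>10}")
for n in [2, 4, 8, 16, 32]:
    print(f"{n:>3} {math.log2(n):>6.1f} {n * math.log2(n):>8.1f} "
          f"{n ** 2:>6} {n ** 3:>7} {2 ** n:>11} {float(math.factorial(n)):>10.2e}")
```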
TRY THESE
 Is 2ⁿ⁺¹ = O(2ⁿ)?
 Is 2²ⁿ = O(2ⁿ)?
 For each of the following pairs of functions, either f(n) is in O(g(n)), f(n) is in Ω(g(n)), or f(n) = Θ(g(n)). Determine which relationship is correct and briefly explain why (a numeric exploration sketch follows the list).
 f(n) = log n²; g(n) = log n + 5
 f(n) = √n; g(n) = log n²
 f(n) = log² n; g(n) = log n
 f(n) = n; g(n) = log² n
 f(n) = n log n + n; g(n) = log n
 f(n) = 10; g(n) = log 10
 f(n) = 2ⁿ; g(n) = 10n²
 f(n) = 2ⁿ; g(n) = 3ⁿ
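
One informal way to explore such pairs is to watch the ratio f(n)/g(n) as n grows: a roughly constant ratio suggests Θ, a growing ratio suggests Ω, and a ratio shrinking toward 0 suggests O. This is a numeric hint, not a proof. A sketch for the pair f(n) = √n, g(n) = log n²:

```python
import math

f = lambda n: math.sqrt(n)
g = lambda n: 2 * math.log(n)  # log n^2 = 2 log n

for n in [10, 100, 10_000, 1_000_000]:
    print(f"n = {n:>9}: f/g = {f(n) / g(n):.2f}")
# The ratio keeps growing, hinting f(n) = Omega(g(n)) and f(n) != O(g(n)).
```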
THE BIG OH NOTATION
 3n² − 100n + 6 = O(n²), because for c = 3, 3n² > 3n² − 100n + 6 when n ≥ 1;
 3n² − 100n + 6 = O(n³), because for c = 1, n³ > 3n² − 100n + 6 when n > 3;
 3n² − 100n + 6 ≠ O(n), because for any c, c·n < 3n² when n > c;
 3n² − 100n + 6 = Ω(n²), because for c = 2, 2n² < 3n² − 100n + 6 when n > 100;
 3n² − 100n + 6 ≠ Ω(n³), because for any c, c·n³ > 3n² − 100n + 6 when n > 3/c;
 3n² − 100n + 6 = Ω(n), because for any c, c·n < 3n² − 100n + 6 when n > 100c;
 3n² − 100n + 6 = Θ(n²), because both O and Ω apply;
 3n² − 100n + 6 ≠ Θ(n³), because only O applies;
 3n² − 100n + 6 ≠ Θ(n), because only Ω applies.
 Two of these claims are spot-checked below.
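
A spot check of the O(n²) and Ω(n²) claims over a finite range of n; again a sanity check of the constants, not a proof.

```python
# Check f(n) <= 3n^2 (the O claim, c = 3) and f(n) >= 2n^2 (the Omega
# claim, c = 2) for every tested n > 100.
for n in range(101, 10_000):
    f = 3 * n * n - 100 * n + 6
    assert f <= 3 * n * n
    assert f >= 2 * n * n
print("both bounds held for every tested n > 100")
```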


REFERENCE
 Chapters 2 and 3, Cormen et al., Introduction to Algorithms
