Asymptotic Notations
Asymptotic Notations Properties
• Categorize algorithms based on asymptotic growth rate
• e.g. linear, quadratic, exponential
Limitations
• Not always useful for the analysis of fixed-size inputs.
• All results hold only for sufficiently large inputs.
Asymptotic Notations
Asymptotic Notations: Θ, O, Ω, o, ω
We use Θ to mean "order exactly" (tight bound),
O to mean "order at most" (tight upper bound),
Ω to mean "order at least" (tight lower bound),
o to mean "upper bound" (not tight),
ω to mean "lower bound" (not tight).
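As a supplementary summary (standard material, not on the original slide), the five notations can be related through the limit of the ratio f(n)/g(n), assuming that limit exists; the deck itself uses this limit test later for o and ω:

\[
\lim_{n\to\infty}\frac{f(n)}{g(n)} =
\begin{cases}
0 & \Rightarrow\ f(n) = o(g(n)) \ \text{(hence also } O\text{)}\\
c,\ 0 < c < \infty & \Rightarrow\ f(n) = \Theta(g(n))\\
\infty & \Rightarrow\ f(n) = \omega(g(n)) \ \text{(hence also } \Omega\text{)}
\end{cases}
\]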
Big-Oh Notation (O)
If f, g: N → R⁺, then we can define Big-Oh as follows.
For a given function g(n) ≥ 0, O(g(n)) denotes the set of functions
O(g(n)) = { f(n) : there exist positive constants c and n₀ such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀ }
f(n) = O(g(n)) means function g(n) is an asymptotic upper bound for f(n).
Intuitively:
Set of all functions whose rate of growth is the same as or lower than
that of g(n). f(n) is bounded above by g(n) for all sufficiently large n.
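The membership condition above can be spot-checked numerically. A minimal C sketch, assuming a hypothetical helper check_big_oh (not from the slides); a passing check merely supports the chosen witnesses c and n₀ over the sampled range, it is not a proof:

#include <stdio.h>

/* Hypothetical helper: samples n = n0..limit and tests 0 <= f(n) <= c*g(n).
   Passing supports the witnesses; it does not replace the proof. */
int check_big_oh(double (*f)(double), double (*g)(double),
                 double c, int n0, int limit) {
    for (int n = n0; n <= limit; n++)
        if (f(n) < 0 || f(n) > c * g(n)) return 0;
    return 1;
}

double f1(double n) { return 2*n + 10; }  /* f(n) = 2n + 10 */
double g1(double n) { return n; }         /* g(n) = n       */

int main(void) {
    /* candidate witnesses c = 3, n0 = 10 for the claim 2n + 10 = O(n) */
    printf("%s\n", check_big_oh(f1, g1, 3.0, 10, 1000000) ? "holds" : "fails");
    return 0;
}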
Big-Oh Notation (O)
[Figure: f(n) ∈ O(g(n)): f(n) lies on or below c·g(n) for all n ≥ n₀]
Example
Prove that 2n² = O(n³)
Proof:
Assume that f(n) = 2n², and g(n) = n³
f(n) = O(g(n))?
We need positive constants c and n₀ with 2n² ≤ c·n³ for all n ≥ n₀.
Taking c = 2 and n₀ = 1: 2n² ≤ 2n³ holds for all n ≥ 1.
Hence, 2n² = O(n³), where c = 2 and n₀ = 1.
Example
Prove that n² = O(n²)
Proof:
Assume that f(n) = n², and g(n) = n²
f(n) = O(g(n))?
Now we have to find the existence of c and n₀.
If we take c = 1 and n₀ = 1, then
n² ≤ c·n² for c = 1 and all n ≥ 1.
Hence, n² = O(n²), where c = 1 and n₀ = 1.
Example
Prove that 1000·n² + 1000·n = O(n²)
Proof:
Assume that f(n) = 1000·n² + 1000·n, and g(n) = n²
We have to find the existence of c and n₀ such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n₀.
For n ≥ 1 we have 1000·n ≤ 1000·n², so 1000·n² + 1000·n ≤ 2000·n².
Hence, 1000·n² + 1000·n = O(n²), where c = 2000 and n₀ = 1.
Example
Prove that n³ ≠ O(n²)
Proof:
On the contrary, we assume that there exist some positive constants
c and n₀ such that
0 ≤ n³ ≤ c·n² for all n ≥ n₀.
Dividing by n² gives n ≤ c for all n ≥ n₀, which is impossible:
c is a fixed constant while n grows without bound.
Hence, n³ ≠ O(n²).
Example
Prove that 2n + 10 = O(n)
Proof:
Assume that f(n) = 2n + 10, and g(n) = n
f(n) = O(g(n))?
Now we have to find the existence of c and n₀.
If we take c = 3, then 2n + 10 ≤ 3n holds exactly when n ≥ 10.
Hence, 2n + 10 = O(n), where c = 3 and n₀ = 10.
Example
• Let’s check if (1/3)ⁿ = O(17):
(1/3)ⁿ ≤ c·17, which is true for c = 1, n₀ = 1.
• Let’s check if 17 = O((1/3)ⁿ):
17 ≤ c·(1/3)ⁿ requires c ≥ 17·3ⁿ,
and hence c can’t be bounded for large n.
• That’s why (1/3)ⁿ is lower in growth rate than the constant 17.
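A quick numeric illustration in C of why no constant works in the second direction (supporting evidence only, not part of the proof): the constant required for 17 ≤ c·(1/3)ⁿ is c = 17·3ⁿ, which grows without bound.

#include <stdio.h>
#include <math.h>

/* Print the constant c = 17 * 3^n that 17 <= c*(1/3)^n would require. */
int main(void) {
    for (int n = 1; n <= 10; n++)
        printf("n = %2d  required c >= %.0f\n", n, 17.0 * pow(3.0, n));
    return 0;
}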
Example
Prove or disprove: 2²ⁿ = O(2ⁿ)?
Disprove: if 2²ⁿ ≤ c·2ⁿ for all n ≥ n₀, then dividing by 2ⁿ gives
2ⁿ ≤ c, which cannot hold for arbitrarily large n.
Hence, 2²ⁿ ≠ O(2ⁿ).
Example
Prove that 8n² + 2n - 3 = O(n²)
Proof:
Need c > 0 and n₀ ≥ 1 such that
8n² + 2n - 3 ≤ c·n² for all n ≥ n₀.
For n ≥ 1 we have 2n ≤ 2n², so 8n² + 2n - 3 ≤ 10n².
Hence, 8n² + 2n - 3 = O(n²), where c = 10 and n₀ = 1.
Example
Can you bound 3ⁿ = O(2ⁿ)?
No: 3ⁿ ≤ c·2ⁿ would require (3/2)ⁿ ≤ c, and (3/2)ⁿ grows without
bound, so no constant c works. Hence, 3ⁿ ≠ O(2ⁿ).
Example
Which of the following functions is larger by order of growth: N log N
or N^1.5?
Now, we can refer to the previously stated result to figure out whether
f(N) or g(N) grows faster!
Writing N^1.5 = N · N^0.5, the question reduces to comparing log N with
N^0.5; since N^1.5 / (N log N) = N^0.5 / log N → ∞, N^1.5 grows faster.
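A small C sketch illustrating that ratio numerically (supporting evidence for the conclusion above, not a proof):

#include <stdio.h>
#include <math.h>

/* The ratio N^1.5 / (N log N) = sqrt(N)/log N keeps growing,
   so N^1.5 dominates N log N. */
int main(void) {
    for (double N = 1e2; N <= 1e10; N *= 100)
        printf("N = %.0e  ratio = %.1f\n", N, sqrt(N) / log(N));
    return 0;
}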
Big-Omega Notation (Ω)
If f, g: N → R⁺, then we can define Big-Omega as follows.
Ω(g(n)) = { f(n) : there exist positive constants c and n₀ such that
0 ≤ c·g(n) ≤ f(n) for all n ≥ n₀ }
f(n) = Ω(g(n)) means function g(n) is an asymptotic lower bound for f(n).
Intuitively:
Set of all functions whose rate of growth is the same as or higher than
that of g(n).
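Mirroring the earlier Big-Oh sketch, a minimal C check for Ω witnesses (check_big_omega is a hypothetical helper, not from the slides; a passing sampled check supports the witnesses, it does not prove the claim):

#include <stdio.h>

/* Hypothetical helper: samples n = n0..limit and tests 0 <= c*g(n) <= f(n). */
int check_big_omega(double (*f)(double), double (*g)(double),
                    double c, int n0, int limit) {
    for (int n = n0; n <= limit; n++)
        if (c * g(n) < 0 || c * g(n) > f(n)) return 0;
    return 1;
}

double f2(double n) { return 3*n + 2; }  /* f(n) = 3n + 2 */
double g2(double n) { return n; }        /* g(n) = n      */

int main(void) {
    /* candidate witnesses c = 3, n0 = 1 for the claim 3n + 2 = Omega(n) */
    printf("%s\n", check_big_omega(f2, g2, 3.0, 1, 1000000) ? "holds" : "fails");
    return 0;
}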
Big-Omega Notation (Ω)
[Figure: f(n) ∈ Ω(g(n)): f(n) lies on or above c·g(n) for all n ≥ n₀]
Example
Prove that 3n + 2 = Ω(n)
Proof:
Assume that f(n) = 3n + 2, and g(n) = n
f(n) = Ω(g(n))?
We need c·n ≤ 3n + 2. Take c = 3: on the R.H.S. a positive term is added
to 3n, so L.H.S. ≤ R.H.S. for all values of n.
Hence, 3n + 2 = Ω(n), where c = 3 and n₀ = 1.
Example
Prove that 5·n² = Ω(n)
Proof:
Assume that f(n) = 5·n², and g(n) = n
f(n) = Ω(g(n))?
For n ≥ 1 we have n ≤ n² ≤ 5·n², so c·g(n) ≤ f(n) with c = 1 and n₀ = 1.
Hence, 5·n² = Ω(n).
Example
Prove that 5n² + 2n - 3 = Ω(n²)
Proof:
Assume that f(n) = 5n² + 2n - 3, and g(n) = n²
f(n) = Ω(g(n))?
Take c = 4: 4n² ≤ 5n² + 2n - 3 is equivalent to 0 ≤ n² + 2n - 3 = (n + 3)(n - 1),
which holds for all n ≥ 1.
Hence, 5n² + 2n - 3 = Ω(n²), where c = 4 and n₀ = 1.
Example
Prove that 100·n + 5 ≠ Ω(n²)
Proof:
Let f(n) = 100·n + 5, and g(n) = n².
Assume on the contrary that f(n) = Ω(g(n)).
Then there exist positive constants c and n₀ with c·n² ≤ 100·n + 5 for all
n ≥ n₀. Dividing by n gives c·n ≤ 100 + 5/n ≤ 105 for n ≥ 1, i.e. n ≤ 105/c,
which fails for large n. Hence, 100·n + 5 ≠ Ω(n²).
Theta Notation (Θ)
Θ(g(n)) = { f(n) : there exist positive constants c₁, c₂ and n₀ such that
0 ≤ c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ }
Intuitively: Set of all functions that have the same rate of growth as g(n).
When a problem is Θ(n), this represents both an upper and a lower
bound, i.e. it is O(n) and Ω(n) (no algorithmic gap).
[Figure: f(n) ∈ Θ(g(n)): f(n) lies between c₁·g(n) and c₂·g(n) for all n ≥ n₀]
Example
Prove that ½·n² - ½·n = Θ(n²)
Proof:
Assume that f(n) = ½·n² - ½·n, and g(n) = n²
f(n) = Θ(g(n))?
We have to find the existence of c₁, c₂ and n₀ such that
c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀.
Upper bound: ½·n² - ½·n ≤ ½·n² for all n ≥ 1, so c₂ = ½.
Lower bound: ¼·n² ≤ ½·n² - ½·n holds exactly when ½·n ≤ ¼·n², i.e. n ≥ 2,
so c₁ = ¼ and n₀ = 2.
Hence, ½·n² - ½·n = Θ(n²), where c₁ = ¼, c₂ = ½, and n₀ = 2.
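A minimal C sketch checking the sandwich for the witnesses just derived (sampled evidence only, not a proof):

#include <stdio.h>

/* Check c1*g(n) <= f(n) <= c2*g(n) with c1 = 0.25, c2 = 0.5, n0 = 2
   for f(n) = n^2/2 - n/2 and g(n) = n^2, over a sampled range. */
int main(void) {
    int ok = 1;
    for (double n = 2; n <= 1e6; n++) {
        double f = 0.5*n*n - 0.5*n, g = n*n;
        if (!(0.25*g <= f && f <= 0.5*g)) { ok = 0; break; }
    }
    printf("sandwich %s on the sampled range\n", ok ? "holds" : "fails");
    return 0;
}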
Example
Prove that ½·n² - 3·n = Θ(n²)
Proof:
Let f(n) = ½·n² - 3·n, and g(n) = n². f(n) = Θ(g(n))?
We have to find the existence of c₁, c₂ and n₀ such that
c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀, i.e.
c₁·n² ≤ ½·n² - 3·n ≤ c₂·n².
Dividing by n² gives c₁ ≤ ½ - 3/n ≤ c₂. The right inequality holds with
c₂ = ½ for all n ≥ 1; the left holds with c₁ = ¼ whenever 3/n ≤ ¼, i.e. n ≥ 12.
Hence, ½·n² - 3·n = Θ(n²), where c₁ = ¼, c₂ = ½, and n₀ = 12.
Little-o Notation (o)
o(g(n)) = { f(n) : for any positive constant c, there exists n₀ > 0 such that
0 ≤ f(n) < c·g(n) for all n ≥ n₀ }
Equivalently, f(n) = o(g(n)) iff lim_{n→∞} f(n)/g(n) = 0.
e.g., 2n = o(n²) but 2n² ≠ o(n²).
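A short C illustration of the limit test above (numeric evidence for the two example claims, not a proof):

#include <stdio.h>

/* 2n/n^2 -> 0, so 2n = o(n^2); 2n^2/n^2 stays at 2, so 2n^2 is not o(n^2). */
int main(void) {
    for (double n = 10; n <= 1e6; n *= 100)
        printf("n = %.0e  2n/n^2 = %.6f  2n^2/n^2 = %.1f\n",
               n, 2.0*n/(n*n), 2.0*n*n/(n*n));
    return 0;
}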
Example
Prove that 2n² = o(n³)
Proof:
Assume that f(n) = 2n², and g(n) = n³
f(n) = o(g(n))?
lim_{n→∞} f(n)/g(n) = lim_{n→∞} 2n²/n³ = lim_{n→∞} 2/n = 0.
Hence, 2n² = o(n³).
Example
Prove that n² ≠ o(n²)
Proof:
Assume that f(n) = n², and g(n) = n².
Now we have to show that f(n) ≠ o(g(n)).
Since f(n) < c·g(n) means n² < c·n², i.e. 1 < c, the inequality fails for
every c ≤ 1, while little-o requires it to hold for all positive constants c.
Hence, n² ≠ o(n²).
Little-omega Notation (ω)
ω(g(n)) = { f(n) : for any positive constant c, there exists n₀ > 0 such that
0 ≤ c·g(n) < f(n) for all n ≥ n₀ }
Equivalently, f(n) = ω(g(n)) iff lim_{n→∞} f(n)/g(n) = ∞.
e.g., n²/2 = ω(n) but n²/2 ≠ ω(n²).
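And the corresponding C illustration for ω (numeric evidence for the two example claims, not a proof):

#include <stdio.h>

/* (n^2/2)/n grows without bound, so n^2/2 = omega(n);
   (n^2/2)/n^2 stays at 1/2, so n^2/2 is not omega(n^2). */
int main(void) {
    for (double n = 10; n <= 1e6; n *= 100)
        printf("n = %.0e  (n^2/2)/n = %.1f  (n^2/2)/n^2 = %.2f\n",
               n, (0.5*n*n)/n, (0.5*n*n)/(n*n));
    return 0;
}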
Example
Prove that 5·n² = ω(n)
Proof:
Assume that f(n) = 5·n², and g(n) = n
f(n) = ω(g(n))?
lim_{n→∞} f(n)/g(n) = lim_{n→∞} 5n²/n = lim_{n→∞} 5n = ∞.
Hence, 5·n² = ω(n).
Example
Prove that 5·n + 10 ≠ ω(n)
Proof:
Assume that f(n) = 5·n + 10, and g(n) = n.
lim_{n→∞} f(n)/g(n) = lim_{n→∞} (5n + 10)/n = 5 ≠ ∞.
Hence, 5·n + 10 ≠ ω(n).
Example
Prove that 100·n ≠ ω(n²)
Proof:
Let f(n) = 100·n, and g(n) = n².
Assume on the contrary that f(n) = ω(g(n)).
Then for any c there exists n₀ such that c·g(n) < f(n) for all n ≥ n₀,
i.e. c·n² < 100·n, which gives n < 100/c. This cannot hold for all
n ≥ n₀. Hence, 100·n ≠ ω(n²).
If f(n) = O(g(n)) and f(n) ≠ Ω(g(n)), then we say that f(n) is asymptotically
slower growing than g(n).
If f(n) = Ω(g(n)) and f(n) ≠ O(g(n)), then we say that f(n) is asymptotically
faster growing than g(n).
Usefulness of Notations
It is not always possible to determine the behaviour of an algorithm
using Θ-notation alone.
For example, we may prove that an algorithm never uses more than c₂·n² time
and never less than c₁·n time, for some constants c₁ and c₂.
In such a case we can claim neither Θ(n) nor Θ(n²) to be the order of
the time usage of the algorithm.
Big-O and Ω notation will still allow us to give at least partial information.
Usefulness of Notations
To express the efficiency of our algorithms, which of the three
notations should we use?
In practice, big-O is used most often, since it bounds the worst-case
running time.
Why?
If we know the worst case then we can aim to improve it and/or
avoid it.
Usefulness of Notations
Even though it is correct to say "7n - 3 is O(n³)", a better statement is
"7n - 3 is O(n)"; that is, one should make the approximation as tight as
possible.
Simple Rule:
Drop lower-order terms and constant factors:
7n - 3 is O(n)
8n²·log n + 5n² + n is O(n²·log n)
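A quick C illustration of the simple rule for the second example (numeric evidence only): the full expression divided by its leading term approaches the constant 8, so the lower-order terms do not affect the order of growth.

#include <stdio.h>
#include <math.h>

/* Ratio of 8n^2*log(n) + 5n^2 + n to its leading term n^2*log(n). */
int main(void) {
    for (double n = 1e2; n <= 1e8; n *= 100) {
        double full = 8*n*n*log(n) + 5*n*n + n;
        printf("n = %.0e  ratio = %.3f\n", n, full / (n*n*log(n)));
    }
    return 0;
}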
Question:
If two algorithms A and B solve the same problem with the same asymptotic
complexity, are they equally good?
Answer:
• They may not be the same. There is the small matter of the
constant of proportionality.
• Suppose that A does ten operations for each data item, but
algorithm B only does three.
• It is reasonable to expect B to be faster than A even though both
have the same asymptotic performance. The reason is that
asymptotic analysis ignores constants of proportionality.
Big Oh Does Not Tell the Whole Story
Algorithm_A {
    set up the algorithm;              /* takes 50 time units */
    read in n elements into array A;   /* 3 units per element */
    for (i = 0; i < n; i++) {
        do operation1 on A[i];         /* takes 10 units */
        do operation2 on A[i];         /* takes 5 units */
        do operation3 on A[i];         /* takes 15 units */
    }
}
TA(n) = 50 + 3n + (10 + 5 + 15)·n = 50 + 33·n

Algorithm_B {
    set up the algorithm;              /* takes 200 time units */
    read in n elements into array A;   /* 3 units per element */
    for (i = 0; i < n; i++) {
        do operation1 on A[i];         /* takes 10 units */
        do operation2 on A[i];         /* takes 5 units */
    }
}
TB(n) = 200 + 3n + (10 + 5)·n = 200 + 18·n
Big Oh Does Not Tell the Whole Story
Both algorithms have time complexity O(n).
Algorithm A sets up faster than B, but does more operations on the data.
Algorithm A is the better choice for small values of n.
For values of n > 10, algorithm B is the better choice.
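A minimal C sketch that evaluates both cost models from the slide above and locates the crossover point (the models themselves are the slide's assumed unit costs):

#include <stdio.h>

/* TA(n) = 50 + 33n, TB(n) = 200 + 18n; they tie at n = 10. */
int main(void) {
    for (int n = 1; n <= 20; n++) {
        long ta = 50 + 33L*n, tb = 200 + 18L*n;
        printf("n = %2d  TA = %4ld  TB = %4ld  %s\n",
               n, ta, tb, ta < tb ? "A cheaper" : (ta == tb ? "tie" : "B cheaper"));
    }
    return 0;
}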
A Misconception
A common misconception is that the worst-case running time is somehow
defined by big-O, and that the best case is defined by big-Ω.
In fact, these notations bound the growth of functions; any of O, Ω, or Θ
can be used to describe the best, worst, or average case.
Relations over Asymptotic Notations
Maximum rule: O(f(n) + g(n)) = O(max(f(n), g(n)))
Additive and Multiplicative Property:
If e(n) = O(g(n)) & f(n) = O(h(n)) then e(n) + f(n) = O(g(n) + h(n))
If e(n) = O(g(n)) & f(n) = O(h(n)) then e(n) · f(n) = O(g(n) · h(n))
Dichotomy Property:
If f(n) = O(g(n)) & g(n) = O(f(n)) then f(n) = Θ(g(n))
If f(n) = Ω(g(n)) & g(n) = Ω(f(n)) then f(n) = Θ(g(n))
Reflexive Property:
f(n) = O(f(n)), f(n) = Ω(f(n)), and f(n) = Θ(f(n)),
but f(n) ≠ o(f(n)) and f(n) ≠ ω(f(n))
Symmetry over Θ: f(n) = Θ(g(n)) ⟺ g(n) = Θ(f(n))
Transitivity Property:
f(n) = Θ(g(n)) & g(n) = Θ(h(n)) ⟹ f(n) = Θ(h(n))
f(n) = O(g(n)) & g(n) = O(h(n)) ⟹ f(n) = O(h(n))
f(n) = Ω(g(n)) & g(n) = Ω(h(n)) ⟹ f(n) = Ω(h(n))
f(n) = o(g(n)) & g(n) = o(h(n)) ⟹ f(n) = o(h(n))
f(n) = ω(g(n)) & g(n) = ω(h(n)) ⟹ f(n) = ω(h(n))