Big O, Big Theta, Big Omega


Algorithms and Complexity

Zeph Grunschlag

Copyright Zeph Grunschlag, 2001-2002.

Agenda
Section 2.1: Algorithms
  Pseudocode
  Recursive Algorithms (Section 3.4)
Section 2.2: Complexity of Algorithms
Section 1.8: Growth of Functions
  Big-O
  Big-Ω (Omega)
  Big-Θ (Theta)


Section 2.1 Algorithms and Pseudocode


DEF: An algorithm is a finite set of precise instructions for performing a computation or solving a problem. Synonyms for an algorithm are: program, recipe, procedure, and many others.


Pseudo-Java
A possible alternative to the text's pseudocode is pseudo-Java. Start with real Java and simplify:
int f(int[] a){
  int x = a[0];
  for(int i = 1; i < a.length; i++){
    if(x > a[i]) x = a[i];
  }
  return x;
}
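For example, calling f on the array {7, 3, 9} returns 3: the loop keeps the smallest element seen so far, so the method computes the minimum of the array.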


Pseudo-Java Version 1
integer f(integer_array (a1, a2, …, an) ){
  x = a1
  for(i = 2 to n){
    if(x > ai) x = ai
  }
  return x
}

Pseudo-Java Version 2
INPUT: integer_array V = (a1, a2, …, an)
begin
  x = a1
  for(y ∈ V)
    if(x > y) x = y
end
OUTPUT: x

Algorithm for Surjectivity


boolean isOnto( function f: (1, 2, …, n) → (1, 2, …, m) ){
  if( m > n ) return false   // can't be onto
  soFarIsOnto = true
  for( j = 1 to m ){
    soFarIsOnto = false
    for( i = 1 to n ){
      if ( f(i) == j ) soFarIsOnto = true
    }
    if( !soFarIsOnto ) return false
  }
  return true
}

Improved Algorithm for Surjectivity


boolean isOntoB( function f: (1, 2, …, n) → (1, 2, …, m) ){
  if( m > n ) return false   // can't be onto
  for( j = 1 to m )
    beenHit[ j ] = false     // does f ever output j ?
  for( i = 1 to n )
    beenHit[ f(i) ] = true
  for( j = 1 to m )
    if( !beenHit[ j ] ) return false
  return true
}
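A minimal runnable Java sketch of the improved test. The representation of f as an int[] array (with f[i-1] holding f(i), and all values assumed to lie in {1, …, m}) is an assumption for illustration, not part of the original pseudocode.

// Sketch (not from the slides): surjectivity test in O(n + m) time, assuming f is
// supplied as an array with f[i-1] = f(i) and every value in {1, ..., m}.
public class Surjectivity {

    static boolean isOntoB(int[] f, int m) {
        int n = f.length;
        if (m > n) return false;                 // can't be onto
        boolean[] beenHit = new boolean[m + 1];  // beenHit[j]: does f ever output j?
                                                 // (Java initializes the entries to false)
        for (int i = 0; i < n; i++)
            beenHit[f[i]] = true;
        for (int j = 1; j <= m; j++)
            if (!beenHit[j]) return false;
        return true;
    }

    public static void main(String[] args) {
        System.out.println(isOntoB(new int[]{2, 1, 2}, 2)); // true: 1 and 2 are both hit
        System.out.println(isOntoB(new int[]{1, 1, 1}, 2)); // false: 2 is never hit
    }
}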

Recursive Algorithms (Section 3.4)


Real Java:
long factorial(int n){
  if (n <= 0) return 1;
  return n * factorial(n - 1);
}
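As a quick check, factorial(5) returns 120 (traced below). A side remark not on the original slides: the 64-bit long result overflows for n ≥ 21, a limitation of the data type rather than of the recursion.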

Recursive Algorithms

Compute 5! by unwinding the recursion. The calls wind down to the base case:

f(5) = 5·f(4)
f(4) = 4·f(3)
f(3) = 3·f(2)
f(2) = 2·f(1)
f(1) = 1·f(0)
f(0) = 1

Then the results multiply back up:

1·1 = 1
2·1 = 2
3·2 = 6
4·6 = 24
5·24 = 120

Return 5! = 120

Section 2.2 Algorithmic Complexity


Compare the running times of the 2 previous algorithms for testing surjectivity. Measure running time by counting the number of "basic operations".


Running Time
Basic steps:
  Assignment
  Increment
  Comparison
  Negation
  Return
  Random array access
  Function output access
  etc.

A particular problem may tell you to consider other operations (e.g. multiplication) and to ignore all the others.
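To make "counting basic operations" concrete, here is a small illustration (not from the slides) that instruments the earlier minimum-finding method; the operations counted (assignment, increment, comparison, return) follow the list above.

// Illustration: count the basic steps executed by the minimum-finding loop
// from the Pseudo-Java slide, for a few input sizes n.
public class StepCount {

    static long steps;

    static int min(int[] a) {
        steps = 0;
        int x = a[0];                      steps++;   // assignment
        for (int i = 1; i < a.length; i++) {
            steps++;                                  // increment of i
            steps++;                                  // comparison x > a[i]
            if (x > a[i]) { x = a[i];      steps++; } // possible assignment
        }
        steps++;                                      // return
        return x;
    }

    public static void main(String[] args) {
        for (int n : new int[]{10, 100, 1000}) {
            int[] a = new int[n];
            for (int i = 0; i < n; i++) a[i] = n - i; // strictly decreasing: worst case
            min(a);
            System.out.println("n = " + n + "   steps = " + steps); // roughly 3n
        }
    }
}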

Running time of 1st algorithm

boolean isOnto( function f: (1, 2, …, n) → (1, 2, …, m) ){
  if( m > n ) return false            // 1 step, possibly followed by 1 step (return)
  soFarIsOnto = true                  // 1 step (assignment)
  for( j = 1 to m ){                  // m loops: 1 increment each, plus
    soFarIsOnto = false               //   1 step (assignment)
    for( i = 1 to n ){                //   n loops: 1 increment each, plus
      if ( f(i) == j )                //     1 step, possibly leading to
        soFarIsOnto = true            //     1 step (assignment)
    }
    if( !soFarIsOnto ) return false   //   1 step, possibly leading to 1 step (return)
  }
  return true                         // possibly 1 step
}

WORST-CASE running time: number of steps
= 1 (if m > n), OR
= at most 1 + 1 + m·( 1 + 1 + n·(1 + 1 + 1 + 1 + 1) + 1 ) = 5mn + 3m + 2.

Running time of 2nd algorithm

boolean isOntoB( function f: (1, 2, …, n) → (1, 2, …, m) ){
  if( m > n ) return false            // 1 step, possibly followed by 1 step (return)
  for( j = 1 to m )                   // m loops: 1 increment each, plus
    beenHit[ j ] = false              //   1 step (assignment)
  for( i = 1 to n )                   // n loops: 1 increment each, plus
    beenHit[ f(i) ] = true            //   1 step (assignment)
  for( j = 1 to m )                   // m loops: 1 increment each, plus
    if( !beenHit[ j ] ) return false  //   1 step, possibly leading to 1 step (return)
  return true                         // possibly 1 step
}

WORST-CASE running time: number of steps
= 1 (if m > n), OR
= at most 1 + m·(1 + 1) + n·(1 + 1) + m·(1 + 1 + 1) + 1 = 5m + 2n + 2.

Comparing Running Times

1. At most 5mn + 3m + 2 for the first algorithm
2. At most 5m + 2n + 2 for the second algorithm

The worst case is when m ≤ n (otherwise both return immediately), so replace m by n:
  5n² + 3n + 2   vs.   7n + 2

To tell which is better, look at the dominant terms: 5n² vs. 7n.
So the second algorithm is better.
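A small illustration (not from the slides) that tabulates the two worst-case counts with m replaced by n, showing how quickly the quadratic dominant term takes over.

// Illustration: compare the two worst-case step counts as n grows.
// Uses 7n + 2 for the second algorithm, i.e. 5m + 2n + 2 with m = n.
public class CompareCounts {
    public static void main(String[] args) {
        for (long n : new long[]{1, 10, 100, 1000, 10000}) {
            long first  = 5*n*n + 3*n + 2;   // first surjectivity algorithm
            long second = 7*n + 2;           // improved algorithm
            System.out.printf("n = %6d   5n^2+3n+2 = %12d   7n+2 = %8d%n", n, first, second);
        }
    }
}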

Comparing Running Times. Issues

1. 5n² + 3n + 2 and 7n + 2 are more than just their biggest terms; consider n = 1.
2. The number of basic steps doesn't give an accurate running time.
3. The actual running time depends on the platform.
4. We overestimated the number of steps: under some conditions, portions of the code are never executed.

Running Time Issues: Big-O Response

Asymptotic notation (Big-O, Big-Ω, Big-Θ) gives a partial resolution to these problems:
1. For large n the largest term dominates, so 5n² + 3n + 2 is modeled by just n².

Running Time Issues: Big-O Response

Asymptotic notation (Big-O, Big-Ω, Big-Θ) gives a partial resolution to these problems:
2. Basic steps of different lengths just change 5n² to Cn² for some constant C, which doesn't change the largest term.

Running Time Issues: Big-O Response

Asymptotic notation (Big-O, Big-Ω, Big-Θ) gives a partial resolution to these problems:
3. Basic operations on different (but well-designed) platforms differ by a constant factor. Again, this changes 5n² to Cn² for some constant C.

Running Time Issues: Big-O Response

Asymptotic notation (Big-O, Big-Ω, Big-Θ) gives a partial resolution to these problems:
4. Even if we overestimated by counting iterations of loops that never occur, we may still be able to show that the overestimate only amounts to a different constant multiple of the largest term.

Worst Case vs. Average Case


Worst-case complexity provides absolute guarantees on how long a program will run: the worst-case complexity as a function of n is the longest possible running time over all inputs of size n.

Average-case complexity is suitable if a small function is repeated often, or if it is acceptable to take a long time very rarely: the average-case complexity as a function of n is the average running time over all possible inputs of that length. Average-case analysis usually requires probability theory (delayed until later).

Section 1.8 Big-O, Big-Ω, Big-Θ


Useful for computing algorithmic complexity, i.e. the amount of time it takes for a computer program to run.


Notational Issues
Big-O notation is a way of comparing functions. The notation is unconventional.
EG: 3x³ + 5x² − 9 = O(x³)
This doesn't mean that 3x³ + 5x² − 9 equals the function O(x³); what it actually means is that 3x³ + 5x² − 9 is dominated by x³.
Read as: "3x³ + 5x² − 9 is big-O of x³".

Intuitive Notion of Big-O


Asymptotic notation captures the behavior of functions for large values of x.
EG: The dominant term of 3x³ + 5x² − 9 is the x³ term. As x becomes larger and larger, the other terms become insignificant and only x³ remains in the picture:

Intuitive Notion of Big-O

[Plots of y = 3x³ + 5x² − 9 together with the comparison curves y = x³, y = x², and y = x over the domains [0,2], [0,5], [0,10], and [0,100]: as the domain grows, the cubic curves pull far ahead of the lower-order ones.]

Intuitive Notion of Big-O

In fact, 3x³ + 5x² − 9 is smaller than 5x³ for large enough values of x:

[Plot of y = 5x³ and y = 3x³ + 5x² − 9, with the same lower-order comparison curves, showing 5x³ eventually on top.]

Big-O. Formal Definition

f(x) is asymptotically dominated by g(x) if some constant multiple of g(x) is bigger than f(x) as x goes to infinity:

DEF: Let f, g be functions with domain R≥0 or N and codomain R. If there are constants C and k such that

  ∀ x > k,  |f(x)| ≤ C·|g(x)|

then we write f(x) = O( g(x) ).

Common Misunderstanding
It's true that 3x³ + 5x² − 9 = O(x³), as we'll prove shortly. However, the following are also true:

3x³ + 5x² − 9 = O(x⁴)
x³ = O(3x³ + 5x² − 9)
sin(x) = O(x⁴)

NOTE: C.S. usage of big-O typically involves mentioning only the most dominant term, as in "the running time is O(x^2.5)". Mathematically, big-O is more subtle.

Big-O. Example

EG: Show that 3x³ + 5x² − 9 = O(x³).
The previous graphs suggest C = 5 is a good guess. Find k so that 3x³ + 5x² − 9 ≤ 5x³ for x > k:
1. Collect terms: the inequality is equivalent to 5x² ≤ 2x³ + 9.
2. What k will make 5x² ≤ x³ for x > k?
3. k = 5 works.
4. So for x > 5:  5x² ≤ x³ ≤ 2x³ + 9.
5. Solution: C = 5, k = 5 (not unique!)
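A numerical sanity check of the witnesses just found (an illustration, not a proof and not from the slides): sample values of x > 5 and confirm the inequality.

// Spot-check (not a proof): verify 3x^3 + 5x^2 - 9 <= 5x^3 for sampled x > 5,
// i.e. the witnesses C = 5, k = 5 found above.
public class BigOCheck {
    public static void main(String[] args) {
        for (double x = 5.001; x <= 1000; x += 0.5) {
            double f = 3*x*x*x + 5*x*x - 9;
            double g = 5*x*x*x;
            if (f > g) {                       // should never happen for x > 5
                System.out.println("violated at x = " + x);
                return;
            }
        }
        System.out.println("3x^3 + 5x^2 - 9 <= 5x^3 held for all sampled x in (5, 1000]");
    }
}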

Big-O. Negative Example

x⁴ ≠ O(3x³ + 5x² − 9): no pair C, k exists for which x > k implies C·(3x³ + 5x² − 9) ≥ x⁴.
Argue using limits:

\[
\lim_{x\to\infty}\frac{x^4}{C\,(3x^3+5x^2-9)}
= \lim_{x\to\infty}\frac{x}{C\,(3+5/x-9/x^3)}
= \lim_{x\to\infty}\frac{x}{C\,(3+0-0)}
= \frac{1}{3C}\lim_{x\to\infty} x
= \infty
\]

x⁴ always catches up, regardless of C.

Big-O and limits

LEMMA: If the limit as x → ∞ of the quotient |f(x) / g(x)| exists (as a finite number), then f(x) = O( g(x) ).

EG: 3x³ + 5x² − 9 = O(x³). Compute:

\[
\lim_{x\to\infty}\frac{3x^3+5x^2-9}{x^3}
= \lim_{x\to\infty}\frac{3+5/x-9/x^3}{1}
= 3
\]

so the big-O relationship is proved.

Little-o and limits

DEF: If the limit as x → ∞ of the quotient |f(x) / g(x)| is 0, then f(x) = o( g(x) ).

EG: 3x³ + 5x² − 9 = o(x^3.1). Compute:

\[
\lim_{x\to\infty}\frac{3x^3+5x^2-9}{x^{3.1}}
= \lim_{x\to\infty}\frac{3/x^{0.1}+5/x^{1.1}-9/x^{3.1}}{1}
= 0
\]
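A standard fact worth noting here (not on the original slide): little-o is strictly stronger than big-O. f(x) = o(g(x)) implies f(x) = O(g(x)) (take C = 1 and k large enough), but not conversely:

\[
\lim_{x\to\infty}\frac{3x^3}{x^3} = 3 \neq 0,
\quad\text{so } 3x^3 = O(x^3) \text{ but } 3x^3 \neq o(x^3).
\]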

Big-Ω and Big-Θ

Big-Ω: the reverse of big-O. I.e.,
  f(x) = Ω(g(x))  ⟺  g(x) = O(f(x)),
so f(x) asymptotically dominates g(x).

Big-Θ: domination in both directions. I.e.,
  f(x) = Θ(g(x))  ⟺  f(x) = O(g(x)) and f(x) = Ω(g(x)).

A synonym for f = Θ(g): "f is of order g".
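For reference, the same style of formal definition as for big-O (a standard formulation, not spelled out on the original slide):

\[
f(x) = \Omega(g(x)) \iff \exists C > 0,\ \exists k\ \forall x > k:\ |f(x)| \ge C\,|g(x)|
\]
\[
f(x) = \Theta(g(x)) \iff f(x) = O(g(x)) \ \text{and}\ f(x) = \Omega(g(x))
\]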

Useful facts
Any polynomial is big-Θ of its largest term.
  EG: x⁴/100000 + 3x³ + 5x² − 9 = Θ(x⁴)

The sum of two functions is big-O of the biggest.
  EG: x⁴·ln(x) + x⁵ = O(x⁵)

Non-zero constant multiples are irrelevant.
  EG: 17x⁴·ln(x) = O(x⁴·ln(x))

Big-O, Big-Ω, Big-Θ. Examples

Q: Order the following from smallest to largest asymptotically. Group together all functions which are big-Θ of each other:

  x + sin x,  ln x,  x + √x,  1/x,  13 + 1/x,  13 + x,  e^x,  x^e,  (x + sin x)(x²⁰ + 102),  x·ln x,  x·(ln x)²,  log₂ x,  x^x

A:
1. 1/x
2. 13 + 1/x
3. ln x,  log₂ x  (change-of-base formula)
4. x + sin x,  x + √x,  13 + x
5. x·ln x
6. x·(ln x)²
7. x^e
8. (x + sin x)(x²⁰ + 102)
9. e^x
10. x^x
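The grouping of ln x with log₂ x in step 3 is just the change-of-base formula:

\[
\log_2 x = \frac{\ln x}{\ln 2},
\]

so the two differ by the constant factor 1/ln 2 and are big-Θ of each other.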

Incomparable Functions
Given two functions f(x) and g(x), it is not always the case that one dominates the other; f and g may be asymptotically incomparable.
EG: f(x) = |x² sin(x)| vs. g(x) = 5x^1.5. Neither is big-O of the other: whenever sin(x) = ±1 we have f(x) = x², which eventually exceeds any constant multiple of 5x^1.5, yet f(x) drops back to 0 at every multiple of π, so no constant multiple of f stays above g either.

Incomparable Functions

[Plots of y = x², y = |x² sin(x)|, and y = 5x^1.5, first over [0, 50] and then over roughly [0, 200] (y up to about 4×10⁴): |x² sin(x)| oscillates between 0 and x², repeatedly crossing 5x^1.5.]

Big-O A Grain of Salt


Big-O notation gives a good first guess for deciding which algorithms are faster. In practice, the guess isn't always correct. Consider the time functions n⁶ vs. 1000·n^5.9. Asymptotically, the second is better. One often catches such examples of purported advances in theoretical computer science publications. The following graph shows the relative performance of the two algorithms:

Big-O A Grain of Salt

[Plot of running time in days against input size n for T(n) = 1000·n^5.9 and T(n) = n⁶, assuming each operation takes a nanosecond (a 1 GHz computer): over the plotted range of realistic input sizes, the 1000·n^5.9 curve lies far above the n⁶ curve.]

Big-O A Grain of Salt


In fact, 1000·n^5.9 only catches up to n⁶ when 1000·n^5.9 = n⁶, i.e. when 1000 = n^0.1, i.e. when n = 1000¹⁰ = 10³⁰. At one nanosecond per operation, 10³⁰ operations take 10³⁰/10⁹ = 10²¹ seconds ≈ 10²¹/(3×10⁷) ≈ 3×10¹³ years ≈ 3×10¹³/(2×10¹⁰) ≈ 1500 universe lifetimes!
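The same back-of-the-envelope arithmetic as a runnable sketch (an illustration, not from the slides); the seconds-per-year and universe-lifetime figures are the rough constants used above.

// Sketch of the crossover arithmetic above (order-of-magnitude only).
public class GrainOfSalt {
    public static void main(String[] args) {
        double crossoverN = Math.pow(1000, 10);  // n with n^0.1 = 1000, i.e. 10^30
        double ops        = crossoverN;          // even touching each of the 10^30 inputs once...
        double seconds    = ops / 1e9;           // ...at 1 ns per operation (1 GHz)
        double years      = seconds / 3.0e7;     // ~3*10^7 seconds per year
        double lifetimes  = years / 2.0e10;      // ~2*10^10 years per "universe lifetime"
        System.out.printf("crossover n        ~ %.1e%n", crossoverN);
        System.out.printf("seconds            ~ %.1e%n", seconds);
        System.out.printf("years              ~ %.1e%n", years);
        System.out.printf("universe lifetimes ~ %.0f%n", lifetimes);
    }
}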

Example for Section 1.8


Link to an example proving big-Omega of a sum.

