Lesson 1


Prof. Young Park
Problem → Solution → Program (runs on a computer)
• A problem is a question to which we seek an answer.
  ▪ Example 1.1: sorting
  ▪ Example 1.2: searching
• An instance of a problem is a specific case (input) of the problem.
  ▪ Example 1.3
  ▪ Example 1.4
• Computational problems

• The sorting problem
• The searching problem
• The selection problem
• …

• Input (and store) data.
• Process data (algorithms).
• Output data.

• Simple data
  ▪ Consists of atomic data items (values).
• Structured data
  ▪ Consists of collections of data items (values), all related to each other in certain ways.
• Data structures
  ▪ A particular way of storing and organizing data so that it can be used efficiently.

• Arrays
• Linked lists
• Stacks
• Queues
• Binary heaps
• Hash tables
• Binary trees
• Binary search trees

• Skip lists
• Self-organizing lists
• Graphs
• Leftist heaps
• Skew heaps
• AVL trees
• Splay trees
• 2-3 trees
• 2-3-4 trees
• Red-black trees
• B-trees

• Operations: manipulate the data items in the data structures (mini-algorithms!).
• Algorithms: use these operations.
• An algorithm is a general step-by-step procedure for producing the solution to each instance of a problem.
  ▪ Example 1.5: Sequential Search

• Efficient data structures are a key to designing efficient algorithms!

• Divide-and-Conquer
• Dynamic Programming
• Greedy Approach
• Backtracking
• Branch-and-Bound
• Randomization

• Efficient Solution =
  ▪ Efficient Data Structures +
  ▪ Efficient Operations +
  ▪ Efficient Algorithms

• A program is an implementation of a solution (design)!

Program = Data (Structures) + Operations + Algorithms:

  Data#1 (Structure#1) with Operations#1
  Data#2 (Structure#2) with Operations#2
  Data#3 (Structure#3) with Operations#3
  ...
  Data#n (Structure#n) with Operations#n

• Iterative algorithms
  ▪ Using iteration
• Recursive algorithms
  ▪ Using recursion

• Measure the efficiency of an algorithm in terms of
  ▪ Time (required for an algorithm)
  ▪ Space (required for a data structure)

• Time complexity
• Space complexity

• Algorithm 1.1 Sequential Search – Iterative
• Algorithm 1.2 Add Array Members
• Algorithm 1.3 Exchange Sort
• Algorithm 1.4 Matrix Multiplication
• Algorithm 1.5 Binary Search – Iterative
• Algorithm 1.6 N-th Fibonacci Term – Recursive
• Algorithm 1.7 N-th Fibonacci Term – Iterative

• Algorithm 1.1 Sequential Search on an unsorted (unordered) array
  ▪ Linear time O(N)
• Can we improve?
  ▪ Idea?
  ▪ Sorted (ordered) for better search!

• Better data structures!
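A minimal C sketch of iterative sequential search, in the spirit of Algorithm 1.1 (the names and signature are illustrative, not the textbook's exact pseudocode):

/* Return the index of key in a[0..n-1], or -1 if it is absent. */
int sequential_search(const int a[], int n, int key)
{
    for (int i = 0; i < n; i++)    /* basic operation: the comparison */
        if (a[i] == key)
            return i;
    return -1;                     /* worst case: n comparisons, O(N) */
}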

• Algorithm 1.5 Binary Search – Iterative – on a sorted (ordered) array

• Divide-and-Conquer
• O(log N)
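A minimal C sketch of iterative binary search, in the spirit of Algorithm 1.5 (illustrative names; it assumes a[] is already sorted):

/* Return the index of key in the sorted array a[0..n-1], or -1. */
int binary_search(const int a[], int n, int key)
{
    int low = 0, high = n - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;  /* avoids overflow in (low+high)/2 */
        if (a[mid] == key)
            return mid;
        else if (a[mid] < key)
            low = mid + 1;                 /* discard the lower half */
        else
            high = mid - 1;                /* discard the upper half */
    }
    return -1;                             /* range halves each pass: O(log N) */
}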

• Write an iterative Sequential Search algorithm?
  ▪ Algorithm 1.1
• Write an iterative Binary Search algorithm?
  ▪ Algorithm 1.5

• (0,) 1, 1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, …

• fib(0) = 0
• fib(1) = 1
• fib(n) = fib(n−1) + fib(n−2) for n ≥ 2

• Algorithm 1.6 N-th Fibonacci Term (Recursive)
  ▪ Divide-and-Conquer
  ▪ Top-down
  ▪ Exponential time O(2^N)

• fib(n) = fib(n−1) + fib(n−2) for n ≥ 2
• fib(0) = 0
• fib(1) = 1

• Example: fib(5)?
  ▪ 5

• fib(5)? (The recursion tree for fib(5) recomputes the same subproblems many times.)

• This D&C-based recursive algorithm is extremely inefficient: exponential time O(2^N).

• Can we improve?
  ▪ Idea?
  ▪ Dynamic Programming

• Better algorithm paradigm!

• Algorithm 1.7 N-th Fibonacci Term (Iterative)
  ▪ Dynamic Programming
  ▪ Bottom-up
  ▪ Linear time O(N)

• The array F, indexed 0, 1, 2, …, n:
  ▪ F[0] = 0
  ▪ F[1] = 1
  ▪ F[i] = F[i−1] + F[i−2] for i ≥ 2

• Example: fib(5)?
  ▪ F[5]?
  ▪ 5
• The study of all possible algorithms that can solve a given problem.
• Determines a lower bound on the efficiency of all algorithms for a given problem.

• Lower bounds for fundamental problems:
  ▪ The sorting problem
  ▪ The searching problem
  ▪ The selection problem

• Tractable problems
  ▪ A problem is called tractable if an efficient (polynomial running time, P) algorithm is possible.
    ▪ The sorting problem
    ▪ The searching problem
    ▪ The selection problem
    ▪ …
• Intractable (hard) problems
  ▪ A problem is called intractable if no efficient algorithm is possible.
    ▪ NP and NP-complete problems
    ▪ NP-hard problems
    ▪ …

• Algorithm analysis measures the efficiency of an algorithm as a function of the input size.
• We want a measure that is independent of
  ▪ the computer,
  ▪ the programming language,
  ▪ the programmer, and
  ▪ all the complex details of the algorithm.

• In general, the running time of an algorithm increases with the size of the input.
• The total running time is proportional to how many times some basic operation is done.
• Therefore, we analyze an algorithm's time efficiency by determining the number of times some basic operation is done as a function of the size of the input.
• A time complexity analysis determines how many times the basic operation is done for each value of the input size.
• There is no hard and fast rule for choosing the basic operation.
  ▪ It is largely a matter of judgment and experience.

• Examples: basic operations?
  ▪ Algorithm 1.1 Sequential Search – Iterative
  ▪ Algorithm 1.2 Add Array Members
  ▪ Algorithm 1.3 Exchange Sort
  ▪ Algorithm 1.4 Matrix Multiplication
• Every-Case Time
• Worst-Case Time
• Average-Case Time
• Best-Case Time

• T(n) = the number of times the algorithm does the basic operation for an instance of size n.
  ▪ Called the every-case time complexity of the algorithm.
• Analysis of Algorithm 1.2 (Add Array Members)
• Analysis of Algorithm 1.3 (Exchange Sort)
• Analysis of Algorithm 1.4 (Matrix Multiplication)
• W(n) = the maximum number of times the algorithm will ever do its basic operation for an input size of n.
  ▪ Called the worst-case time complexity of the algorithm.
• Analysis of Algorithm 1.1 (Sequential Search)

• A(n) = the average (expected) number of times the algorithm does the basic operation for an input size of n.
  ▪ Called the average-case time complexity of the algorithm.
• Analysis of Algorithm 1.1 (Sequential Search)

• B(n) = the minimum number of times the algorithm will ever do its basic operation for an input size of n.
  ▪ Called the best-case time complexity of the algorithm.
• Analysis of Algorithm 1.1 (Sequential Search)

• The asymptotic behavior of an algorithm: its behavior for very large problem sizes.
  ▪ How quickly does the algorithm's time/space requirement grow as a function of the problem size?
• Measure the efficiency of an algorithm by the growth rate of that requirement.
• An estimating technique!
  ▪ But it has proved to be useful!

• The asymptotic running time of algorithm A for problem size n: GrowthRateTime(n)

The asymptotic efficiency of an algorithm =
the growth rate of its running time as a function of the problem size.

How does it grow?

• Types of algorithms?
• The time complexity (running time) of an algorithm?
• Time complexity cases?
• Asymptotic behavior?
• Why not the exact behavior of an algorithm?

• Upper Bound
• Lower Bound

• For an asymptotic upper bound:
  ▪ Big-O
• For an asymptotic lower bound:
  ▪ Big-Ω

• An asymptotic bound, as a function of the input size, on the worst (slowest, most space used) that an algorithm will take to solve a problem.
  ▪ No input will cause the algorithm to use more resources than the bound.

• Let f(n) be a function which is non-negative for all integers n ≥ 0.

f(n) = O(g(n))   (also written f(n) ∈ O(g(n)))
"f(n) is big-oh of g(n)"
if
there exist a constant c > 0 and a constant n₀
such that
f(n) ≤ c · g(n) for all integers n ≥ n₀.
• Drop all but the most significant term.
  ▪ O(n² + n log n + n + 1) = O(n²)
  ▪ O(n log n + n + 1) = O(n log n)   (What dominates?)
  ▪ O(n + 1) = O(n)
• Drop constant coefficients.
  ▪ O(2n²) = O(n²)
  ▪ O(1024) = O(1)

• log n = O(n)
• n = O(n)
• 100n + 10 log n = O(n)   (see the check below)

• Example 1.7
• Example 1.8
• Example 1.11
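A quick check of the last claim against the definition (the constants c = 110 and n₀ = 1 are our own choice; any pair that works will do):

\[
100n + 10\log n \;\le\; 100n + 10n \;=\; 110n \quad \text{for all } n \ge 1,
\]

so 100n + 10 log n = O(n) with c = 110 and n₀ = 1.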

• A constant growth rate: O(1)
• A logarithmic growth rate: O(log N)
• A log-squared growth rate: O(log² N)
• A linear growth rate: O(N)
• A linear-logarithmic growth rate: O(N log N)
• A quadratic growth rate: O(N²)
• A cubic growth rate: O(N³)
• A polynomial growth rate: O(N^k) for a constant k
• An exponential growth rate: O(2^N)
• A factorial growth rate: O(N!)

From fastest-growing to slowest-growing:

O(N!) > O(2^N) > O(N³) > O(N²) > O(N log N) > O(N) > O(log² N) > O(log N) > O(log(log N)) > O(1)

• Figure 1.3

• O(1)
• O(log N)
• O(N)
• O(N log N)
• O(N²)
• O(2^N)

• The amount of space or time is independent of the amount of data.
  ▪ If the amount of data doubles, the amount of space or time will stay the same!
• Example:
  ▪ An item can be added to the beginning of a linked list of n items in constant time, independent of the number of items in the list.
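A minimal C sketch of the constant-time prepend (the struct and names are illustrative):

#include <stdlib.h>

struct node {
    int value;
    struct node *next;
};

/* Prepend a value in O(1), no matter how long the list already is. */
struct node *prepend(struct node *head, int value)
{
    struct node *n = malloc(sizeof *n);
    if (n == NULL)
        return head;    /* out of memory: leave the list unchanged */
    n->value = value;
    n->next = head;     /* the new node simply points at the old head */
    return n;           /* the new head */
}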

• If the amount of data doubles, the amount of space or time will increase by only one step!
• Example:
  ▪ The worst-case time for binary search is logarithmic in the size of the array of n items.

• If the amount of data doubles, the amount of space or time will also double!
• Example:
  ▪ The time needed to print all of the values stored in an array is linear in the size of the array of n items.

• If the amount of data doubles, the amount of space or time will quadruple!
• Example:
  ▪ The amount of space needed to store a two-dimensional square (N by N) array is quadratic in the number N of rows.

• If the amount of data increases by 1, the amount of space or time will double!
• Example:
  ▪ The number of moves required to solve the Towers of Hanoi puzzle is exponential in the number N of disks used.
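A minimal C sketch of the Towers of Hanoi recursion (illustrative names). Moving N disks takes 2^N − 1 moves, so adding one disk doubles the work:

#include <stdio.h>

/* Move n disks from peg 'from' to peg 'to', using peg 'via' as a spare. */
void hanoi(unsigned int n, char from, char to, char via)
{
    if (n == 0)
        return;
    hanoi(n - 1, from, via, to);                      /* T(n) = 2T(n-1) + 1 */
    printf("move disk %u: %c -> %c\n", n, from, to);
    hanoi(n - 1, via, to, from);                      /* => 2^n - 1 moves   */
}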

• An asymptotic bound, as a function of the input size, on the best (fastest, least space used) that an algorithm will take to solve a problem.
  ▪ No algorithm can use fewer resources than the bound.

• Let f(n) be a function which is non-negative for all integers n ≥ 0.

f(n) = Ω(g(n))   (also written f(n) ∈ Ω(g(n)))
"f(n) is (big-)omega of g(n)"
if
there exist a constant c > 0 and a constant n₀
such that
c · g(n) ≤ f(n) for all integers n ≥ n₀.
• n = Ω(n)
• n² = Ω(n)
• 2^n = Ω(n)

• Example 1.13
• Example 1.14
• Example 1.15

• Let f(n) be a function which is non-negative for all integers n ≥ 0.

f(n) = Θ(g(n))   (also written f(n) ∈ Θ(g(n)))
"f(n) is (big-)theta of g(n)"
if and only if
f(n) is O(g(n)) and f(n) is Ω(g(n)).

• Polynomial functions
  ▪ If f(n) = a_m n^m + … + a_1 n + a_0 (with a_m > 0), then f(n) = O(n^m).
  ▪ If f(n) = a_m n^m + … + a_1 n + a_0 (with a_m > 0), then f(n) = Ω(n^m).
  ▪ If f(n) = a_m n^m + … + a_1 n + a_0 (with a_m > 0), then f(n) = Θ(n^m).

• Notations for asymptotic behavior?
• Big-O notation for the (asymptotic) upper bound?
• Big-Ω notation for the (asymptotic) lower bound?

The Order (Growth Rate) of an Algorithm is more important
than the Speed of a Computer.

• Depends on the type of algorithm!

1. Iterative algorithms
  ▪ Summation
2. Recursive algorithms
  ▪ Recurrence equation/relation

• Iterative algorithms
  ▪ Summation

• A single assignment statement
  ▪ O(1)
• A simple expression
  ▪ O(1)
• Statement1; Statement2; …; Statementn
  ▪ The maximum of O(Statement1), O(Statement2), …, and O(Statementn).

• IF Condition THEN Statement1 ELSE Statement2;
  ▪ O(Condition) + the maximum of O(Statement1) and O(Statement2).
  ▪ Equivalently, the maximum of O(Condition), O(Statement1), and O(Statement2).

• FOR (i=1; i<=N; i++) Statement
  ▪ O(N × Statement), where N = the number of loop iterations.
• FOR (S1; S2; S3) Statement
  ▪ O(S1 + S2 × (N+1) + S3 × N + Statement × N)
  ▪ Equivalently, the maximum of O(S1), O(S2 × (N+1)), O(S3 × N), and O(Statement × N).

• WHILE (Condition) Statement
  ▪ O(N × Statement), where N = the number of loop iterations.

sum = 0;
for (i = 1; i <= n; i++)        /* Σ (i=1..n) 1 = 1+1+…+1 = n */
    sum += n;

Total = O(n)

sum1 = 0;
for (i = 1; i <= n; i++)        /* Σ (i=1..n) Σ (j=1..n) 1    */
    for (j = 1; j <= n; j++)    /*   = n + n + … + n = n²     */
        sum1++;

Total = O(n²)
sum2 = 0;
for (i = 1; i <= n; i++)        /* Σ (i=1..n) Σ (j=1..i) 1    */
    for (j = 1; j <= i; j++)    /*   = 1+2+3+…+n = n(n+1)/2   */
        sum2++;

Total = O(n²)

sum1 = 0;
for (i = 1; i <= n; i++)        /* O(n²) */
    for (j = 1; j <= n; j++)
        sum1++;
sum2 = 0;
for (i = 1; i <= n; i++)        /* O(n²) */
    for (j = 1; j <= i; j++)
        sum2++;

Total = O(n²)
sum = 0;
for (j = 1; j <= n; j++)        /* O(n²) */
    for (i = 1; i <= j; i++)
        sum++;
for (k = 0; k <= n; k++)        /* O(n)  */
    A[k] = k;

Total = O(n²)

Assume n = 2^k.

sum1 = 0;
for (i = 1; i <= n; i *= 2)     /* i = 1, 2, 4, 8, …, n = 2⁰, 2¹, 2², …, 2^(log n) */
    for (j = 1; j <= n; j++)    /* Σ (i=0..log n) Σ (j=1..n) 1 = n(log n + 1)      */
        sum1++;

Total = O(n log n)

Assume n = 2^k.

sum2 = 0;
for (i = 1; i <= n; i *= 2)     /* i = 1, 2, 4, 8, …, n                  */
    for (j = 1; j <= i; j++)    /* Σ (i∈{1,2,4,…,n}) Σ (j=1..i) 1        */
        sum2++;                 /*   = 1+2+4+8+…+n = 2⁰+2¹+…+2^(log n)   */

Σ (i=0..log n) 2^i = 2^(log n + 1) − 1 = 2n − 1
(Σ (i=0..n) 2^i = 2^(n+1) − 1: Example A.3; 2^(log₂ n) = n: Example A.8)

Total = O(n)

sum1 = 0;
for (i = 1; i <= n; i *= 2)     /* O(n log n) */
    for (j = 1; j <= n; j++)
        sum1++;
sum2 = 0;
for (i = 1; i <= n; i *= 2)     /* O(n)       */
    for (j = 1; j <= i; j++)
        sum2++;

Total = O(n log n)
• Algorithm 1.5
  ▪ Binary Search – Iterative

• Calculate Big-O?
  ▪ O(log N)

• Recursive algorithms
  ▪ Recurrence equation/relation

• The running time of a recursive algorithm can often be described by a recurrence relation or equation.
  ▪ A mathematical formula that generates the terms in a sequence from previous terms.

unsigned int Factorial(unsigned int n)
{
    if (n == 0)                       /* O(1) */
        return 1;                     /* O(1) */
    else
        return n * Factorial(n - 1);  /* T(n-1) + O(1) */
}

Recurrence Equation:

T(n) = O(1) if n = 0
T(n) = T(n-1) + O(1) if n > 0

Recurrence Equation: T(n) = O(1) if n = 0; T(n) = T(n-1) + O(1) if n > 0.

Solve by repeated substitution:

T(n) = T(n-1) + O(1)
     = T(n-2) + O(1) + O(1)
     = T(n-3) + O(1) + O(1) + O(1)
     = T(n-4) + O(1) + O(1) + O(1) + O(1)
     ⋮
     = T(n-n) + O(1) + O(1) + … + O(1)
     = O(1) + n × O(1)                    (Σ (i=1..n) 1 = n)
     = O(n)
unsigned int Factorial(unsigned int n)
{
    if (n == 0)
        return 1;
    else
        return n * Factorial(n - 1);
}

T(n) = O(1) if n = 0
T(n) = T(n-1) + O(1) if n > 0

Total = O(n)
• T(n) = T(n-1) + 1 if n > 0, T(0) = 1

T(n) = T(n-1) + 1
     = T(n-2) + 1 + 1
     = T(n-3) + 1 + 1 + 1
     = T(n-4) + 1 + 1 + 1 + 1
     ⋮
     = T(n-n) + 1 + 1 + … + 1
     = 1 + n × 1
     = 1 + n
     = O(n)
• T(n) = T(n-1) + n if n > 0, T(0) = 1

T(n) = T(n-1) + n
     = T(n-2) + (n-1) + n
     = T(n-3) + (n-2) + (n-1) + n
     = T(n-4) + (n-3) + (n-2) + (n-1) + n
     ⋮
     = T(n-n) + 1 + 2 + … + (n-2) + (n-1) + n
     = 1 + n(n+1)/2
     = O(n²)

• Example B.21
• T(n) = T(n/2) + 1 if n > 1, T(1) = 1

T(n) = T(n/2¹) + 1
     = T(n/2²) + 1 + 1
     = T(n/2³) + 1 + 1 + 1
     = T(n/2⁴) + 1 + 1 + 1 + 1
     ⋮
     = T(n/2^(log n)) + 1 + 1 + … + 1
     = 1 + log n × 1
     = 1 + log n
     = O(log n)

• Example B.1
• T(n) = 2 T(n-1), T(0) = 1
  ▪ O(2^n)
• T(n) = n T(n-1), T(0) = 1
  ▪ O(n!)

• T(n) = a T(n/b) + c n^k, T(1) = d

• T(n) = O(n^k) if a < b^k
• T(n) = O(n^k log n) if a = b^k
• T(n) = O(n^(log_b a)) if a > b^k

• Theorem B.5 (A Master Theorem)
• Example B.26
• Example B.27
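A quick sanity check of the middle case, using a recurrence that is solved by substitution on a later slide:

\[
T(n) = 2T(n/2) + n:\quad a = 2,\; b = 2,\; k = 1 \;\Rightarrow\; a = b^k \;\Rightarrow\; T(n) = O(n^k \log n) = O(n \log n).
\]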
• T(n) = T(n/2) + 1 if n > 1, T(1) = 1

• a = 1, b = 2, k = 0  =>  the a = b^k case!
  ▪ O(n⁰ log n) = O(log n)
• Example B.1

• T(n) = 2 T(n/2) + n, T(1) = 1
  ▪ O(n log n)
• T(n) = 2 T(n/2) + 1, T(1) = 1
  ▪ O(n)

• T(n) = 2 T(n/2) + n, T(1) = 1:

T(n) = 2 T(n/2¹) + n
     = 2 [2 T(n/2²) + n/2] + n  =  2² T(n/2²) + n + n
     = 2² [2 T(n/2³) + n/2²] + n + n  =  2³ T(n/2³) + n + n + n
     ⋮
     = 2^(log n) T(n/2^(log n)) + n + … + n + n
     = n T(1) + n log n  =  n + n log n  =  O(n log n)

• T(n) = 2 T(n/2) + 1, T(1) = 1:

T(n) = 2 T(n/2¹) + 1
     = 2 [2 T(n/2²) + 1] + 1  =  2² T(n/2²) + 2 + 1
     = 2² [2 T(n/2³) + 1] + 2 + 1  =  2³ T(n/2³) + 2² + 2 + 1
     ⋮
     = 2^(log n) T(n/2^(log n)) + 2^(log n − 1) + … + 2² + 2 + 1
     = 2^(log n) + 2^(log n − 1) + … + 2 + 1  =  2^(log n + 1) − 1  =  2n − 1  =  O(n)
int recursive(int n)
{
    if (n == 1)
        return 1;
    else
        return recursive(n - 1) + recursive(n - 1);
}

T(n) = 2 T(n-1) + 1
O(2^n)
• Algorithm 2.1
  ▪ Binary Search – Recursive

• Calculate Big-O?

T(n) = 1 if n = 1
T(n) = T(n/2) + 1 if n > 1

• O(log N)
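A minimal C sketch of the recursive version, in the spirit of Algorithm 2.1 (illustrative names; assumes a sorted array), matching the recurrence above:

/* Search the sorted array a[low..high] for key; return its index or -1. */
int binary_search_rec(const int a[], int low, int high, int key)
{
    if (low > high)
        return -1;                                        /* empty range */
    int mid = low + (high - low) / 2;
    if (a[mid] == key)
        return mid;
    else if (a[mid] < key)
        return binary_search_rec(a, mid + 1, high, key);  /* T(n/2) */
    else
        return binary_search_rec(a, low, mid - 1, key);   /* T(n/2) */
}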

• T(n) = O(1) if n = 1
• T(n) = T(n/2) + O(n) if n > 1

• T(n) = O(1) if n = 1
• T(n) = T(n/3) + O(1) if n > 1

• T(n) = O(1) if n = 1
• T(n) = 8 T(n/2) if n > 1

x^(log_a y) = y^(log_a x)   (Example A.8)

• T(n) = O(1) if n = 1
• T(n) = 8 T(n/2) + n² if n > 1

• T(n) = O(1) if n = 1
• T(n) = 7 T(n/2) if n > 1

• T(n) = O(1) if n = 1
• T(n) = 7 T(n/2) + n² if n > 1

• T(n) = O(1) if n = 1
• T(n) = 2 T(n-1) + O(1) if n > 1

• T(n) = 0 if n = 1
• T(n) = T(n-1) + 2/n if n > 1

• Example B.22
• Example A.8

• Exercise 25 (Appendix B)

 T(n) = T(n-1) + (1)
 T(n) = O(n)

 T(n) = T(n-1) + (n)


 T(n) = O(n2)

103
 T(n) = T(n/2) + (1)
 T(n) = O(log n)

104
 T(n) = T(n/2) + (1)
 T(n) = O(log n)

 T(n) = 2 T(n/2) + (n)


 T(n) = O(n log n)

105
106
• If an O(n²) algorithm takes 3.1 msec to run on an array of 200 elements, how long would you expect it to take to run on a similar array of 40,000 elements?

c × 200² = 3.1 msec  =>  c = 3.1 / 200²

c × 40000² = (3.1 / 200²) × 40000² = 124,000 msec

124,000 msec = 124 seconds

• If an O(n log n) algorithm takes 3.1 msec to run on an array of 200 elements, how long would you expect it to take to run on a similar array of 40,000 elements?
  ▪ 1240 msec = 1.24 seconds

c × (200 log 200) = 3.1 msec  =>  c = 3.1 / (200 log 200)

c × (40000 log 40000) = (3.1 / (200 log 200)) × (40000 log 40000) = 1240 msec

• If an O(n) algorithm takes 3.1 msec to run on an array of 200 elements, how long would you expect it to take to run on a similar array of 40,000 elements?
  ▪ 620 msec = 0.620 seconds

c × 200 = 3.1 msec  =>  c = 3.1 / 200

c × 40000 = (3.1 / 200) × 40000 = 620 msec

• If an O(log n) algorithm takes 3.1 msec to run on an array of 200 elements, how long would you expect it to take to run on a similar array of 40,000 elements?
  ▪ 6.2 msec = 0.0062 seconds

c × log 200 = 3.1 msec  =>  c = 3.1 / log 200

c × log 40000 = (3.1 / log 200) × log 40000 = 6.2 msec

• Suppose you have a computer that requires 1 minute to solve problem instances of size n = 1,000. Suppose you buy a new computer that runs 1,000 times faster than the old one. What instance sizes can be run in 1 minute, assuming the time complexity T(n) = n for our algorithm?

c × 1000 = 1 min = (c / 1000) × n

n = 10⁶
• Suppose you have a computer that requires 1 minute to solve problem instances of size n = 1,000. Suppose you buy a new computer that runs 1,000 times faster than the old one. What instance sizes can be run in 1 minute, assuming the time complexity T(n) = n² for our algorithm?
  ▪ n = 10^4.5

c × 1000² = 1 min = (c / 1000) × n²  =>  n² = 1000 × 1000² = 10⁹  =>  n = 10^4.5 ≈ 31,623

• So far, we considered the worst/average cost of a single operation.
• How about the cost of a series/sequence of N operations?
  ▪ N times the worst-case cost of any one operation?
  ▪ Or a better bound?
• Observation:
  ▪ Any one particular operation may be slow, but the average time over a sufficiently large number of operations is fast.

• The amortized cost of an operation is the total cost of the N operations divided by N.

• Amortized analysis!

• Self-adjusting data structures!
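A minimal C sketch of the classic amortized-analysis example (not from these slides): appending to a dynamic array that doubles its capacity when full. A single append that triggers a regrow costs O(n), but any sequence of N appends costs O(N) in total, so the amortized cost per append is O(1):

#include <stdlib.h>

struct vec {
    int *data;
    int size, cap;
};

/* Append x; occasionally O(n) to regrow, but amortized O(1). */
void vec_push(struct vec *v, int x)
{
    if (v->size == v->cap) {
        int newcap = v->cap ? 2 * v->cap : 1;     /* double the capacity */
        int *p = realloc(v->data, newcap * sizeof *p);
        if (p == NULL)
            abort();    /* out of memory: bail out in this sketch */
        v->data = p;
        v->cap = newcap;
    }
    v->data[v->size++] = x;    /* the common case: one write, O(1) */
}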

• Amortized behavior?
• Asymptotic behavior?

• Space requirements for the data structure itself.

One can often achieve a reduction in time requirements if one is willing to sacrifice space requirements, or vice versa (a time-space tradeoff).

• Chapter 1:
  ▪ 1.1
  ▪ 1.2
  ▪ 1.3 (1.3.1 & 1.3.2 only)
  ▪ 1.4 (1.4.1 & 1.4.2 only)
  ▪ 1.5
• Appendix A & B:
  ▪ A.1, A.5
  ▪ B.3

Prof. Young Park