Module 1 Introduction to Algorithms.pptx

Analysis of Algorithms

Introduction to
Algorithms

1
Analysis of Algorithms

• Module 1: Introduction to Algorithms


• Module 2: Divide and Conquer Approach
• Module 3: Greedy Approach
• Module 4: Dynamic Programming
• Module 5: Backtracking & Branch and Bound
• Module 6: String Matching

2
Learning Objectives

• (1) What is an algorithm?
• (2) Characteristics of algorithms
• (3) Algorithm examples
• (4) Performance analysis
• (5) Asymptotic analysis

3
Algorithms
• An algorithm is a finite set of precise instructions for performing a computation or for solving a problem.
• "A procedure for solving a mathematical problem in a finite number of steps that frequently involves repetition of an operation."
• "An algorithm is a finite, definite, effective procedure, with some input and some output."

4
Algorithms (Computer)
• A finite set of instructions, defined in a specific order, to do a certain computation and carry out some predefined task.
• It is a step-by-step procedure to solve an intended problem.
• An algorithm is an abstraction of a program to be executed on a physical machine (model of computation).

5
Eg. Euclid's algorithm
• Eg. Finding the GCD of two numbers

int GCD(int a, int b)
{
    if (b == 0)
        return a;
    return GCD(b, a % b);
}

• GCD(270, 192) = GCD(192, 78) = GCD(78, 36) = GCD(36, 6) = GCD(6, 0) = 6
• GCD(270, 192) = 6
7
Problem Solving Process

Problem → Problem Specification → Algorithm → Program → Output (Solution)

7
Characteristics of Algorithm
• Input: Zero or more quantities are externally supplied
• Output: At least one quantity is produced
• Definiteness: Each instruction must be clear and unambiguous
• Finiteness: The algorithm must terminate after a finite number of steps
• Effectiveness: Every instruction must be basic (simple instructions)

8
Classic algorithms
• Basic data structure implementation
• Sorting
• Searching
• Tree algorithms
• Graph algorithms
• String processing

Emphasizes critical thinking, problem-solving, and fast code.
9
Why to study algorithms?
• Internet: Web searching, routing, distributed file sharing etc.
• Computers: Databases, networking, weather forecasting etc.
• Computer Graphics: Virtual reality, creating video games etc.
• Social Networks: Recommendations, advertising, news feeds
• Multimedia: MP3, JPG, face recognition
• Healthcare: Finding structural patterns to identify and cure diseases
• Security: Cell phones, e-commerce, voting machines, …
• For building smart robots

10
Design of algorithms
• Divide-and-conquer
• Greedy approach
• Dynamic programming
• Branch and bound
• Backtracking
• String matching
• Iterative algorithms
• Recursive algorithms
• Brute force method
• Network flow
• Randomized algorithms
• Approximation algorithms

12
The problem of sorting

• Input: Sequence 〈a1, a2, …, an〉 of numbers.


• Output: Permutation 〈a'1, a'2, …, a'n〉 such
that
a'1 ≤ a'2 ≤ … ≤ a'n.
• Example:
• Input: 8 2 4 9 3 6
• Output: 2 3 4 6 8 9

12
Insertion Sort
• Idea: like sorting a hand of playing cards
• Start with an empty left hand and the cards facing down on
the table.
• Remove one card at a time from the table, and insert it into
the correct position in the left hand
• compare it with each of the cards already in the hand, from right
to left
• The cards held in the left hand are sorted
• these cards were originally the top cards of the pile on the table

13
Insertion Sort
To insert 12, we need to make room for it by moving first 36 and then 24.

14
Insertion Sort
(step-by-step insertion figures omitted)
16
Insertion Sort

input array:  5 2 4 6 1 3

At each iteration, the array is divided into two sub-arrays:
  left sub-array: sorted
  right sub-array: unsorted

17
Insertion Sort
(figure omitted)

18
Insertion Sort

Alg.: INSERTION-SORT(A)
    for j ← 2 to n
        do key ← A[j]
            ▹ Insert A[j] into the sorted sequence A[1 . . j-1]
            i ← j - 1
            while i > 0 and A[i] > key
                do A[i + 1] ← A[i]
                   i ← i - 1
            A[i + 1] ← key

• Insertion sort sorts the elements in place
19
Loop Invariant for Insertion Sort
Alg.: INSERTION-SORT(A)
    for j ← 2 to n
        do key ← A[j]
            ▹ Insert A[j] into the sorted sequence A[1 . . j-1]
            i ← j - 1
            while i > 0 and A[i] > key
                do A[i + 1] ← A[i]
                   i ← i - 1
            A[i + 1] ← key
Invariant: at the start of the for loop the elements in A[1 . . j-1]
are in sorted order
20
Proving Loop Invariants
• Proving loop invariants works like induction
• Initialization (base case):
• It is true prior to the first iteration of the loop

• Maintenance (inductive step):


• If it is true before an iteration of the loop, it remains true before the
next iteration

• Termination:
• When the loop terminates, the invariant gives us a useful property that
helps show that the algorithm is correct
• Stop the induction when the loop terminates

21
Loop Invariant for Insertion Sort
• Initialization:
• Just before the first iteration, j = 2: the subarray A[1 . . j-1] = A[1] (the element originally in A[1]) is sorted

22
Loop Invariant for Insertion Sort
• Maintenance:
• the inner while loop moves A[j-1], A[j-2], A[j-3], and so on, one position to the right, until the proper position for key (which has the value that started out in A[j]) is found
• At that point, the value of key is placed into this position.

23
Loop Invariant for Insertion Sort
• Termination:
• The outer for loop ends when j = n + 1 ⇒ j-1 = n
• Replace n with j-1 in the loop invariant:
• the subarray A[1 . . n] consists of the elements originally in A[1 . .
n], but in sorted order


• The entire array is sorted!

Invariant: at the start of the for loop the elements in A[1 . . j-1]
are in sorted order
24
Analysis of Insertion Sort
INSERTION-SORT(A)
    for j ← 2 to n
        do key ← A[j]
            ▹ Insert A[j] into the sorted sequence A[1 . . j-1]
            i ← j - 1
            while i > 0 and A[i] > key
                do A[i + 1] ← A[i]
                   i ← i - 1
            A[i + 1] ← key

tj: # of times the while statement is executed at iteration j
8
Best Case Analysis
• The array is already sorted: "while i > 0 and A[i] > key"
• A[i] ≤ key upon the first time the while loop test is run (when i = j-1)
• tj = 1
• T(n) = c1n + c2(n-1) + c4(n-1) + c5(n-1) + c8(n-1)
       = (c1 + c2 + c4 + c5 + c8)n - (c2 + c4 + c5 + c8)
       = an + b = Θ(n)
Worst Case Analysis
• The array is in reverse sorted order: "while i > 0 and A[i] > key"
• Always A[i] > key in the while loop test
• Have to compare key with all elements to the left of the j-th position
  ⇒ compare with j-1 elements ⇒ tj = j

Using Σ(j=1..n) j = n(n+1)/2:
  Σ(j=2..n) j = n(n+1)/2 - 1
  Σ(j=2..n) (j-1) = n(n-1)/2

So T(n) contains an n² term: T(n) = an² + bn + c = Θ(n²)
28
Comparisons and Exchanges in Insertion Sort

INSERTION-SORT(A)                                           cost  times
for j ← 2 to n                                               c1    n
    do key ← A[j]                                            c2    n-1
        ▹ Insert A[j] into the sorted sequence A[1 . . j-1]  0     n-1
        i ← j - 1                                            c4    n-1
        while i > 0 and A[i] > key                           c5    Σ(j=2..n) tj        ≈ n²/2 comparisons
            do A[i + 1] ← A[i]                               c6    Σ(j=2..n) (tj - 1)
               i ← i - 1                                     c7    Σ(j=2..n) (tj - 1)  ≈ n²/2 exchanges
        A[i + 1] ← key                                       c8    n-1

28
Analysis of Insertion Sort

29
Performance Analysis of algorithms
• The theoretical study of computer-program
performance and resource usage.

Complexity
  Time: compilation time, run time
  Space: fixed part, variable part
30
Time Complexity
• Amount of computer time required for completion
• The amount of time needed to run a program is the compilation time plus the run time.
• Compilation time does not depend upon instance characteristics

31
Running time
• The running time depends on the input: an already
sorted sequence is easier to sort.
• Parameterize the running time by the size of the input,
since short sequences are easier to sort than long
ones.
•Generally, we seek upper bounds on the running time,
because everybody likes a guarantee.

32
Kinds of analyses
• Worst-case: (usually)
• T(n) =maximum time of algorithm on any input of size n.

• Average-case: (sometimes)
• T(n) =expected time of algorithm over all inputs of size n.
• Need assumption of statistical distribution of inputs.

• Best-case:
• Cheat with a slow algorithm that works fast on some input.

33
Space Complexity
• An estimate of the space required for an algorithm to complete the task

• Amount of memory that is needed to run the program

• Fixed part: space requirement for code, variables, constants
• Variable part: space requirement based on instance characteristics, e.g. size of variables, stack space

• S(P) = C + Sp
  P - problem, C - constant, Sp - variable space
34
Space Complexity Example

Algorithm Area(r)
{
    a = pi * r * r;
    return(a);
}
C = 3, Sp = 0
S(P) = 3

Algorithm Sum(a, n)
{
    s = 0;
    for i = 0 to n do
        s = s + a[i];
    return(s);
}
C = 3, Sp = n
S(P) >= (3 + n)
35
Space Complexity Example
Algorithm RSum(a, n)
{
    if n <= 0 then
        return 0;
    else
        return (RSum(a, n-1) + a[n]);
}

• Instance is characterized by n
• Sp includes space for formal parameters, local variables and the return address
• Each call of RSum() requires at least 3 words (variables)
• Depth of recursion: (n+1)
• S(P) >= 3(n+1)
36
Algorithm for factorial of number
Algorithm Factorial(n)
// Input: Any number n
// Output: n! = n × (n-1) × (n-2) × … × 2 × 1
start
    If (n == 1) then
        return 1
    Else
        return (n * Factorial(n-1))
end

37
Ideal Solution
• Express running time as a function of the input size n (i.e., f(n)).
• Compare different functions corresponding to running times.
• Such an analysis is independent of machine time, programming style, etc.

38
Asymptotic Analysis
• To compare two algorithms with running times f(n) and g(n), we need a rough measure that characterizes how fast each function grows.
• Hint: use rate of growth
• Compare functions in the limit, that is, asymptotically! (i.e., for large values of n)

39
Asymptotic Notation
• O notation: asymptotic "less than": f(n) = O(g(n)) means f(n) "≤" g(n)

• Ω notation: asymptotic "greater than": f(n) = Ω(g(n)) means f(n) "≥" g(n)

• Θ notation: asymptotic "equality": f(n) = Θ(g(n)) means f(n) "=" g(n)

40
Asymptotic notations
• O-notation

O(g(n)) is the set of functions with smaller or same order of growth as g(n):
O(g(n)) = { f(n) : there exist positive constants c and n0
            such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

46
Big-O Notation
• We say fA(n) = 30n + 8 is order n, or O(n): it is, at most, roughly proportional to n.
• fB(n) = n² + 1 is order n², or O(n²): it is, at most, roughly proportional to n².
• In general, a quadratic-order function eventually grows faster than any linear function.

42
Visualizing Orders of Growth
• On a graph, as you go to the right, the faster-growing function eventually becomes larger: fB(n) = n² + 1 eventually exceeds fA(n) = 30n + 8.
(plot of function value vs. increasing n omitted)

43
More Examples

• n4 + 100n2 + 10n + 50 is O(n4)
• 10n3 + 2n2 is O(n3)
• n3 - n2 is O(n3)
• constants
• 10 is O(1)
• 1273 is O(1)

44
Asymptotic notations (cont.)
• Ω-notation

Ω(g(n)) is the set of functions with larger or same order of growth as g(n):
Ω(g(n)) = { f(n) : there exist positive constants c and n0
            such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }

51
Asymptotic notations (cont.)

Θ-notation
Θ(g(n)) is the set of functions with the same order of growth as g(n):
Θ(g(n)) = { f(n) : there exist positive constants c1, c2 and n0
            such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }

46
Common orders of magnitude

O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(n³) < O(2ⁿ) < O(n!)
(growth-rate tables omitted)
48
Simple rules for O-notation
Theorem: Let f(n) = a_m n^m + a_(m-1) n^(m-1) + … + a_1 n + a_0 with a_m ≠ 0; then f(n) = O(n^m)
1. Constant terms are expressed as O(1)
• Eg. O(c) = O(1)

2. Multiplicative constants are omitted


• Eg. O(c . n) = c. O(n) = O(n)

3. Addition is performed by taking the maximum


• Eg. O(T1) + O(T2) = O(T1+ T2 ) = max(O(T1), O(T2))
• T1= n and T2= n2 then O(n2)

4. Multiplication is not changed but often written more compactly


• Eg. O(T1) . O(T2) = O(T1 . T2 )

49
Simple rules for O-notation
If f(n) = 3n² + 10n + 10, then
O(T(n)) = O(3n² + 10n + 10) = O(3n²) = O(n²)
Let n = 10:

Running time share of 3n²: 3(10)² / (3(10)² + 10(10) + 10) = 73.2%
                      10n: 10(10) / (3(10)² + 10(10) + 10) = 24.4%
                       10:     10 / (3(10)² + 10(10) + 10) = 2.4%

50
Simple rules for O-notation
If f(n) = 3n² + 10n + 10, then
O(T(n)) = O(3n² + 10n + 10) = O(3n²) = O(n²)
Let n = 100:

Running time share of 3n²: 3(100)² / (3(100)² + 10(100) + 10) = 96.7%
                      10n: 10(100) / (3(100)² + 10(100) + 10) = 3.2%
                       10:      10 / (3(100)² + 10(100) + 10) ≈ 0.03%

51
Properties
• Theorem:
f(n) = Θ(g(n)) ⇔ f = O(g(n)) and f = Ω(g(n))
• Transitivity:
• f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n))
• Same for O and Ω
• Reflexivity:
• f(n) = Θ(f(n))
• Same for O and Ω
• Symmetry:
• f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
• Transpose symmetry:
• f(n) = O(g(n)) if and only if g(n) = Ω(f(n))

65
Examples
•Show that f(n) = 3n+2 is O(n).

•Show ∃c,n0: f(n) ≤ c g(n) , ∀n>n0 .

• f(n) ≤ c g(n)
• 3n+2 <= c . g(n)
• 3n+2 <= 3n + 2n for all n>=1
• 3n+2 <= 5n
• So c =5 and g(n) = n, n0 = 1

53
Examples

• Show that f(n) = 3n+2 is Ω(n).

• Show ∃c,n0: c g(n) ≤ f(n), ∀n>n0.

• c g(n) ≤ f(n)
• c · g(n) ≤ 3n+2
• 3n ≤ 3n + 2 for all n ≥ 1
• So f(n) = Ω(n) for c = 3 and n0 = 1

54
Examples
•Show that f(n) =3n+2 is θ(n).
• Show ∃c1, c2,n0: c1 g(n) ≤ f(n) ≤ c2 g(n) , ∀n>n0.

• c1 g(n) ≤ f(n) ≤ c2 g(n)


• c1 g(n) ≤ 3n+2 ≤ c2 g(n)
• 3n ≤ 3n + 2 ≤ 5n for all n ≥ 1
• So f(n) = θ(n) for c1 = 3, c2 = 5 and n0 = 1

55
Examples
•Show that T(n) = 10n2+2n+1 is O(n2).
•Show ∃c,n0: f(n) ≤ c g(n) , ∀n>n0 .

• f(n) ≤ c g(n)
• 10n2+2n+1 <= c . g(n)
• 10n2+2n+1 <= 10n2+2n2+n2 for all n>=1
• 10n2+2n+1 <= 13n2
• So c =13 and g(n) = n2, n0 = 1

56
Examples
• Show that T(n) = 10n² + 2n + 1 is Ω(n²).
•Show ∃c,n0: c g(n) ≤ f(n) , ∀n>n0 .

• c g(n) ≤ f(n)
• c . g(n) <= 10n2+2n+1
• 10n2 <= 10n2+2n+1 for all n>=1
• So Ω (g(n)) = Ω (n2) for c =10 and n0 = 1

57
Examples
•Show that f(n) = 10n2+2n+1 is θ(n2).
• Show ∃c1, c2,n0: c1 g(n) ≤ f(n) ≤ c2 g(n) , ∀n>n0.

• c1 g(n) ≤ f(n) ≤ c2 g(n)


• c1 g(n) ≤ 10n2+2n+1 ≤ c2 g(n)
• 10n2 ≤10n2+2n+1≤ 13n2 for all n>=1
• So θ (g(n)) = θ(n2) for c1 =10, c2 =13 and n0 = 1

58
Mathematical Induction
• A powerful, rigorous technique for proving that a
statement S(n) is true for every natural number n, no
matter how large.
• Suppose
• S(k) is true for fixed constant k
• Often k = 0
• S(n) ⇒ S(n+1) for all n >= k
• Then S(n) is true for all n >= k

59
Proof By Induction
• Claim: S(n) is true for all n >= k
• Basis:
• Show formula is true when n = k

• Inductive hypothesis:
• Assume formula is true for an arbitrary n

• Step:
• Show that formula is then true for n+1

74
Example
• Prove that: 2n + 1 ≤ 2^n for all n ≥ 3
• Basis step:
  • n = 3: 2·3 + 1 ≤ 2³ ⇔ 7 ≤ 8  TRUE
• Inductive step:
  • Assume the inequality is true for n, and prove it for (n+1):
    2n + 1 ≤ 2^n; must prove: 2(n + 1) + 1 ≤ 2^(n+1)
    2(n + 1) + 1 = (2n + 1) + 2 ≤ 2^n + 2 ≤ 2^n + 2^n = 2^(n+1), since 2 ≤ 2^n for n ≥ 1

75
Induction Example:
•Prove 1 + 2 + 3 + … + n = n(n+1) / 2
• Basis:
• If n = 0, then 0 = 0(0+1) / 2
• Inductive hypothesis:
• Assume 1 + 2 + 3 + … + n = n(n+1) / 2
• Step (show true for n+1):
1 + 2 + … + n + (n+1) = (1 + 2 + … + n) + (n+1)
  = n(n+1)/2 + (n+1) = [n(n+1) + 2(n+1)]/2
  = (n+1)(n+2)/2 = (n+1)((n+1) + 1)/2
62
Induction Example:

•Prove a0 + a1 + … + an = (an+1 - 1)/(a - 1) for all


a≠1
• Basis: show that a0 = (a0+1 - 1)/(a - 1)
a0 = 1 = (a1 - 1)/(a - 1)
• Inductive hypothesis:
• Assume a0 + a1 + … + an = (an+1 - 1)/(a - 1)
• Step (show true for n+1):
a0 + a1 + … + an+1 = a0 + a1 + … + an + an+1
= (an+1 - 1)/(a - 1) + an+1 = (an+1+1 - 1)/(a - 1)

63
Example
• Associate a "cost" with each statement.
• Find the "total cost" by finding the total number of times each statement is executed.

Algorithm 1              Cost
arr[0] = 0;              c1
arr[1] = 0;              c1
arr[2] = 0;              c1
...
arr[N-1] = 0;            c1
-----------
T(n) = c1 + c1 + ... + c1 = c1 × N

Algorithm 2              Cost
for(i=0; i<N; i++)       c2
    arr[i] = 0;          c1
-------------
T(n) = (N+1) × c2 + N × c1 = (c2 + c1) × N + c2
64
Another Example
• Algorithm 3 Cost
sum = 0; c1
for(i=0; i<N; i++) c2
for(j=0; j<N; j++) c2
sum += arr[i][j]; c3

T(n)= c1 + c2 x (N+1) + c2 x N x (N+1) + c3 x N2

65
Back to Our Example
Algorithm 1              Cost
arr[0] = 0;              c1
arr[1] = 0;              c1
arr[2] = 0;              c1
...
arr[N-1] = 0;            c1

T(n) = c1 + c1 + ... + c1 = c1 × N

Algorithm 2              Cost
for(i=0; i<N; i++)       c2
    arr[i] = 0;          c1

T(n) = (N+1) × c2 + N × c1 = (c2 + c1) × N + c2

• Both algorithms are of the same order: O(N)

66
Example (cont’d)

Algorithm 3 Cost
sum = 0; c1
for(i=0; i<N; i++) c2
for(j=0; j<N; j++) c2
sum += arr[i][j]; c3

T(n)= c1 + c2 x (N+1) + c2 x N x (N+1) + c3 x N2 = O(N2)

67
Example (cont’d)

Algorithm 4
A()
{
    i = 1, s = 1;
    while (s <= N)
    {
        i++;
        s = s + i;
    }
}

• s: 1 3 6 10 15 … ?
• i: 1 2 3 4 5 … k … n
• After k iterations: s = 1 + 2 + 3 + … + k = k(k+1)/2
• The loop stops when k(k+1)/2 > n, i.e. (k² + k)/2 > n, so roughly k² > n
• k ≈ √n ⇒ O(√n)
68
Example (cont’d)

Algorithm 5
A()
{
    for (i=1; i*i < N; i++)
    {
        print(i);
    }
}

• The loop stops when i² ≥ N, i.e. i ≈ √N ⇒ O(√N)

69
Example (cont’d)

Algorithm 6
A()
{
    for (i=1; i<N; i=i*2)
    {
        print(i);
    }
}

• i: 1 2 4 8 16 … N, i.e. 2⁰, 2¹, 2², 2³, …, 2^k
• The loop stops when 2^k = n ⇒ k = log₂ n ⇒ O(log n)

70
Example (cont'd)
Algorithm 7
A()
{
    for (i=1; i <= N; i++)
        for (j=1; j <= i*i; j++)
            for (k=1; k <= n/2; k++)
                print(k);
}

• i: 1 2 3 … n
• j: runs i² times: 1, 4, 9, …, n²
• k: n/2 times each: (n/2)·1, (n/2)·4, (n/2)·9, …, (n/2)·n²
• T(n) = (n/2)(1 + 4 + 9 + … + n²) = (n/2)(1² + 2² + 3² + … + n²)
• T(n) = (n/2) · n(n+1)(2n+1)/6
• O(n⁴)
71
Example (cont’d)
Algorithm 8
A()
{
    for (i=1; i <= n; i++)
        for (j=1; j <= i; j++)
            for (k=1; k <= 100; k++)
                print(k);
}

• i: 1 2 3 … n
• j: runs i times for each i
• k: 100 times each ⇒ 100·1, 100·2, 100·3, …, 100·n
• T(n) = 100(1 + 2 + 3 + … + n) = 100 · n(n+1)/2
• O(n²)

72
Example (cont’d)
Algorithm 9
A()
{
    for (i=n/2; i <= n; i++)        ---------> n/2
        for (j=1; j <= n/2; j++)    ---------> n/2
            for (k=1; k <= n; k=k*2) ---------> log(n)
                print(k);
}

T(n) = (n/2) · (n/2) · log n
O(n² log n)

73
Reference
• T. H. Cormen, C. E. Leiserson, R. L. Rivest, and C. Stein, "Introduction to Algorithms", Fourth Edition, PHI Publication.

88
