ADA Lec 001-005
Algorithm
Lecture No.01
Prepared by : Afroj Alam
Course Outline
• Introduction to algorithms and their notation
• Basic algorithms and their analysis
• Asymptotic notations
• Recursion and recurrence relations
• Divide-and-conquer approach
• Sorting; Search trees
• Hashing
Course Outline (Cont.)
• Binary Search Tree
• Graph algorithms; Shortest paths; Network flow; Disjoint Sets;
Polynomial and matrix calculations;
• Greedy approach
• Dynamic programming
• String matching algorithms
• NP complete problems
What is an algorithm?
• An algorithm is a step-by-step procedure to solve a problem
• Platform independent
• Can be implemented in any procedural language
• Its simplest form (for experts) is pseudocode
• Pseudocode can be written in any language's syntax, but in this
course we will use C/C++ syntax
• Two important factors
• Space tradeoff (refers to using less memory, i.e. RAM)
• Time tradeoff (refers to using less processor time for execution)
Features of algorithm
• Complete
• Finite
• At least one output
• Zero, one, or more inputs
• Correct
• Clear
Example-1.
• Write an algorithm or pseudocode to read two numbers and
display the larger one.
Algorithm Notations
• Input Statements
• Read
• Output statement
• Write
Assignment
• Write an algorithm which reads three numbers and prints the
smallest number. Also write a C language program.
• Write an algorithm which reads an array of 10 integers and
counts the even numbers. Also write a C language program.
Analysis and designing of algorithm
Lecture No.02
Criteria for analysis of algorithms
• Why analyze algorithms?
• evaluate algorithm performance
• compare different algorithms
• What do we analyze about them?
• running time, memory usage, solution quality
• worst case and "best" case
Complexity analysis
• What is the "best" algorithm?
• You can consider an algorithm to be the best based on the
answers to the following questions:
• How "complicated" is the algorithm?
• Time Complexity
• Space Complexity
Important Point to note
• Programs depend on the operating system, machine,
compiler/interpreter used, etc.
• Analysis of algorithms compares algorithms, not programs.
• The performance of an algorithm should be measured before it is
implemented as a program in any language.
• Algorithms should be checked on different machines.
Example-1.
• Consider the following four pseudocodes (in C/C++ syntax).
• The purpose of all four pseudocodes is the same.
• We analyse them on the basis of time and space complexity.
PseudoCode-1.
main()
{
    int a,b,c;
    a=2;
    b=3;
    c=a+b;
    cout<<c;
}
Facts:
1. Three variables (6 bytes)
2. Three assignments
3. One calculation
4. One output statement
PseudoCode-2.
main()
{
    int a,b;
    a=2;
    b=3;
    cout<<a+b;
}
Facts:
1. Two variables (4 bytes)
2. Two assignments
3. One calculation
4. One output statement
PseudoCode-3.
main()
{
    int a,b,c;
    cin>>a>>b;
    c=a+b;
    cout<<c;
}
Facts:
1. Three variables (6 bytes)
2. One input statement
3. One assignment
4. One calculation
5. One output statement
PseudoCode-4.
main()
{
    int a,b;
    cin>>a>>b;
    cout<<a+b;
}
Facts:
1. Two variables (4 bytes)
2. One input statement
3. One calculation
4. One output statement
Comparison of Pseudocodes

Facts                  Pseudo 1     Pseudo 2     Pseudo 3     Pseudo 4
Variables              3 (6 bytes)  2 (4 bytes)  3 (6 bytes)  2 (4 bytes)
No. of assignments     3            2            1            0
No. of calculations    1            1            1            1
Input statements       0            0            1            1
Output statements      1            1            1            1
Which pseudocode is best, and why?
Asymptotic Analysis
To compare two algorithms with running
times f(n) and g(n), we need to find how
fast each function grows.
Hint: use rate of growth
Compare functions in the limit, that is,
asymptotically!
(i.e., for large values of n)
Functions
Algorithm can be analyzed mathematically by
functions. Therefore, the actual analysis is to
establish a relative order among functions.
Given two functions, there are usually points where
one function is smaller than the other function, so it
does not make sense to claim, for instance, f(N)
<g(N).
Thus, we compare their relative rates of growth.
[Figure: graph of the running times f(N) and g(N) of linear and binary search]
Functions
Although 1000 N is larger than N^2 for small values
of N, N^2 grows at a faster rate, and thus N^2 will
eventually be the larger function.
There may be some points in which f(N) > g(N).
Rate/order of Growth
Consider the example of buying elephants and
goldfish:
Cost: cost_of_elephants + cost_of_goldfish
Cost ~ cost_of_elephants (approximation)
The low-order terms in a function are relatively
unimportant for large n:
n^4 + 100n^2 + 10n + 50 ~ n^4
Rate/order of Growth
Let f(n) = 3n^2 + 5n + 6.
Important points to note when you are finding the
order of growth:
Remove all constants from f(n), such as
the 6 in the f(n) above.
Remove low-order terms, such as
the 5n in the f(n) above.
Remove the coefficient of the highest-order term, such as
the 3 in the f(n) above, leaving n^2.
Rate of Growth ≡ Asymptotic Analysis
Example
Suppose you are designing a web site to process
user data (e.g., financial records).
Suppose program A takes fA(n) = 30n + 8
microseconds to process any n records, while
program B takes fB(n) = n^2 + 1 microseconds to
process the n records.
Which program would you choose, knowing
you'll want to support millions of users?
Visualizing Orders of Growth
On a graph, as you go to the right (increasing n), a faster-growing
function eventually becomes larger: fB(n) = n^2 + 1 eventually
exceeds fA(n) = 30n + 8.
[Figure: values of fA(n) = 30n + 8 and fB(n) = n^2 + 1 as n increases]
Common asymptotic notation or Big-O Notation
We say fA(n) = 30n + 8 is order n, or O(n).
It is, at most, roughly proportional to n.
fB(n) = n^2 + 1 is order n^2, or O(n^2). It is, at most,
roughly proportional to n^2.
In general, an O(n^2) algorithm will be slower than
an O(n) algorithm.
Warning: an O(n^2) function will grow faster than
an O(n) function.
Examples (Order of Growth)
Common orders of magnitude
Constant Time Algorithm
If the time taken by the algorithm is O(C), then
the algorithm will run in a constant time, C,
regardless of the size of its input.
An example of such an algorithm is one which
would accept an input and then halt without
doing anything else.
Array: accessing any element
Fixed-size stack: push and pop methods
Fixed-size queue: enqueue and dequeue
methods
Linear Time Algorithm
If the running time of the algorithm grows linearly
with the size of the input, the algorithm is said to
be "big-O of n", written O(n), and the algorithm is
called linear.
In a linear algorithm, doubling the input size
makes the time increase approximately
twofold (or less).
Examples of such algorithms are linear search,
traversal, and finding the minimum.
Quadratic Time Algorithm
If the largest term in a formula is no more than
a constant times n^2, then the algorithm is
said to be "big-O of n^2", written O(n^2), and
the algorithm is called quadratic.
In a quadratic algorithm, doubling the input
size makes the number of operations increase
approximately fourfold (or less).
Logarithmic Time Algorithm
An algorithm whose running time grows
logarithmically with the size of the input. An
example of an algorithm that runs in logarithmic
time is the binary search algorithm.
If the largest term in the formula is a constant
times a logarithm of n, then the algorithm is
"big-O of the logarithm of n," written O(log n),
and the algorithm is called logarithmic.
Exponential Time Algorithm
Exponential functions grow very quickly, so
exponential algorithms are only useful for
small problems.
Exponential functions are often grouped together
because they all grow too fast to be practical,
although strictly speaking the base matters:
3^n grows faster than 2^n.
Why is exponential time bad?
Lecture No.04
Running Time of Algorithms
• Running time
• depends on input size n
• size of an array
• # of elements in a matrix
Analysis of non-recursive algorithms
• for loops
• The running time of a for loop is at most the running time of the
statements inside the for loop (including tests) times the number of
iterations.
• Example:
for(i=0; i<n; i++)
    a[i]=0;
Let the running time of the basic operations be a constant c,
and let the loop be iterated n times. Then the total running time is
(number of basic steps in the loop body) * (number of iterations) = n*c.
Nested for Loops
• The total running time of a statement inside nested for loops is the
running time of the statement multiplied by the product of the sizes of
all the for loops.
• Example:
for(i=0; i<n; i++)
    for(j=0; j<n; j++)
        K++;
Let the running time of the basic operations be a constant c.
The outer loop (on i) runs n times and the inner loop (on j) runs
n times, so the total running time is n*n*c = n^2*c.
Consecutive for Loops
• The total running time of consecutive loops is the sum of the
running times of all the consecutive loops.
• Example:
for(i=0; i<n; i++)
    K++;
for(j=0; j<m; j++)
    K++;
Let the running time of the basic operations be a constant c.
The first loop (on i) runs n times and the second loop (on j) runs
m times, so the total running time is n*c + m*c.
Recursive Analysis
Recursive Function
• The recursive function is
• a kind of function that calls itself, or
• a function that is part of a cycle in the sequence of function calls.
f1 → f1 (direct recursion), or f1 → f2 → … → fn → f1 (a cycle of calls)
Problems Suitable for Recursive Functions
• Some problems have a straightforward solution for simple cases.
• The other cases of the problem can be redefined in terms of
problems that are closer to the simple cases.
• The problem can be reduced entirely to simple cases by calling
the recursive function.

if this is a simple case
    solve it
else
    redefine the problem using recursion
Splitting a Problem into Smaller Problems
[Figure: a problem split into smaller subproblems through recursive calls]
Tracing Recursive Functions
factorial(4) = 4 * (3 * (2 * (1 * 1)))
             = 4 * (3 * (2 * 1))
             = 4 * (3 * 2)
             = 4 * 6
             = 24
(substitution phase)
The complete recursive factorial example

#include <stdio.h>

int factorial(int n);  /* prototype so main can call it */

int main(void)
{
    int num, fact;
    printf("Enter an integer between 0 and 7> ");
    scanf("%d", &num);
    fact = factorial(num);
    printf("The factorial of %d is %d\n", num, fact);
    return 0;
}

int factorial(int n)
{
    if (n == 0)                       /* base case */
        return 1;
    else
        return n * factorial(n - 1);  /* recursive case */
}
Example 2: Multiplication

Expansion phase:
multiply(5,4) = 5 + multiply(5, 3)
              = 5 + (5 + multiply(5, 2))
              = 5 + (5 + (5 + multiply(5, 1)))
Substitution phase:
              = 5 + (5 + (5 + 5))
              = 5 + (5 + 10)
              = 5 + 15
              = 20
Analysis of Running time of algorithms
(Best, Average and Worst cases)
Types of Algorithm Complexity
• Worst-case complexity:
• the function defined by the maximum number of steps taken on any
instance (array) of size n
• Best-case complexity:
• the function defined by the minimum number of steps taken on any
instance (array) of size n
• Average-case complexity:
• the function defined by the average number of steps taken on any
instance (array) of size n
Types of Algorithm Complexity
(Example: Linear Search)
5 7 2 6 9 12 1
• Worst-case complexity:
• You want to search for 1 in the above array, which is at location N
• You need N steps
• Best-case complexity:
• You want to search for 5 in the above array, which is at location 1
• You need 1 step
• Average-case complexity:
• You want to search for 2 or 9, etc., in the above array
• You need 3 steps for 2 and 5 steps for 9
Best, Worst, and Average Case Complexity
[Figure: number of steps vs. input size N, with the worst-case complexity
curve above the average-case curve, and the best-case curve lowest]
Assignment
• Exercise-1.
• Write a pseudocode which finds the sum of two 3*3 matrices, and then calculate its
running time.
• Exercise-2.
• Write a pseudocode which reads a number N and prints whether it is prime or not.
After that, calculate the run-time complexity.
Searching Algorithms
Lecture Objectives
Learn how to implement the sequential search algorithm
Best case: 1 comparison
Average number of comparisons = (1 + N) / 2
Binary Search Algorithm
Can only be performed on a sorted list !!!
[Figure: binary search traces for key = 89 and key = 34]
Binary Search Algorithm (Cont’d)
[Figure: binary search trace for key = 22]
Performance of Binary Search Algorithm
• Every iteration of the while loop makes two key (that is,
item) comparisons
Performance of Binary Search Algorithm (Cont’d)