DAA Module1
Algorithms(CSE2007)
MODULE 1
Fundamentals of Algorithmic
problem solving
What is an algorithm?
The word "algorithm" comes from the name of the Persian author
Abu Ja'far Mohammed ibn Musa al-Khowarizmi,
who wrote a textbook on mathematics.
An algorithm is an unambiguous step-by-step procedure that
solves a given problem in a finite number of steps, accepting a
set of legitimate inputs and producing the desired output.
After producing the result, the algorithm should terminate.
Notion of Algorithm
[Figure: a problem is solved by an algorithm, which a computer executes on an input to produce the output]
Importance of writing algorithms
Writing an algorithm is like preparing a plan to solve
the problem.
An algorithm gives the method of solution, not the answer itself.
Reasons for writing algorithms:
We can save resources such as time, human effort, and cost
Debugging becomes easier
Since algorithms are written in pseudocode, any technical person
can understand them easily
They can be used as a design document
Once an algorithm is written and clearly understood, it can easily be
converted into an executable program in any programming
language
Features/Properties of an Algorithm
An algorithm should have the following standard properties: input, output, definiteness (every step is unambiguous), finiteness (it terminates after a finite number of steps), and effectiveness (every step is basic enough to be carried out exactly).
Algorithm Design And Analysis Process
Analysis of algorithms
• How good is the algorithm?
• correctness
• time efficiency
• space efficiency
Important problem types
• sorting
• searching
• string processing
• graph problems
Sorting (I)
• Rearrange the items of a given list in ascending order.
• Input: A sequence of n numbers ⟨a₁, a₂, …, aₙ⟩
• Output: A reordering ⟨a′₁, a′₂, …, a′ₙ⟩ of the input
sequence such that a′₁ ≤ a′₂ ≤ … ≤ a′ₙ.
• Why sorting?
• Help searching
• Algorithms often use sorting as a key subroutine.
• Sorting key
• A specially chosen piece of information used to guide
sorting. E.g., sort student records by names.
Sorting (II)
• Examples of sorting algorithms
• Selection sort
• Bubble sort
• Insertion sort
• Merge sort
• Heap sort …
• Sorting algorithm complexity is usually evaluated by the number
of key comparisons.
• Two properties
• Stability: A sorting algorithm is called stable if it preserves the
relative order of any two equal elements in its input.
• In place: A sorting algorithm is in place if it does not require
extra memory, except possibly for a few memory units.
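Stability is easy to see in practice because Python's built-in sort is stable; a minimal sketch (the record names and keys are invented for illustration):

```python
# Stability demo: Python's built-in sort (Timsort) is stable, so records
# with equal keys keep their original relative order.
records = [("Bob", 2), ("Ann", 1), ("Cal", 2), ("Dee", 1)]

# Sort by the numeric key only; names with equal keys stay in input order.
by_key = sorted(records, key=lambda r: r[1])
# Within key 1, Ann still precedes Dee; within key 2, Bob still precedes Cal.
```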
Searching
• Find a given value, called a search key, in a given set.
• Examples of searching algorithms
• Sequential search
• Binary search …
Binary search:
Input: sorted array a[i..j] and key x
while i ≤ j do
    m ← ⌊(i + j) / 2⌋
    if x = a[m] then return m
    else if x < a[m] then j ← m − 1
    else i ← m + 1
return "not found"
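The binary search pseudocode translates directly; a minimal sketch in Python:

```python
def binary_search(a, x):
    """Return an index m with a[m] == x in sorted list a, or -1 if absent."""
    i, j = 0, len(a) - 1
    while i <= j:
        m = (i + j) // 2          # middle index
        if a[m] == x:
            return m
        elif x < a[m]:
            j = m - 1             # key must be in the left half
        else:
            i = m + 1             # key must be in the right half
    return -1
```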
Graph Problems
• Informal definition
• A graph is a collection of points called vertices, some of
which are connected by line segments called edges.
• Related concepts covered below
• Adjacency matrix
• Cost (weighted) adjacency matrix
• Spanning tree
Graphs
• Formal definition
• A graph G = <V, E> is defined by a pair of two sets: a
finite set V of items called vertices and a set E of
vertex pairs called edges.
• Undirected and directed graphs (digraphs).
• Complete, dense, and sparse graphs
• A graph with every pair of its vertices connected by
an edge is called complete, K|V|
[Figure: an example graph on vertices 1–4]
Graph Representation
• Adjacency matrix
• n x n boolean matrix if |V| is n.
• The element on the ith row and jth column is 1 if
there’s an edge from ith vertex to the jth vertex;
otherwise 0.
• The adjacency matrix of an undirected graph is
symmetric.
• Adjacency linked lists
• A collection of linked lists, one for each vertex, that
contain all the vertices adjacent to the list’s vertex.
Example (digraph with edges 1→2, 1→3, 1→4, 2→4, 3→4):
Adjacency matrix:      Adjacency lists:
0 1 1 1                1 → 2, 3, 4
0 0 0 1                2 → 4
0 0 0 1                3 → 4
0 0 0 0                4 → (none)
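Both representations can be built in a few lines; a sketch, assuming the digraph with edges 1→2, 1→3, 1→4, 2→4, 3→4 and vertices numbered 1–4:

```python
# Digraph from the example: edges 1->2, 1->3, 1->4, 2->4, 3->4.
edges = [(1, 2), (1, 3), (1, 4), (2, 4), (3, 4)]
n = 4

# Adjacency matrix: entry [u-1][v-1] is 1 iff there is an edge u -> v.
matrix = [[0] * n for _ in range(n)]
for u, v in edges:
    matrix[u - 1][v - 1] = 1

# Adjacency lists: for each vertex, the list of its out-neighbours.
adj = {u: [] for u in range(1, n + 1)}
for u, v in edges:
    adj[u].append(v)
```

Note that the matrix is not symmetric because the example graph is directed.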
Weighted Graphs
• Weighted graphs
• Graphs or digraphs with numbers assigned to the
edges.
[Figure: a weighted graph on vertices 1–4 with edge weights 5, 6, 7, 8, and 9]
Graph Properties -- Paths and Connectivity
• Paths
• A path from vertex u to v of a graph G is defined as a sequence
of adjacent (connected by an edge) vertices that starts with u
and ends with v.
• Simple paths: all vertices of a path are distinct.
• Path length: the number of edges, or the number of vertices
minus 1.
• Connected graphs
• A graph is said to be connected if for every pair of its vertices
u and v there is a path from u to v.
• Connected component
• A maximal connected subgraph of a given graph.
Graph Properties -- Acyclicity
• Cycle
• A simple path of a positive length that starts and
ends at the same vertex.
• Acyclic graph
• A graph without cycles
• DAG (Directed Acyclic Graph)
[Figure: an example DAG on vertices 1–4]
Algorithm design strategies
Analysis Framework
The process of finding the efficiency of an algorithm is
called analysis of the algorithm.
We can measure the efficiency of an algorithm in two ways:
1) Time efficiency/complexity
2) Space efficiency/complexity
We analyze algorithms for the following reasons:
1) To compare different algorithms for the same task
2) To predict performance in a new environment
3) To specify the range of inputs on which the algorithm works
properly
Time complexity or time efficiency
• Time complexity of an algorithm is the amount of time
the algorithm takes to run to completion.
• Factors on which time complexity of an algorithm
depends are
1) Speed of computer
2) Choice of programming language
3) Compiler used
4) Choice of algorithmic design technique
5) Number of inputs/outputs
6) Size of the inputs/outputs
Time complexity or time efficiency
• Considering the number and size of the inputs given to
the algorithm, time efficiency is normally computed
in terms of the basic operation.
• The statement/instruction that contributes the most to
the total running time is called the basic operation.
Time complexity or time efficiency
• To find the time efficiency, we need to
1) Identify the basic operation
2) Find the time taken by the basic operation to execute once
3) Find how many times this basic operation executes
• Basic operation: the operation that contributes the most towards
the running time of the algorithm
T(n) ≈ c_op · C(n)
where n is the input size, T(n) is the running time of the
algorithm, c_op is the time taken by the basic operation to
execute once, and C(n) is the total number of times the basic
operation executes.
Note: Different basic operations may cost differently!
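The count C(n) can be made concrete by instrumenting a simple algorithm; a sketch (summing a list, with the addition as the basic operation; the function name is invented for illustration):

```python
def sum_with_count(a):
    """Sum a list, counting executions of the basic operation (the addition)."""
    total, count = 0, 0
    for x in a:
        total += x     # basic operation
        count += 1
    return total, count

# For an input of size n the basic operation runs C(n) = n times,
# so the running time is roughly T(n) ~ c_op * n.
```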
Order of growth
• As the value of n (the input size) increases, the time required
for execution also increases; i.e., the behavior of the algorithm
changes as n grows. This change in behavior is called the
order of growth.
• Most important: the order of growth within a constant
multiple as n → ∞
• Example:
• How much faster will the algorithm run on a computer that is
twice as fast?
• How much longer does it take to solve a problem of double
the input size?
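The doubling question can be answered directly from the growth function; a small sketch (the helper name is invented for illustration):

```python
# Answer "how much longer on double the input size?" by comparing a
# growth function at n and at 2n.
def ratio(f, n):
    """Slowdown factor f(2n)/f(n) when the input size doubles."""
    return f(2 * n) / f(n)

n = 1000
linear = ratio(lambda m: m, n)         # 2.0: a linear algorithm takes twice as long
quadratic = ratio(lambda m: m * m, n)  # 4.0: a quadratic algorithm takes 4x as long
```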
Order of growth for a few values of n
Input size and basic operation examples
Problem: typical graph problem
Input size measure: number of vertices and/or edges
Basic operation: visiting a vertex or traversing an edge
Basic asymptotic efficiency classes
1         constant
log n     logarithmic
n         linear
n log n   n-log-n
n²        quadratic
n³        cubic
2ⁿ        exponential
n!        factorial
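To show how quickly these classes grow apart, one can tabulate them at a single input size (n = 16 chosen only for readability):

```python
import math

# Values of the basic asymptotic efficiency classes at n = 16.
n = 16
values = {
    "1": 1,
    "log n": math.log2(n),
    "n": n,
    "n log n": n * math.log2(n),
    "n^2": n ** 2,
    "n^3": n ** 3,
    "2^n": 2 ** n,
    "n!": math.factorial(n),
}
for name, v in values.items():
    print(f"{name:8} {v}")
```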
Space Complexity or Space Efficiency
• Space complexity of an algorithm is the amount of memory
the program needs to run to completion.
• Total amount of space consumed by the program is calculated by
sum of the following components
• Fixed space requirements
• Variable space requirements
• Fixed space requirements are the requirements that do not depend
on the number or size of the program's inputs and outputs.
• Program space ( instruction space to store the code)
• Data space ( space for constants, variables, structures, etc)
• Stack space ( space for parameters, local variables, return values, etc)
Fixed space requirements = Program space + Data space + Stack space
Space Complexity or Space Efficiency
Variable space requirements
In addition to the fixed space requirements, a program may need
extra space:
1) When a function uses recursion
2) For dynamically allocated arrays, structures, etc.
So the space used by a program P on a particular instance I is
S_P(I) = fixed space requirements + space used during recursion
+ space used by run-time variables
The total space of program P is given by
S(P) = c + S_P(I)
Asymptotic Notations
• Mathematical notations used to express the time
complexity of an algorithm are called asymptotic
notations
• The 3 asymptotic notations are
1) O (Big-oh), an upper bound, commonly used for the worst case
2) Ω (Big-Omega), a lower bound, commonly used for the best case
3) Θ (Big-Theta), a tight bound, commonly used for the average case
O (Big-oh) notation
• Big-oh is the formal method of expressing the upper
bound of an algorithm’s running time.
• It is a measure of the longest amount of time the algorithm
could possibly take to complete.
• "A function t(n) is said to be in O(g(n)), denoted
t(n) ∈ O(g(n)), if there exist a positive constant c and a
nonnegative integer n₀ such that
t(n) ≤ c·g(n) for all n ≥ n₀"
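The definition can be sanity-checked numerically; a sketch where the function t(n) = 3n + 5 and the constants c = 4, n₀ = 5 are invented for illustration:

```python
# Checking the Big-oh definition: t(n) = 3n + 5 is in O(n) because
# 3n + 5 <= 4n for every n >= 5.
def t(n):
    return 3 * n + 5

def g(n):
    return n

c, n0 = 4, 5
holds_from_n0 = all(t(n) <= c * g(n) for n in range(n0, 1000))
fails_below_n0 = t(4) > c * g(4)   # 17 > 16, which is why n0 is needed
```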
[Figure: Big-oh — t(n) stays below c·g(n) for all n ≥ n₀]
Ω (Big-Omega) notation
• Big-omega (Ω) is the formal method of expressing the
lower bound of an algorithm’s running time.
• It is a measure of the minimum amount of time the algorithm
could possibly take to complete.
• "A function t(n) is said to be in Ω(g(n)), denoted
t(n) ∈ Ω(g(n)), if there exist a positive constant c and a
nonnegative integer n₀ such that
t(n) ≥ c·g(n) for all n ≥ n₀"
[Figure: Big-Omega — t(n) stays above c·g(n) for all n ≥ n₀]
Θ (Big-Theta) notation
• Big-Theta (Θ) is the formal method of expressing a tight
bound on the running time, often used for the average case.
• "A function t(n) is said to be in Θ(g(n)), denoted
t(n) ∈ Θ(g(n)), if there exist positive constants c₁, c₂ and a
nonnegative integer n₀ such that
c₂·g(n) ≤ t(n) ≤ c₁·g(n) for all n ≥ n₀"
[Figure: Big-Theta — t(n) lies between c₂·g(n) and c₁·g(n) for all n ≥ n₀]
Example: Sequential search
Linear Search
/* Sequential (linear) search: return the index of key k in a[0..n-1],
   or -1 if k is not present. */
int linear_search(int a[], int n, int k)
{
    for (int i = 0; i < n; i++)
        if (a[i] == k)
            return i;          /* found: return the index */
    printf("element not found");
    return -1;
}
Mathematical Analysis of non-recursive algorithms
General Plan for Analysis of non-recursive algorithms
• Decide on a parameter (or parameters) indicating the size of the
input (identify the problem size)
• Identify the algorithm's basic operation
• Decide whether the worst, average, and best cases must be
investigated separately.
• For this, check whether the number of times the basic operation executes
depends only on the problem size.
• If it depends only on the problem size, one count covers all cases (a Θ bound)
• If it also depends on other properties of the input, the worst, average, and
best cases must be analyzed separately.
• Find the total number of times the basic operation executes and
express it as a sum
• Simplify the sum using standard formulas and rules of sum
manipulation and obtain the order of growth.
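As a worked instance of this plan (the function name is invented for illustration), consider finding the maximum element of a list, with the comparison as the basic operation:

```python
def max_element(a):
    """Return the maximum of a non-empty list and the number of
    comparisons (the basic operation) performed."""
    max_val, comparisons = a[0], 0
    for x in a[1:]:
        comparisons += 1       # basic operation: one comparison per element
        if x > max_val:
            max_val = x
    return max_val, comparisons
```

C(n) = n − 1 for every input of size n, independent of the element values, so a single count covers best, worst, and average cases: the algorithm is Θ(n).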
Example: Recursive factorial
ALGORITHM fact(n)
{
    if n == 0 then
        return 1
    else
        return n * fact(n - 1)
}
Analysis of the factorial example
The basic operation is the multiplication; let M(n) be the number of
multiplications performed by fact(n). Then M(n) = M(n−1) + 1 for
n > 0, with M(0) = 0.
Solving the recurrence for M(n)
M(n) = M(n−1) + 1, M(0) = 0
M(n) = M(n−1) + 1
     = (M(n−2) + 1) + 1 = M(n−2) + 2
     = (M(n−3) + 1) + 2 = M(n−3) + 3
     …
     = M(n−i) + i = … = M(0) + n = n
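The result M(n) = n can be checked empirically by counting the multiplications alongside the recursive factorial; a minimal sketch:

```python
def fact(n):
    """Return (n!, number of multiplications performed)."""
    if n == 0:
        return 1, 0                # base case: no multiplication
    sub, mults = fact(n - 1)
    return n * sub, mults + 1      # one multiplication per recursive step

value, m = fact(5)   # value = 120 and m = 5, matching M(n) = n
```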
Tower of Hanoi
Procedure Hanoi(disk, source, dest, aux)
IF disk == 1, THEN
move disk from source to dest
ELSE
Hanoi(disk - 1, source, aux, dest) // Step 1: move disk-1 disks to aux
move disk from source to dest // Step 2: move the largest disk
Hanoi(disk - 1, aux, dest, source) // Step 3: move disk-1 disks to dest
END IF
END Procedure
Recurrence for number of moves:
M(n) = 2M(n-1) + 1
Solving the recurrence for the number of moves
M(n) = 2M(n−1) + 1, M(1) = 1
M(n) = 2M(n−1) + 1
     = 2(2M(n−2) + 1) + 1 = 2²·M(n−2) + 2 + 1
     = 2²·M(n−2) + 2¹ + 2⁰
     = 2²·(2M(n−3) + 1) + 2¹ + 2⁰
     = 2³·M(n−3) + 2² + 2¹ + 2⁰
     …
     = 2ⁱ·M(n−i) + 2ⁱ⁻¹ + … + 2¹ + 2⁰
     (with i = n − 1)
     = 2ⁿ⁻¹·M(1) + 2ⁿ⁻² + … + 2¹ + 2⁰
     = 2ⁿ⁻¹ + 2ⁿ⁻² + … + 2¹ + 2⁰
     = 2ⁿ − 1
• M(n) = 2M(n−1) + 1, M(1) = 1
• = 2²·M(n−2) + 2¹ + 2⁰
• = 2³·M(n−3) + 2² + 2¹ + 2⁰
• = 2ⁱ·M(n−i) + 2ⁱ⁻¹ + … + 2⁰
The geometric series S = a(rⁿ − 1)/(r − 1) with a = 1, r = 2, n = i
gives S = 2ⁱ − 1, so
• = 2ⁱ·M(n−i) + 2ⁱ − 1
Setting i = n − 1:
• = 2ⁿ⁻¹·M(n−(n−1)) + 2ⁿ⁻¹ − 1
• = 2ⁿ⁻¹·M(1) + 2ⁿ⁻¹ − 1
• = 2ⁿ⁻¹ + 2ⁿ⁻¹ − 1
• = 2·2ⁿ⁻¹ − 1 = 2ⁿ − 1
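The closed form M(n) = 2ⁿ − 1 can be confirmed by counting the moves the recursive procedure actually makes; a minimal sketch (peg names are invented for illustration):

```python
def hanoi(disk, source, dest, aux, moves):
    """Move `disk` disks from source to dest via aux, recording each move."""
    if disk == 1:
        moves.append((source, dest))
    else:
        hanoi(disk - 1, source, aux, dest, moves)  # Step 1: disk-1 disks to aux
        moves.append((source, dest))               # Step 2: the largest disk
        hanoi(disk - 1, aux, dest, source, moves)  # Step 3: disk-1 disks to dest

moves = []
hanoi(4, "A", "C", "B", moves)
# len(moves) == 2**4 - 1 == 15
```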
Tree of calls for the Tower of Hanoi puzzle