Algorithm Short Notes
Algorithm
1 ASYMPTOTIC NOTATION
Note:
To find the time complexity of an algorithm, locate the loops and count the dominant (largest) one.
Space complexity depends on two things: the input size and any extra space used (stack space, linked-list nodes, etc.).
Worst Case
The input class for which the algorithm does maximum work and hence takes maximum time.
Best Case
The input class for which the algorithm does minimum work and hence takes minimum time.
Average Case
The average case is calculated over all inputs, from the best case to the worst case.
1.7.2 Ω-Notation
f(n) = Ω(g(n)) if f(n) ≥ C·g(n) for all n ≥ n₀ (C > 0)
Example: 3n = Ω(2n)
1.7.3 - Notation
If f(n) ≤ g(n)
And
f(n) ≥ g(n)
f(n) = g(n)
∴ f(n) = (g(n))
Example:
f(n) = 2n², g(n) = n + 10
Here f(n) > g(n),
so f(n) = Ω(g(n)), or g(n) = O(f(n)).
= 1 + (2)^(1/2) + (3)^(1/2) + ... + (n)^(1/2)
≈ (2n^(3/2) − 2)/3
= O(n^1.5)
= O(n√n)
Example 3. Arrange the following functions in increasing order.
f1 = n log n, f2 = n, f3 = 2^n, f4 = 3^n, f5 = n!, f6 = n^n, f7 = log n, f8 = 100·n log n
→ f7 < f2 < f1 = f8 < f3 < f4 < f5 < f6
Question.
Which of the following is TRUE?
(1) 2^(log₂ n) = O(n²)  TRUE
(2) n²·2^(3 log₂ n) = O(n⁵)  TRUE
(3) 2^n = O(2^(2n))  TRUE
(4) log n = O(log log n)  FALSE
(5) log log n = O(n log n)  TRUE
Solution:
(1) 2^(log₂ n) = n^(log₂ 2) = n
∴ n = O(n²)
(2) n²·2^(3 log₂ n) = n²·n^(3 log₂ 2)
= n²·n³
= n⁵
∴ n⁵ = O(n⁵)
(3) 2^(2n) = 2^n·2^n
so 2^n ≤ 2^(2n)
∴ 2^n = O(2^(2n))  TRUE
Solution.
Here: 1 multiplication, 1 division, 1 addition.
∴ O(1) [no loops, no recursion]
Solution.
i = 1, 2, 2², 2³, ..., 2^k
The loop runs while 2^k ≤ n:
k·log 2 ≤ log n
k ≤ log n / log 2
∴ k ≤ log₂ n, i.e., k = log₂ n
So the loop body executes log₂ n + 1 times, and the complexity is O(log₂ n).
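The doubling-loop count above can be checked directly. The sketch below (the helper name `count_doublings` is my own) counts body executions of the loop and matches ⌊log₂ n⌋ + 1:

```c
/* Count how many times the body of: for (i = 1; i <= n; i = i * 2)
 * executes. Each pass doubles i, so the count is floor(log2 n) + 1. */
int count_doublings(long n) {
    int count = 0;
    for (long i = 1; i <= n; i = i * 2)
        count++;
    return count;
}
```

For n = 10 the loop visits i = 1, 2, 4, 8, so the body runs 4 = ⌊log₂ 10⌋ + 1 times.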
Example 2:
for (i = 1; i <= n; i = i*3)
    printf("Aaveg");
Solution.
So this executes log₃ n + 1 times, and the complexity is O(log₃ n).
➢ i = 1→2→4→8→16→...→n (multiplying loop)
i = n→n/2→n/4→n/8→...→1 (dividing loop)
Both patterns run O(log₂ n) times.
Example 3:
for (i = 1; i <= n; i++)
{
    for (j = 1; j <= 10; j++)
    {
        printf("Dhananjay");
    }
}
Solution.
This executes 10·n times, and the complexity is O(n).
Example 4:
for (i = 1; i <= n; i = i*3)
    for (j = 1; j <= n; j++)
        printf("Prapti");
Solution.
The body executes n·(log₃ n + 1) times in total, and the complexity is O(n·log₃ n).
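The n·(log₃ n + 1) count for the nested loop can be sketched as follows (the helper name `nested_count` is my own):

```c
/* Count body executions of:
 *   for (i = 1; i <= n; i = i * 3)
 *       for (j = 1; j <= n; j++) ...
 * The outer loop runs floor(log3 n) + 1 times; each run costs n,
 * so the total is n * (floor(log3 n) + 1). */
long nested_count(int n) {
    long count = 0;
    for (long i = 1; i <= n; i = i * 3)
        for (int j = 1; j <= n; j++)
            count++;
    return count;
}
```

For n = 9 the outer loop runs for i = 1, 3, 9 (three passes), giving 9·3 = 27 body executions.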
T(0) = C (constant)
T(n) = C;            n = 0
T(n) = T(n − 1) + C; n > 0
Example 2:
void fun(int n)                          T(n)
{
    if (n > 0)                           C1 time
    {
        for (i = 1; i <= n; i = i + 1)   n times
            printf("Hello");
        fun(n - 1);                      T(n - 1)
    }
}
Solution.
T(n) = C1 + n + T(n − 1)
     = T(n − 1) + n;  n > 0
T(0) = C;             n = 0
Example 3:
void fun(int n)                          T(n)
{
    if (n > 0)                           C1
    {
        for (i = 1; i <= n; i = i*2)     log₂ n times
            printf("Divyajyoti");
        fun(n - 1);                      T(n - 1)
    }
}
Example (1)
T(n) = T(n − 1) + C, T(1) = C
     = [T(n − 2) + C] + C
     = T(n − 2) + 2C
     = T(n − 3) + 3C
     ...
T(n) = T(n − k) + kC
Put n − k = 1:
T(n) = T(1) + (n − 1)C
     = C + (n − 1)C
T(n) = O(n)
Example (2)
T(n) = T(n − 1) + C·n
T(1) = C
Solution.
T(n) = T(n − 1) + C·n
     = [T(n − 2) + C·(n − 1)] + C·n
     = [T(n − 3) + C·(n − 2)] + C·(n − 1) + C·n
     = T(n − 3) + (n − 2)·C + (n − 1)·C + n·C
     = T(n − k) + C·(n − k + 1) + C·(n − k + 2) + ... + C·n
Put n − k = 1:
T(n) = T(1) + C·2 + C·3 + ... + C·(n − 1) + C·n
     = C + 2C + 3C + ... + (n − 1)C + n·C
     = C[1 + 2 + 3 + ... + n]
     = C·n(n + 1)/2
     = O(n²)
Example (3)
T(n) = T(n/2) + C
T(1) = 1
Solution.
T(n) = T(n/2) + C
     = [T(n/2²) + C] + C
     = T(n/2²) + 2C
     = T(n/2³) + 3C
     ...
T(n) = T(n/2^k) + kC
Put n/2^k = 1 ⇒ k = log₂ n:
T(n) = T(1) + C·log₂ n
     = 1 + C·log₂ n
     = O(log n)
Example (4)
T(1) = 1
T(n) = 2T(n/2) + C
Solution.
T(n) = 2[2T(n/2²) + C] + C
     = 2²T(n/2²) + 2C + C
     = 2²[2T(n/2³) + C] + 2C + C
     = 2³T(n/2³) + 2²C + 2C + C
     ...
     = 2^k·T(n/2^k) + 2^(k−1)·C + 2^(k−2)·C + ... + 2C + C
Put n/2^k = 1, i.e., n = 2^k:
→ T(n) = n·T(1) + 2^(k−1)·C + 2^(k−2)·C + ... + 2C + C
     = 2^k + C·(2^(k−1) + 2^(k−2) + ... + 2 + 1)
     = 2^k + C·(2^k − 1)/(2 − 1)
     = 2^k + C(2^k − 1)
     = 2^k + 2^k·C − C
     = O(n)
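The O(n) bound for T(n) = 2T(n/2) + C can be seen by counting calls: the recursion tree for n = 2^k has 2n − 1 nodes. A minimal sketch (the function name `calls` is my own):

```c
/* Binary recursion matching T(n) = 2T(n/2) + C: returns the total
 * number of calls made for input n (a power of 2). The recursion
 * tree has 2n - 1 nodes, so the total work is O(n). */
long calls(long n) {
    if (n <= 1) return 1;            /* base case: one call */
    return 1 + calls(n / 2) + calls(n / 2);
}
```

For n = 16 this gives 31 = 2·16 − 1 calls, matching C(2n − 1) from the expansion above.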
If a > b^k (i.e., log_b a > k):
T(n) = Θ(n^(log_b a))
Question 2. T(n) = 2T(n/2) + n
Solution. a = 2, b = 2, k = 1, p = 0; a = b^k
T(n) = Θ(n^k·log^(p+1) n) = Θ(n·log n)
Question 3. T(n) = 2T(n/2) + n·log n
Solution. a = 2, b = 2, k = 1, p = 1; a = b^k
T(n) = Θ(n·log² n)
Question 4. T(n) = T(n/2) + C
Solution. a = 1, b = 2, k = 0, p = 0; a = b^k
T(n) = Θ(log n)
(1) T(n) = 2T(n/2) + C
T(1) = C
n/2^k = 1 → n = 2^k → k = log₂ n
Total work done = C + 2C + 2²C + 2³C + ... + 2^k·C
= C(1 + 2 + 2² + ... + 2^k)
= C·(2^(k+1) − 1)/(2 − 1)
= C(2^(k+1) − 1)
= C(2·2^k − 1)
= C(2n − 1)
= O(n)
(2) T(n) = 2T(n/2) + n
n/2^k = 1; n = 2^k; k = log₂ n
Each level of the recursion tree costs n, for k levels:
∴ n + n + n + ... + n
= k·n
= n·log₂ n
= O(n·log₂ n)
(3) T(n) = 4T(n/2) + n
n = 2^k, k = log₂ n
Level costs double each level:
= n(1 + 2 + 2² + 2³ + ... + 2^k)
= n·(2^(k+1) − 1)/(2 − 1)
= n(2·2^k − 1)
= n(2n − 1)
= O(n²)
(4) T(n) = T(n/3) + T(2n/3) + n, T(1) = 1
Shortest branch: n/3^i = 1 ⇒ i = log₃ n; longest branch: log_(3/2) n levels.
Every complete level of the recursion tree costs n, and the tree is complete for at least log₃ n levels:
T(n) ≥ n + n + ... + n (log₃ n terms) = Ω(n·log₃ n)
Similarly T(n) ≤ n·log_(3/2) n, so T(n) = Θ(n·log n).
int n, A[n];
Algorithm RSum(A, n)
{
    if (n == 1) return A[1];
    else
        return (A[n] + RSum(A, n − 1));
}
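The RSum pseudocode above translates directly to C; a minimal sketch with C's 0-based indexing (the name `rsum` is my own):

```c
/* Recursive array sum following the RSum pseudocode: sum of A[0..n-1].
 * Recurrence: T(n) = T(n-1) + C, so time O(n); stack depth O(n). */
int rsum(const int A[], int n) {
    if (n == 1) return A[0];                 /* base case: one element */
    return A[n - 1] + rsum(A, n - 1);        /* last element + rest */
}
```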
Algorithm A(n)
{
    if (n == 1) return;
    else
    {
        A(n/2);
    }
}
Recurrence relation
T(n) = C;           n = 1
T(n) = T(n/2) + C;  n > 1
Time Complexity = O(log n)
Space Complexity
• Space complexity depends on the number of activation records pushed onto the stack.
Suppose n = 16: the stack holds A(16), A(8), A(4), A(2), A(1).
For n = 2^K we push the K activation records:
n = 2^K
log n = K·log₂ 2
K = log₂ n
∴ Space Complexity = O(log n)
Example 3
Algorithm A(n)
{
    if (n == 2) return;
    else
        return A(√n);
}
Solution:
T(n) = 1;          n = 2
T(n) = T(√n) + C;  n > 2
Time Complexity = O(log log n)
Space Complexity
Suppose n = 16: the stack holds A(16), A(4), A(2).
The argument shrinks as n^(1/2^K); the recursion stops when n^(1/2^K) = 2:
(1/2^K)·log₂ n = log₂ 2 = 1
2^K = log₂ n
K = log₂ log₂ n
∴ Space Complexity = O(log log n)
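The O(log log n) stack depth can be checked by counting frames of the √n recursion. A sketch (the helper name `sqrt_depth` and the integer-square-root loop are my own):

```c
/* Count activation records for A(n): each call recurses on the
 * integer square root until n reaches 2, so for n = 2^(2^K)
 * the stack holds K + 1 frames: space O(log log n). */
int sqrt_depth(long n) {
    if (n <= 2) return 1;                     /* base frame A(2) */
    long r = 1;
    while ((r + 1) * (r + 1) <= n) r++;       /* integer sqrt of n */
    return 1 + sqrt_depth(r);
}
```

For n = 16 this counts the frames A(16), A(4), A(2), i.e., 3; for n = 65536 = 2^(2^4) it counts 5 = K + 1 frames.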
Recurrence Relation:
T(n) = 1;            n = 1 or n = 2
T(n) = 2T(n/2) + 1;  n > 2
Time Complexity:
T(n) = O(n)
Space Complexity:
O(log n) (stack depth = height of the recursion tree)
Recurrence relation:
T(n) = 1;           n = 1
T(n) = T(n/2) + 1;  n > 1
Time Complexity:
T(n) = T(n/2) + C
T(n) = O(log n)
Space Complexity:
T(n) = 1;           n = 1
T(n) = T(n/2) + C;  n > 1
∴ Space Complexity = O(log n)
Time Complexity: O(m + n)
Space Complexity: O(m + n)
Moves = m + n (always)
Note:
The best case affects only the number of comparisons; it has no effect on moves.
Note:
• In the GATE exam, if merge sort is given, always consider it out-of-place.
• If the array size is very large, merge sort is preferable.
• If the array size is very small, prefer insertion sort.
• Merge sort is a stable sorting technique.
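The "Moves = m + n (always)" fact can be sketched with a plain out-of-place merge (the function name `merge_count_moves` is my own):

```c
/* Out-of-place merge of sorted arrays a[0..m-1] and b[0..n-1] into
 * out[0..m+n-1]. Returns the number of element moves: every element
 * is moved exactly once, so moves = m + n regardless of the input;
 * only the number of comparisons varies with the input. */
int merge_count_moves(const int a[], int m, const int b[], int n, int out[]) {
    int i = 0, j = 0, k = 0;
    while (i < m && j < n) {
        if (a[i] <= b[j]) out[k++] = a[i++];   /* <= keeps the merge stable */
        else              out[k++] = b[j++];
    }
    while (i < m) out[k++] = a[i++];           /* copy the leftover run */
    while (j < n) out[k++] = b[j++];
    return k;                                  /* k == m + n always */
}
```

Using `<=` on equal keys takes the left element first, which is why merge sort is stable.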
Example 1: In quicksort, for sorting n elements, the (n/16)th smallest element is selected as pivot. What is the worst-case time complexity?
Solution.
T(n) = T(n/16) + T(15n/16) + O(n)
= O(n·log n) (solve by the recursion tree method)
Example 2: The median of n elements can be found in O(n) time. What, then, is the time complexity of the quicksort algorithm in which the median is selected as pivot?
Solution.
T(n) = O(n) + C + O(n) + T(n/2) + T(n/2) = 2T(n/2) + O(n)
Time Complexity:
T(n) = O(n·log n)
Space complexity:
Space Complexity = O(n)
| Algorithm | Method | Best | Average | Worst | Stable | In-place |
|---|---|---|---|---|---|---|
| Quick sort | Choose pivot element, place it in its correct position | O(n log n) | O(n log n) | O(n²) | No | Yes |
| Merge sort | Divide into equal parts, recursively sort each subpart & merge them | O(n log n) | O(n log n) | O(n log n) | Yes | No |
| Heap sort | Build (max) heap, delete max, place it | O(n log n) | O(n log n) | O(n log n) | No | Yes |
| Selection sort | Find position of min element from [1 to n] | O(n²) | O(n²) | O(n²) | No | Yes |
| Insertion sort | Insert a[i + 1] into its correct position | O(n) | O(n²) | O(n²) | Yes | Yes |
❑❑❑
3 GREEDY TECHNIQUE
3.1 Greedy Technique
• The greedy method is an algorithm design strategy for problems whose solution is seen as the result of making a sequence of decisions.
• A problem may have more than one solution.
3.2 Terminology
Maximum edges = V(V − 1)/2
E ≤ V(V − 1)/2
E ≤ C·V² (C is a constant)
Note:
E = O(V²)
log E = O(log V)
3.8.2 Graph Representation
• For more edges (dense graph), the adjacency matrix is better.
• For fewer edges (sparse graph), the adjacency list is better.
| Operation | Matrix | List |
|---|---|---|
| (1) Finding degree of a vertex | O(V) every case | O(1) best case, O(V − 1) worst case |
| (2) Finding total edges | O(V²) every case | O(V + 2E) every case |
| (3) Finding whether 2 vertices are adjacent or not | O(1) every case | O(1) best case, O(V − 1) worst case |
3.9.2 Bellman-Ford
• Time Complexity = O(E·V)
• If a negative edge-weight cycle exists, then for some vertices the computed answer is incorrect.
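A minimal sketch of Bellman-Ford over an edge list (names like `bellman_ford` and `struct edge` are my own): V − 1 relaxation rounds over all E edges give the O(E·V) bound, and a V-th round that still relaxes proves a reachable negative cycle, the case where distances cannot be trusted.

```c
#include <limits.h>

struct edge { int u, v, w; };

/* Fills dist[] with shortest distances from src and returns 0, or
 * returns -1 if a reachable negative edge-weight cycle is detected. */
int bellman_ford(int V, int E, const struct edge ed[], int src, int dist[]) {
    for (int i = 0; i < V; i++) dist[i] = INT_MAX;
    dist[src] = 0;
    for (int round = 0; round < V - 1; round++)      /* V-1 rounds: O(E*V) */
        for (int e = 0; e < E; e++)
            if (dist[ed[e].u] != INT_MAX &&
                dist[ed[e].u] + ed[e].w < dist[ed[e].v])
                dist[ed[e].v] = dist[ed[e].u] + ed[e].w;
    for (int e = 0; e < E; e++)                      /* extra round: cycle check */
        if (dist[ed[e].u] != INT_MAX &&
            dist[ed[e].u] + ed[e].w < dist[ed[e].v])
            return -1;
    return 0;
}
```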
❑❑❑
4 DYNAMIC PROGRAMMING
4.2 Terminology
3. For i ← 0 to n − 1
       For j ← 0 to m − 1
           If (p[i] == q[j]) then
               L[i, j] = 1 + L[i − 1, j − 1];
           else
               L[i, j] = max{L[i, j − 1], L[i − 1, j]}
• Time complexity of step 1 = O(n)
• Time complexity of step 2 = O(m)
• Time complexity of step 3 = O(mn)
• Total Time complexity = O(n) + O(m) + O(mn)
= O(mn)
• Space complexity = O((m + 1)·(n + 1))
= O(mn)
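The table recurrence above can be sketched as a C function (the name `lcs_length` and the fixed MAXN bound are my own; rows and columns are shifted by one so that index 0 acts as the empty-prefix border, avoiding the negative indices in the pseudocode):

```c
#include <string.h>

#define MAXN 64   /* assumes both strings are shorter than MAXN */

/* LCS length: L[i][j] uses 1 + L[i-1][j-1] on a character match,
 * otherwise max(L[i][j-1], L[i-1][j]).
 * Time O(mn); space O((m+1)(n+1)) = O(mn). */
int lcs_length(const char *p, const char *q) {
    int n = (int)strlen(p), m = (int)strlen(q);
    int L[MAXN + 1][MAXN + 1] = {{0}};       /* row/col 0 = empty prefix */
    for (int i = 1; i <= n; i++)
        for (int j = 1; j <= m; j++)
            if (p[i - 1] == q[j - 1])
                L[i][j] = 1 + L[i - 1][j - 1];
            else
                L[i][j] = L[i][j - 1] > L[i - 1][j] ? L[i][j - 1] : L[i - 1][j];
    return L[n][m];
}
```

For example, the LCS of "ABCBDAB" and "BDCABA" has length 4 (e.g., "BCBA").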