Computational Complexity
Deterministic Time Complexity: A deterministic algorithm always produces the same output and follows the same sequence of steps for a given input. This means that for any input of a given size, a deterministic algorithm always takes the same amount of time to produce its result. For example, an algorithm with a time complexity of O(n²) has a runtime that grows quadratically as the input size (n) increases.
P: P is the set of problems that can be solved by deterministic algorithms in polynomial time.
For example: Bubble Sort, Linear Search, Merge Sort, etc.
NP: NP is the set of problems that can be solved by non-deterministic algorithms in polynomial time.
P is a subset of NP (as can be seen in the above fig.), meaning every problem solvable deterministically in polynomial time can also be solved non-deterministically in polynomial time.
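As a minimal sketch of one of the deterministic polynomial-time examples above, Bubble Sort can be written as follows (the function name and test input are illustrative):

```python
def bubble_sort(a):
    """Deterministic O(n^2) sort: the same input always produces
    the same sequence of steps and the same output."""
    a = list(a)                      # work on a copy
    n = len(a)
    for i in range(n - 1):           # outer pass
        for j in range(n - 1 - i):   # inner comparisons: quadratic in total
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

print(bubble_sort([5, 2, 4, 1, 3]))  # -> [1, 2, 3, 4, 5]
```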
Approximation algorithm:
An approximation algorithm is used to find near-optimal solutions to NP-complete optimisation problems.
An optimisation problem has a number of feasible solutions, out of which we have to select the best one as the optimal solution.
The optimisation version of the Vertex Cover Problem is to find a vertex cover of minimum size in a graph G.
Suppose the edge {b, c} is chosen first. Its two incident vertices are added to the cover, and all other edges incident on either vertex are removed from consideration.
Iterating now for edges {e, f} and then {d, g}:
The resulting vertex cover is shown on the left and the optimal vertex cover on the right (the lighter coloured vertices are in the cover).
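The edge-picking procedure described above (the standard greedy 2-approximation for Vertex Cover) can be sketched as follows; the edge list and its ordering are illustrative assumptions chosen so that {b, c}, {e, f}, and {d, g} are picked in turn:

```python
def approx_vertex_cover(edges):
    """Greedy 2-approximation: pick any remaining edge, add both of its
    endpoints to the cover, then discard every edge incident on either."""
    cover = set()
    remaining = list(edges)
    while remaining:
        u, v = remaining[0]               # choose a remaining edge
        cover.update((u, v))              # add both incident vertices
        remaining = [(x, y) for (x, y) in remaining
                     if x not in cover and y not in cover]
    return cover

# Illustrative graph: {b, c} is picked, then {e, f}, then {d, g}
edges = [("b", "c"), ("a", "b"), ("c", "d"), ("c", "e"),
         ("e", "f"), ("d", "g"), ("d", "f")]
print(approx_vertex_cover(edges))  # cover of size 6: b, c, d, e, f, g
```

The returned cover has at most twice the size of an optimal cover, because every picked edge must have at least one endpoint in any valid cover.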
Monte Carlo:
A Monte Carlo algorithm is a randomized algorithm whose output may be incorrect with some (typically small) probability. These probability bounds are taken only over the random choices made by the algorithm, independent of the input. Thus, independent repetitions of a Monte Carlo algorithm drive down the failure probability exponentially.
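To see the exponential drop numerically: if a single run fails with probability p, then k independent runs all fail with probability p^k. A small sketch (p = 0.5 is an assumed per-run failure rate, not a value from the text):

```python
p = 0.5                      # assumed failure probability of one run
for k in (1, 5, 10, 20):
    # probability that all k independent runs fail
    print(k, p ** k)
```

After 20 repetitions the failure probability is below one in a million, even though each individual run fails half the time.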
Las Vegas:
1. A Las Vegas algorithm is a randomized algorithm that always produces a
correct result.
2. Its running time is a random variable whose expectation is bounded, say, by a polynomial. Las Vegas algorithms are said to be "probably fast but deterministically accurate".
Example:
// Las Vegas algorithm: find an index k with A[k] == 1
// (assumes A contains at least one 1; otherwise it never terminates)
repeat:
    k = RandInt(n)
    if A[k] == 1:
        return k
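A runnable Python version of the pseudocode above (the array contents are an illustrative assumption; the loop assumes at least one 1 is present):

```python
import random

def las_vegas_find_one(A):
    """Probe random indices until a 1 is found.
    The answer is always correct; the running time is a random variable."""
    while True:
        k = random.randrange(len(A))  # RandInt(n): uniform index in [0, n-1]
        if A[k] == 1:
            return k

A = [0, 0, 1, 0, 1, 0]
k = las_vegas_find_one(A)
print(k, A[k])                        # A[k] is always 1
```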
Randomized QuickSort: Randomized QuickSort always sorts its input array correctly, and its expected running time is O(n log n) even on a worst-case input, because the expectation is over the algorithm's random pivot choices rather than over the inputs.
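A minimal Randomized QuickSort sketch in Python (the random pivot choice is the essential point; the three-way partition style is one of several equivalent formulations):

```python
import random

def randomized_quicksort(a):
    """Always returns a sorted list; expected O(n log n) comparisons on
    any input because the pivot is chosen uniformly at random."""
    if len(a) <= 1:
        return list(a)
    pivot = random.choice(a)                       # random pivot
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return randomized_quicksort(less) + equal + randomized_quicksort(greater)

print(randomized_quicksort([3, 6, 1, 8, 2, 9, 4]))  # -> [1, 2, 3, 4, 6, 8, 9]
```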
Properties:
1. It always gives a correct output; it never returns a wrong answer.
2. Its running time is a random variable whose expectation is bounded in terms of the input size.
3. If it is stopped after a time bound, it reports failure rather than returning a wrong answer.
Example: searching for a repeated element
Algorithm Search_Repeat(A, a)
{
    // deterministic linear scan for element a
    for (i = 0; i < n; i++)
        if (A[i] == a)
            return true
}
If element a occupies n/2 positions of A, the scan still needs at least n/2 + 2 steps in the worst case, since the first n/2 + 1 elements examined may all be distinct from a. Overall the algorithm is O(n).
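The deterministic scan as runnable Python (the test array, with the target filling half the positions, is an illustrative assumption matching the worst-case argument above):

```python
def search_repeat(A, a):
    """Deterministic linear scan: O(n) worst case."""
    for x in A:
        if x == a:
            return True
    return False

# 7 fills half the positions, yet a scan may inspect the
# n/2 + 1 distinct elements first in the worst case
A = [1, 2, 3, 4, 7, 7, 7, 7]
print(search_repeat(A, 7))   # -> True
```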
Algorithm MC_Repeat(A, a, x)
{
    i = 0; flag = False
    while (i < x) {
        k = random() mod n      // random index in [0, n-1]
        i = i + 1
        if (A[k] == a)
            flag = True
    }
    return flag
}
A fixed number x of iterations is performed, so the running time is deterministic, but the algorithm may fail to return the correct output: it can miss a even when a is present, although the failure probability shrinks exponentially with x.
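The Monte Carlo version as runnable Python (names follow the pseudocode; the test array is the same illustrative one, with the target in half the positions):

```python
import random

def mc_repeat(A, a, x):
    """Probe x random positions; one-sided error: may return False
    even when a is present, but never returns True when it is absent."""
    for _ in range(x):
        k = random.randrange(len(A))   # random index in [0, n-1]
        if A[k] == a:
            return True
    return False

A = [1, 2, 3, 4, 7, 7, 7, 7]           # 7 occupies half the positions
# Each probe misses with probability 1/2, so all x probes miss
# with probability (1/2)**x -- exponentially small in x.
print(mc_repeat(A, 7, 20))
```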