Analysis of Algorithms
• Let C(n) be the number of times the algorithm’s basic operation is executed on
inputs of size n (illustrated in the counting sketch below).
• Example: for a typical graph problem, the input size is measured by the number of
vertices and/or edges, and the basic operation is visiting a vertex or traversing an edge.
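As an illustration, the count C(n) can be made concrete by instrumenting a simple loop in Java and counting how many times its basic operation (the comparison) runs; the class name and counter below are hypothetical and not part of the source.

    // Hypothetical sketch: count how often the basic operation (the comparison)
    // executes in a maximum-finding loop; here C(n) = n - 1.
    public class BasicOperationCount {
        public static int findMax(int[] a) {
            int count = 0;                    // illustrative counter for C(n)
            int max = a[0];
            for (int i = 1; i < a.length; i++) {
                count++;                      // one comparison per iteration
                if (a[i] > max) {
                    max = a[i];
                }
            }
            System.out.println("C(n) = " + count + " for n = " + a.length);
            return max;
        }

        public static void main(String[] args) {
            findMax(new int[] {3, 1, 4, 1, 5, 9, 2, 6});   // prints C(n) = 7 for n = 8
        }
    }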
Worst-Case, Best-Case, and Average-Case Efficiencies
• There are many algorithms whose running time depends not only on the input size
but also on the specifics of a particular input.
• The best-case efficiency of an algorithm is its efficiency for the best-case input
of size n, which is an input of size n for which the algorithm runs the fastest
among all possible inputs of that size.
• Neither the worst-case analysis nor its best-case counterpart yields the
necessary information about an algorithm’s behavior on a “typical” or “random”
input. This is the information that average-case efficiency seeks to provide.
Worst-Case, Best-Case, and Average-Case Efficiencies
• Best performance: the item we search for is in the first position; the algorithm
examines only one position.
• Worst performance: the item is not in the array or is in the last position; the
algorithm examines all n positions (see the sequential-search sketch after this list).
• The efficiencies of some algorithms may differ significantly for inputs of the
same size. For such algorithms, we need to distinguish between the worst-
case, average-case, and best-case efficiencies.
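To make the best- and worst-case counts above concrete, here is a minimal sequential-search sketch in Java; the class and method names are assumptions for illustration, not from the source.

    // Hypothetical sketch of sequential search, illustrating the best case
    // (key at index 0, one comparison) and the worst case (key absent, n comparisons).
    public class SequentialSearch {
        // Returns the index of key in a, or -1 if key is not present.
        public static int search(int[] a, int key) {
            for (int i = 0; i < a.length; i++) {
                if (a[i] == key) {
                    return i;          // best case: found at i = 0 after one comparison
                }
            }
            return -1;                 // worst case: all n positions examined
        }

        public static void main(String[] args) {
            int[] a = {7, 3, 9, 4};
            System.out.println(search(a, 7));   // 0  (best case)
            System.out.println(search(a, 5));   // -1 (worst case)
        }
    }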
Asymptotic Analysis
• Asymptotic analysis refers to computing the running time of an algorithm in
mathematical units of computation rather than in actual clock time.
• Calculate how the time (or space) taken by an algorithm increases with the
input size.
Asymptotic Notation
• The three most commonly used notations are Big O, Big Omega, and Big
Theta.
Asymptotic Notations and Basic Efficiency Classes
A way of comparing functions that ignores constant factors and small input sizes.
Big O-notation : Asymptotic Upper Bound
• O(g(n)) is the set of all functions with a lower or same order of growth as
g(n).
• Ω(g(n)) stands for the set of all functions with a higher or same order of growth as
g(n).
• Θ(g(n)) is the set of all functions that have the same order of growth as g(n).
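For reference, these growth-order descriptions correspond to the standard formal definitions (stated here as a supplement):
• f(n) ∈ O(g(n)) iff there exist a constant c > 0 and an integer n0 ≥ 1 such that f(n) ≤ c·g(n) for all n ≥ n0.
• f(n) ∈ Ω(g(n)) iff there exist a constant c > 0 and an integer n0 ≥ 1 such that f(n) ≥ c·g(n) for all n ≥ n0.
• f(n) ∈ Θ(g(n)) iff f(n) ∈ O(g(n)) and f(n) ∈ Ω(g(n)).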
Algorithm and Analysis of Bubble Sort
• Number of key comparisons: C(n) = ((n-1) + 1)(n-1)/2 = n(n-1)/2
• Time efficiency: Θ(n²)
Algorithm and Analysis of Bubble Sort
• Best Case Time Complexity : T(n) = O(n)
• Average Case Time Complexity : T(n) = O(n²)
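A minimal bubble sort sketch in Java follows; it includes the common early-exit flag, which is what yields the O(n) best case on already-sorted input, while the worst and average cases stay quadratic. The class and method names are assumptions for illustration.

    // Hypothetical sketch of bubble sort with an early-exit flag.
    public class BubbleSort {
        public static void sort(int[] a) {
            int n = a.length;
            for (int i = 0; i < n - 1; i++) {
                boolean swapped = false;
                // After pass i, the largest i+1 elements are in their final positions.
                for (int j = 0; j < n - 1 - i; j++) {
                    if (a[j] > a[j + 1]) {           // basic operation: key comparison
                        int tmp = a[j];
                        a[j] = a[j + 1];
                        a[j + 1] = tmp;
                        swapped = true;
                    }
                }
                if (!swapped) {
                    break;                           // no swaps: array already sorted
                }
            }
        }

        public static void main(String[] args) {
            int[] a = {5, 1, 4, 2, 8};
            sort(a);
            System.out.println(java.util.Arrays.toString(a));  // [1, 2, 4, 5, 8]
        }
    }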
Algorithm and Analysis of Binary Search
• Best Case: the item is at the middle position; one check, O(1).
• Worst Case: the item is found only in the last possible division (or is absent); the
maximal number of times an array of length N can be divided is log2 N, so O(log N).
• Average Case: the item is found after performing about half of the possible number
of divisions, ½ log2 N, which is still O(log N).
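The halving behaviour described above is visible in a minimal iterative binary-search sketch in Java; names are assumptions for illustration.

    // Hypothetical sketch of iterative binary search on a sorted array; returns the
    // index of key, or -1 if key is absent. Each iteration halves the search range,
    // giving at most about log2 N iterations.
    public class BinarySearch {
        public static int search(int[] a, int key) {
            int lo = 0, hi = a.length - 1;
            while (lo <= hi) {
                int mid = lo + (hi - lo) / 2;   // middle of the current range
                if (a[mid] == key) {
                    return mid;                 // best case: found on the first check
                } else if (a[mid] < key) {
                    lo = mid + 1;               // discard the lower half
                } else {
                    hi = mid - 1;               // discard the upper half
                }
            }
            return -1;                          // worst case: about log2 N halvings
        }

        public static void main(String[] args) {
            int[] a = {2, 4, 7, 10, 15, 20};
            System.out.println(search(a, 10));  // 3
            System.out.println(search(a, 5));   // -1
        }
    }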