Chap 2 - Asymptotic Analysis

CSE 221: Algorithms [ARN] Ariyan Hossain

Asymptotic Analysis
RAM Model:

The RAM (Random Access Machine) model is a simplified computational model used to analyze algorithms. It assumes a single processor on which instructions execute one after another, with no concurrent operations. The RAM model also assumes that each instruction (operations, data access, data movement) takes the same amount of time as any other instruction.

The RAM model contains instructions commonly found in real computers: arithmetic (add, subtract, multiply), data movement (assign), control flow (return), and comparison (logical operators), each taking constant time.

To compare algorithms, let us define a few objective measures:

● Execution times? Not a good measure, as execution times are specific to a particular computer.

● Number of statements executed? Not a good measure, since the number of statements varies with the programming language as well as the style of the individual programmer.

● Ideal solution? Let us express the running time of a given algorithm as a function of the input size n (i.e., f(n)) and compare these functions corresponding to running times. This kind of comparison is independent of the machine, programming style, etc.

The running time of an algorithm on a particular input is the number of instructions and data accesses executed. How we account for these costs should be independent of any particular computer. A constant amount of time is required to execute each line. One line might take more or less time than another line, but we assume that each execution of the k-th line takes c_k time, where c_k is a constant. The running time of the algorithm is the sum of the running times of all statements executed. We usually denote the running time of an algorithm on an input of size n by T(n).
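
For instance, a simple summation loop can be costed line by line in this style (a Python sketch; the constants c1 through c6 are the assumed per-line costs, and the function name is illustrative):

# Summing an array A of n numbers, charging a constant cost per line.
def array_sum(A, n):             # n = len(A)
    total = 0                    # c1, executed 1 time
    i = 0                        # c2, executed 1 time
    while i < n:                 # c3, executed n + 1 times (the last test fails)
        total = total + A[i]     # c4, executed n times
        i = i + 1                # c5, executed n times
    return total                 # c6, executed 1 time

# T(n) = c1 + c2 + c3*(n + 1) + c4*n + c5*n + c6 = a*n + b for some constants
# a and b, so the running time grows linearly with n.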

We will usually concentrate on finding only the worst-case running time, that is, the longest running time for any input of size n, because:

● The worst-case running time of an algorithm gives an upper bound on the running time for any input. If you know it, then you have a guarantee that the algorithm never takes any longer.
● For some algorithms, the worst case occurs fairly often. For example, when searching a database for a particular piece of information, the searching algorithm's worst case often occurs when the information is not present in the database (see the linear search sketch below).
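
A minimal linear search sketch in Python (function and variable names are illustrative):

# Worst case: the key is not in the list, so every element is examined,
# giving roughly n comparisons for a list of n items.
def linear_search(items, key):
    for index, value in enumerate(items):
        if value == key:
            return index     # found: may exit early
    return -1                # not found: the worst case, all n items checked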

Order of growth:

We will consider only the leading term of a formula (e.g., an^2), since the lower-order terms are relatively insignificant for large values of n. We also ignore the leading term's constant coefficient, since constant factors are less significant than the rate of growth in determining computational efficiency for large inputs.
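
For example (with numbers chosen purely for illustration): if T(n) = 3n^2 + 100n + 500, then at n = 1,000 the leading term is 3n^2 = 3,000,000 while the lower-order terms contribute only 100n + 500 = 100,500, just a few percent of the total; and whatever the constant coefficient is, doubling n roughly quadruples the running time.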

Typical Running Time Functions:

● 1 (constant running time): Instructions are executed once or only a few times
● log N (logarithmic): A big problem is solved by cutting the original problem into smaller sizes, by a constant fraction at each step
● N (linear): A small amount of processing is done on each input element
● N log N: A problem is solved by dividing it into smaller problems, solving them independently, and combining the solutions
● N^2 (quadratic): Typical for algorithms that process all pairs of data items (doubly nested loops)
● N^3 (cubic): Processing of triples of data items (triply nested loops)
● N^k (polynomial)
● 2^N (exponential): Few exponential algorithms are appropriate for practical use
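
A few minimal Python sketches of these growth patterns (all names are illustrative):

# O(1): constant -- a single operation regardless of input size
def get_first(items):
    return items[0]

# O(log N): logarithmic -- binary search halves the remaining range each step
def binary_search(sorted_items, key):
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == key:
            return mid
        elif sorted_items[mid] < key:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1

# O(N): linear -- a constant amount of work per element
def total(items):
    s = 0
    for x in items:
        s += x
    return s

# O(N^2): quadratic -- doubly nested loop over all pairs of items
def count_duplicate_pairs(items):
    count = 0
    for i in range(len(items)):
        for j in range(i + 1, len(items)):
            if items[i] == items[j]:
                count += 1
    return count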

Comparison of Orders:

log(log n) < log n < √n < n < n log n < n^(3/2) < n^2 < n^2 log n < n^3 < 2^n < e^(n+1) < n! < n^n
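
As a rough numeric check, the Python sketch below (assuming base-2 logarithms; any fixed base gives the same ordering) evaluates each function at n = 100, and the printed values increase down the list in the same order:

import math

n = 100
values = [
    ("log(log n)", math.log2(math.log2(n))),
    ("log n",      math.log2(n)),
    ("sqrt(n)",    math.sqrt(n)),
    ("n",          n),
    ("n log n",    n * math.log2(n)),
    ("n^(3/2)",    n ** 1.5),
    ("n^2",        n ** 2),
    ("n^2 log n",  n ** 2 * math.log2(n)),
    ("n^3",        n ** 3),
    ("2^n",        2 ** n),
    ("e^(n+1)",    math.exp(n + 1)),
    ("n!",         math.factorial(n)),
    ("n^n",        n ** n),
]
for name, value in values:
    print(f"{name:12s} {value:.3e}")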

Asymptotic Analysis:

Asymptotic analysis is a method of analyzing functions as their arguments tend towards infinity. It is used to approximate functions that cannot be calculated exactly. The three most common types of asymptotic analysis are:

Worst Case Complexity: The function is defined by the maximum number of steps taken on any instance of size n. It represents the upper bound of an algorithm's complexity. It is represented by Big O notation, O().

Best Case Complexity: The function is defined by the minimum number of steps taken on any instance of size n. It represents the lower bound of an algorithm's complexity. It is represented by Omega notation, Ω().

Average Case Complexity: The function is defined by the average number of steps taken over all instances of size n. It represents the tight bound of an algorithm's complexity. It is represented by Theta notation, Θ().
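
For instance, for a linear search over n items: the best case is finding the key at the first position (1 comparison); the worst case is the key being absent (n comparisons); and, assuming the key is equally likely to be at any of the n positions, the average case is about (n + 1)/2 comparisons, which still grows linearly with n.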

It is hard to estimate the running time exactly. The best case depends on the input, whereas the average case is difficult to compute. So we usually focus on worst-case analysis, which is easier to compute and usually close to the actual running time.

Strategy: find a function (an equation) that, for large n, is an upper bound to the actual function.

We can sometimes determine the exact running time of an algorithm, but the extra precision is rarely worth the effort of computing it. For large enough inputs, the multiplicative constants and lower-order terms of an exact running time are dominated by the effects of the input size itself.

Big Oh, O: Asymptotic Upper Bound:

A function f(n) can be represented in the order of g(n), that is O(g(n)), if there exist a positive constant c and a positive integer n0 such that

f(n) = O(g(n)) when 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0
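
For example, let f(n) = 3n + 2 and g(n) = n. Since 0 ≤ 3n + 2 ≤ 4n for all n ≥ 2, the definition is satisfied with c = 4 and n0 = 2, so 3n + 2 = O(n).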

Big Omega, Ω: Asymptotic Lower Bound:

A function f(n) can be represented in the order of g(n), that is Ω(g(n)), if there exist a positive constant c and a positive integer n0 such that

f(n) = Ω(g(n)) when 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0
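
For example, let f(n) = 3n + 2 and g(n) = n. Since 0 ≤ 3n ≤ 3n + 2 for all n ≥ 1, the definition is satisfied with c = 3 and n0 = 1, so 3n + 2 = Ω(n).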

Theta, Θ: Asymptotic Tight Bound:

A function f(n) can be represented in the order of g(n), that is Θ(g(n)), if there exist positive constants c1 and c2 and a positive integer n0 such that

f(n) = Θ(g(n)) when 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
Or
f(n) = Θ(g(n)) when f(n) = O(g(n)) AND f(n) = Ω(g(n))
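
Continuing the same example: 0 ≤ 3n ≤ 3n + 2 ≤ 4n for all n ≥ 2, so the definition is satisfied with c1 = 3, c2 = 4, and n0 = 2, giving 3n + 2 = Θ(n). Equivalently, 3n + 2 = O(n) and 3n + 2 = Ω(n) together give 3n + 2 = Θ(n).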

Some Properties:
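
In general, asymptotic notation satisfies the following standard properties:

● Transitivity: if f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)); the same holds for Ω and Θ.
● Reflexivity: f(n) = O(f(n)), f(n) = Ω(f(n)), and f(n) = Θ(f(n)).
● Symmetry: f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n)).
● Transpose symmetry: f(n) = O(g(n)) if and only if g(n) = Ω(f(n)).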
