Chap 2 - Asymptotic Analysis
Asymptotic Analysis
RAM Model:
The RAM (Random Access Machine) model is a simplified computational model used to analyze algorithms. It assumes a single processor on which instructions execute one after another, with no concurrent operations. The RAM model assumes that each instruction (operations, data access, data movement) takes the same amount of time as any other instruction.
The RAM model contains instructions commonly found in real computers: arithmetic (add, subtract, multiply), data movement (assign), control flow (return), and comparison (logical operators) all take constant time.
● Execution times? Not a good measure, as execution times are specific to a particular computer.
● Number of statements executed? Not a good measure, since the number of statements varies with the programming language as well as the style of the individual programmer.
● Ideal solution? Express the running time of a given algorithm as a function of the input size n (i.e., f(n)) and compare these functions corresponding to running times. This kind of comparison is independent of machine time, programming style, etc.
The running time of an algorithm on a particular input is the number of instructions and data accesses executed. How we account for these costs should be independent of any particular computer. A constant amount of time is required to execute each line. One line might take more or less time than another line, but we’ll assume that each execution of the kth line takes ck time, where ck is a constant. The running time of the algorithm is the sum of running times for each statement executed. We usually denote the running time of an algorithm on an input of size n by T(n).
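A minimal worked example (assumed for illustration, not taken from the notes): per-line constant costs for summing an array, with each line's execution count noted as a comment.

    def array_sum(A):
        total = 0          # cost c1, executed 1 time
        for x in A:        # cost c2, executed n + 1 times (the loop test runs once more than the body)
            total += x     # cost c3, executed n times
        return total       # cost c4, executed 1 time

Adding the products of cost and count gives T(n) = c1 + c2(n + 1) + c3·n + c4 = a·n + b for constants a and b, i.e., the running time grows linearly with the input size n.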
We’ll usually concentrate on finding only the worst-case running time, that is, the longest running time for any input of size n. Because:
● The worst-case running time of an algorithm gives an upper bound on the running time for any input. If you know it, then you have a guarantee that the algorithm never takes any longer.
● For some algorithms, the worst case occurs fairly often. For example, in searching a database for a particular piece of information, the searching algorithm’s worst case often occurs when the information is not present in the database.
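A minimal linear-search sketch (an illustrative example, not from the notes) makes this concrete: the worst case occurs exactly when the key is absent, because every element must be examined.

    def linear_search(A, key):
        for i, x in enumerate(A):   # inspects elements one by one
            if x == key:
                return i            # best case: key found at the first position (1 comparison)
        return -1                   # worst case: key not present, all n elements compared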
Order of growth:
We will consider only the leading term of a formula (e.g. an^2) since the lower-order terms are relatively insignificant for large values of n. We also ignore the leading term’s constant coefficient, since constant factors are less significant than the rate of growth in determining computational efficiency for large inputs.
● 1 (constant running time): Instructions are executed once or a few times
● log N (logarithmic): A big problem is solved by cutting the original problem into smaller sizes, by a constant fraction at each step (see the first sketch after this list)
● N (linear): A small amount of processing is done on each input element
● N log N: A problem is solved by dividing it into smaller problems, solving them independently, and combining the solutions
● N^2 (quadratic): Typical for algorithms that process all pairs of data items (double nested loops; see the second sketch after this list)
● N^3 (cubic): Processing of triples of data (triple nested loops)
● N^k (polynomial)
● 2^N (exponential): Few exponential algorithms are appropriate for practical use
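Two short sketches of these growth classes (illustrative examples, assumed rather than taken from the notes):

    def halving_steps(n):
        # log N: the problem size is cut by a constant fraction (here, half) at each step.
        steps = 0
        while n > 1:
            n //= 2
            steps += 1
        return steps

    def count_pairs(A):
        # N^2: a double nested loop processes every pair of data items.
        count = 0
        for i in range(len(A)):
            for j in range(i + 1, len(A)):
                count += 1
        return count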
Comparison of Orders:
log(log n) < log n < √n < n < n log n < n^(3/2) < n^2 < n^2 log n < n^3 < 2^n < e^(n+1) < n! < n^n
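A tiny script (assumed, for illustration only) that evaluates a few of these functions at small n shows how quickly the later terms in the chain pull ahead:

    import math

    for n in [4, 8, 16]:
        # log n grows far more slowly than n, n log n, n^2, and 2^n.
        print(f"n={n}: log n={math.log2(n):.0f}, n log n={n * math.log2(n):.0f}, "
              f"n^2={n ** 2}, 2^n={2 ** n}")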
Asymptotic Analysis:
Asymptotic analysis is a method of analyzing functions when their arguments tend towards infinity. Asymptotic analysis is used to approximate functions that cannot be exactly calculated. The three most common types of asymptotic analysis are:
Worst Case Complexity: The function is defined by the maximum number of steps taken on any instance of size n. It represents the upper bound of an algorithm's complexity. It is represented by Big O notation (O()).
Best Case Complexity: The function is defined by the minimum number of steps taken on any instance of size n. It represents the lower bound of an algorithm's complexity. It is represented by Omega notation (Ω()).
Average Case Complexity: The function is defined by the average number of steps taken on any instance of size n. It represents the tight bound of an algorithm's complexity. It is represented by Theta notation (Θ()).
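As a concrete illustration (an assumed example, not from the notes): for linear search on an array of n elements, the best case takes 1 comparison (the key is at the first position), the worst case takes n comparisons (the key is absent), and the average case takes about (n + 1)/2 comparisons when the key is equally likely to be at any position. So the best case is Ω(1), the worst case is O(n), and the average case is Θ(n).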
It’s hard to estimate the running time exactly. The best case depends on the input, whereas the average case is difficult to compute. So we usually focus on worst-case analysis, which is easier to compute and usually close to the actual running time.
Strategy: find a function (an equation) that, for large n, is an upper bound on the actual function.
We can sometimes determine the exact running time of an algorithm, but the extra precision is rarely worth the effort of computing it. For large enough inputs, the multiplicative constants and lower-order terms of an exact running time are dominated by the effects of the input size itself.
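A quick check with assumed numbers: if T(n) = 5n^2 + 100n, then at n = 1000 the leading term contributes 5,000,000 steps while the lower-order term contributes only 100,000, so the n^2 term already accounts for about 98% of the total.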
A function f(n) can be represented in the order of g(n), that is f(n) = O(g(n)), if there exist a positive integer n0 and a positive constant c such that
0 ≤ f(n) ≤ c·g(n) for all n ≥ n0
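For example (an assumed illustration): f(n) = 3n + 2 is O(n), because 3n + 2 ≤ 4n for all n ≥ 2, so the constants c = 4 and n0 = 2 satisfy the definition.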
A function f(n) can be represented in the order of g(n), that is f(n) = Ω(g(n)), if there exist a positive integer n0 and a positive constant c such that
0 ≤ c·g(n) ≤ f(n) for all n ≥ n0
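Continuing the assumed example: f(n) = 3n + 2 is Ω(n), because 3n + 2 ≥ 3n for all n ≥ 1, so c = 3 and n0 = 1 satisfy the definition.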
A function f(n) can be represented in the order of g(n), that is f(n) = Θ(g(n)), if there exist a positive integer n0 and positive constants c1 and c2 such that
f(n) = Θ(g(n)) when 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0
Or
f(n) = Θ(g(n)) when f(n) = O(g(n)) AND f(n) = Ω(g(n))
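Putting the two assumed bounds together: 3n + 2 = Θ(n), since 3n ≤ 3n + 2 ≤ 4n for all n ≥ 2 (c1 = 3, c2 = 4, n0 = 2); equivalently, 3n + 2 is both O(n) and Ω(n).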
Some Properties: