Algorithms Analysis and Design Lec1,2
Characteristics:
Example: finding the maximum element of a list
1. Start
2. Initialize max to the first element of the list.
3. For each element num in the list:
   a. If num is greater than max, update max to num.
4. End
Pseudocode
Characteristics:
• Focuses on logic rather than syntax
FUNCTION findMax(list):
    max ← list[0]
    FOR each num IN list:
        IF num > max:
            max ← num
    RETURN max
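The same routine, written as a minimal C sketch (the function and variable names follow the pseudocode; the small driver in main is only illustrative):

#include <stdio.h>

/* Returns the largest element of list[0..n-1]; assumes n >= 1. */
int findMax(const int list[], int n) {
    int max = list[0];                /* initialize max to the first element */
    for (int i = 1; i < n; i++) {     /* examine every remaining element */
        if (list[i] > max)            /* if the current element is greater ... */
            max = list[i];            /* ... update max */
    }
    return max;
}

int main(void) {
    int a[] = {3, 7, 2, 9, 4};
    printf("%d\n", findMax(a, 5));    /* prints 9 */
    return 0;
}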
Flowchart
Characteristics:
Summary
Suppose, for example, we have four algorithms to solve a problem P: A1, A2,
A3, and A4, and each of them has an execution time of T1, T2, T3, and T4,
respectively. Which one will you choose?
1. Measure the running time experimentally?
Not good:
a) It is necessary to implement the algorithm, which may be difficult.
b) Results may not be indicative of the running time on other inputs
not included in the experiment.
c) To compare two algorithms, the same hardware and software
environments, and the same inputs, must be used.
2. Count the number of statements executed?
Algorithm 1:
    arr[0] = 0;
    arr[1] = 0;
    arr[2] = 0;
    ...
    arr[N-1] = 0;

Algorithm 2:
    for (i = 0; i < N; i++)
        arr[i] = 0;
Algorithms can also be compared in terms of other factors (e.g., memory
requirements, programmer's effort, etc.).
Determine how running time increases as the size of the problem increases.
Time complexity: how the running time grows with the input size.
Space complexity: how the memory requirement grows with the input size.
Notice that the term complexity here has nothing to do with an algorithm being
simple or complicated.
We consider what is called worst-case, average-case, and best-case complexity.
Theoretical Analysis
• Uses a high-level description of the algorithm instead of an
implementation.
• Characterizes running time as a function of the input size, n.
• Considers all possible inputs.
• Allows us to evaluate the speed of an algorithm independently of the
hardware/software environment.
Primitive Operations:
Examples:
• Evaluating an expression
• Assigning a value to a variable
• Indexing into an array
• Calling a method
• Returning from a method
Algorithm 1                      Cost
    arr[0] = 0;                  c1
    arr[1] = 0;                  c1
    arr[2] = 0;                  c1
    ...
    arr[N-1] = 0;                c1
                                 -----------
                                 c1 + c1 + ... + c1 = c1 x N

Algorithm 2                      Cost
    for (i = 0; i < N; i++)      c2   (the condition is checked N+1 times)
        arr[i] = 0;              c1   (the body is executed N times)
                                 -------------
                                 (N+1) x c2 + N x c1 = (c2 + c1) x N + c2
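As a quick sanity check of these counts, the following small C program (N = 10 and the counter names are only illustrative) counts how many times the loop condition and the loop body of Algorithm 2 are executed:

#include <stdio.h>

#define N 10

int main(void) {
    int arr[N];
    int cond_checks = 0;   /* times the condition i < N is evaluated (cost c2 each) */
    int body_execs  = 0;   /* times the assignment arr[i] = 0 runs (cost c1 each)   */

    int i = 0;
    for (;;) {
        cond_checks++;
        if (!(i < N)) break;   /* same test as i < N in the for loop */
        arr[i] = 0;
        body_execs++;
        i++;
    }

    /* prints: condition checked 11 times, body executed 10 times */
    printf("condition checked %d times, body executed %d times\n",
           cond_checks, body_execs);
    return 0;
}

This matches the formula above: the condition is evaluated N + 1 = 11 times and the body N = 10 times.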
The concepts of worst-case, average-case, and best-case complexity will be
clarified by a simple example: sequential search of an array.
input: an array a1, a2, ..., an and a search key
output: index, the position of the key in the array, or -1 if the key is not found

found = false;
i = 1
while (not found) and (i ≤ n)
    if key = ai
        found = true
    else
        i++
if found
    index = i
else
    index = -1
input size: n
operation to count: the comparison between the key and an array element.
space complexity: n
Time complexity:
The best case is 1 comparison, and it happens if the key equals the 1st array
element.
The worst case is n comparisons, and it happens if the key equals the last
array element, or if the key is not found.
average number of comparisons = (1 + 2 + ... + n) / n = (n + 1) / 2, assuming the
key is present and is equally likely to be at any of the n positions.
In the worst-case analysis, we calculate the upper bound on the running time of
an algorithm.
For Linear Search, the worst case happens when the element to be searched (x)
is not present in
the array.
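A minimal C sketch of this sequential search, using 0-based array indexing and returning -1 when the key is absent (the function name and the small driver are only illustrative):

#include <stdio.h>

/* Returns the index of key in a[0..n-1], or -1 if the key is not present. */
int sequentialSearch(const int a[], int n, int key) {
    for (int i = 0; i < n; i++) {
        if (a[i] == key)        /* the comparison we count */
            return i;           /* best case: key at a[0], 1 comparison */
    }
    return -1;                  /* worst case: n comparisons (key last or absent) */
}

int main(void) {
    int a[] = {5, 3, 8, 1, 9};
    printf("%d\n", sequentialSearch(a, 5, 8));   /* prints 2 */
    printf("%d\n", sequentialSearch(a, 5, 7));   /* prints -1 */
    return 0;
}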
Motivation
Suppose algorithm A has running time g(n) and algorithm B has running time
f(n), where f grows more slowly than g. It is easy to verify (numerically,
graphically, or otherwise) that for n large enough (n >> 1), f(n) will be smaller
than g(n). Hence, algorithm B is more efficient (faster) than algorithm A for
large values of n. This is true regardless of the actual values of the constants.
To perform the asymptotic analysis, we find the worst-case number of primitive
operations executed as a function of the input size n, and we express this
function with big-Oh notation.
Order of magnitude (asymptotic) notation
Definition O
Let f(n) and g(n) be 2 functions of the integer n. We say that f(n) is O(g(n)) if a
positive constant c and a positive integer n0 such that:
f(n) ≤ c g(n) n ≥ n0. This means that the order of the function f(n) ≤ that of
g(n).
Equivalent Definition
f(n) is O(g(n)) if lim(n→∞) f(n)/g(n) = c ≠ ∞ (i.e., c = 0 or a finite constant).
• This means that the order of the function f(n) is ≤ that of g(n) (i.e., g(n)
grows at least as fast as f(n)).
• This means that an algorithm whose running time is f(n) is, up to constant
factors, at least as fast as an algorithm whose running time is g(n).
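As a worked instance of the limit form (using one of the statements verified later, that 3n^2 + 2n is O(n^2)):

\[
\lim_{n\to\infty}\frac{3n^{2}+2n}{n^{2}}
= \lim_{n\to\infty}\left(3+\frac{2}{n}\right)
= 3 \neq \infty ,
\]

so 3n^2 + 2n is O(n^2).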
Example
Show that 30n + 8 is O(n). Take c = 31 and n0 = 8, and assume n > n0 = 8.
Then cn = 31n = 30n + n > 30n + 8, so 30n + 8 < cn.
Note that 30n + 8 is not less than n anywhere (for n > 0), and it is not even
less than 31n everywhere; but it is less than 31n everywhere to the right of
n = 8.
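The same bound can also be checked numerically; a small C sketch (the upper limit of 1000 is arbitrary):

#include <stdio.h>

/* Checks that 30n + 8 <= 31n for every n greater than n0 = 8 (up to 1000),
 * i.e., that c = 31 and n0 = 8 witness 30n + 8 being O(n). */
int main(void) {
    const long c = 31, n0 = 8;
    for (long n = n0 + 1; n <= 1000; n++) {
        if (30 * n + 8 > c * n) {
            printf("bound fails at n = %ld\n", n);
            return 1;
        }
    }
    printf("30n + 8 <= %ldn for all checked n > %ld\n", c, n0);
    return 0;
}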
Some useful Mathematical facts
1. =
2. L'Hôpital's Rule: lim(n→∞) f(n)/g(n) = lim(n→∞) f'(n)/g'(n), when both f(n)
and g(n) tend to ∞ (or both tend to 0).
3. Stirling's formula: n! ≈ √(2πn) (n/e)^n
4.
5.
6.
Example: Show that (log n)^2 is o(n).
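One way to show this is to apply L'Hôpital's rule (fact 2 above) twice with the natural logarithm (changing the base of the logarithm only multiplies by a constant, so it does not affect the limit):

\[
\lim_{n\to\infty}\frac{(\ln n)^{2}}{n}
= \lim_{n\to\infty}\frac{2\ln n\cdot\frac{1}{n}}{1}
= \lim_{n\to\infty}\frac{2\ln n}{n}
= \lim_{n\to\infty}\frac{2/n}{1}
= 0 ,
\]

so (log n)^2 grows strictly more slowly than n, i.e., (log n)^2 is o(n).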
You can easily verify that the following statements are true:
6n + 5 is O(n)
3n^2 + 2n is O(n^2)
10^6 n^2 is O(n^2)
log n is O(n)
log log n is O(log n)
2^n is not O(n^1000)
❖ For any constant c > 0: O(cf) = O(f + c) = O(f − c) = O(f)
Example: The following are some of the common functions listed in
increasing order of magnitude:
1, log n, n, n log n, n^2, n^3, 2^n, n!
Big O Rules
• If f(n) is a polynomial of degree d, then f(n) is O(n^d):
  1. Drop the lower-order terms.
  2. Drop the constant factors.
Some examples
Relatives of Big O
Definition Θ
• If f ∈ O(g) and g ∈ O(f), then we say "g and f are of the same order" or "f
is (exactly) order g" and write f ∈ Θ(g).
This means that the order of the function f(n) = that of g(n).
Definition Ω
f(n) is Ω(g(n)) if there exist a positive constant c and a positive integer n0 such
that f(n) ≥ c g(n) for all n ≥ n0.
This means that the order of the function f(n) ≥ that of g(n).
Definition o
f(n) is o(g(n)) if, for every positive constant c, there is a positive integer n0 such
that f(n) < c g(n) for all n ≥ n0.
Equivalently, f(n) is o(g(n)) if lim(n→∞) f(n)/g(n) = 0.
This means that the order of the function f(n) is strictly < that of g(n).
Asymptotic Analysis of an algorithm: The asymptotic analysis of an algorithm determines the
running time in big Oh notation.
To perform the asymptotic analysis, we find the worst-case number of primitive operations
executed as a function of the input size, and we express this function with big-Oh notation.
Note that algorithm arrayMax executes 7n - 2 primitive operations in the worst case.
Since constant factors and lower order terms are eventually dropped anyhow, we can ignore
them when counting primitive operations.
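arrayMax is essentially the findMax routine shown earlier; below is a C sketch annotated with where the primitive operations occur (the exact total, such as 7n - 2, depends on the counting convention used):

/* Returns the maximum of A[0..n-1]; assumes n >= 1.
 * Worst case: the array is in increasing order, so the if-branch
 * is taken in every iteration, maximizing the operation count. */
int arrayMax(const int A[], int n) {
    int currentMax = A[0];            /* indexing + assignment                    */
    for (int i = 1; i < n; i++) {     /* n-1 iterations: compare i < n, increment i */
        if (A[i] > currentMax)        /* indexing + comparison                    */
            currentMax = A[i];        /* indexing + assignment (worst case only)  */
    }
    return currentMax;                /* return                                   */
}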
Running time of various statements
Example 1
i = 0;
while (i < N) {
    X = X + Y;              // O(1)
    result = mystery(X);    // O(N), just an example...
    i++;                    // O(1)
}
• The body of the while loop: O(N)
• Loop is executed: N times
N x O(N) = O(N^2)
Example 2
if (i < j)
    for (i = 0; i < N; i++)
        X = X + i;
else
    X = 0;
Max( O(N), O(1) ) = O(N)
Example 3
for (i = 1; i ≤ n; i++)
    for (j = 1; j ≤ n; j++)
        print "hello"
O(n) x O(n) = O(n^2)
Example 4
Exercises: compute the worst-case running time of the following two algorithms.