Lecture 2 DSA


Data Structures and Algorithms

CS-201
Lecture # 02
Definition of Algorithm

• An algorithm is any well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output.

In other words, we can say:

• A sequence of computational steps that transforms the input into the output.
Algorithms

• Properties of algorithms:
• Input from a specified set,
• Output from a specified set (the solution),
• Definiteness (precision) of every step in the computation,
• Correctness of output for every possible input,
• Finiteness of the number of calculation steps,
• Effectiveness of each calculation step, and
• Generality for a class of problems.

A question?
• Suppose computers were infinitely fast and computer memory were free (no issue of cost).

• Would there still be any reason to study algorithms?

• Yes
In reality

• Computers may be fast, but they are not infinitely fast.

• Memory may be cheap, but it is not free.

• Computing time is therefore a bounded resource, and so is space in memory.
Complexity
• In general, we are not so much interested in the time and space complexity for small inputs.

• For example, the difference in time complexity between linear and binary search is meaningless for a sequence with n = 10.

• But it is huge for n = 2^30.
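To make the contrast concrete, here is a quick sketch (not part of the slides; the helper name `worst_case_comparisons` is my own) of the worst-case comparison counts for linear versus binary search on a sorted sequence:

```python
import math

def worst_case_comparisons(n):
    """Worst-case comparison counts on a sorted sequence of n elements:
    linear search may inspect all n elements; binary search halves the
    range each step, needing about floor(log2(n)) + 1 comparisons."""
    linear = n
    binary = math.floor(math.log2(n)) + 1
    return linear, binary

for n in (10, 2 ** 30):
    lin, bi = worst_case_comparisons(n)
    print(f"n = {n}: linear ~ {lin}, binary ~ {bi}")
```

For n = 10 the two are close (10 vs. 4 comparisons); for n = 2^30 the gap is enormous (about a billion vs. 31).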

Complexity
• Comparison: time complexity of algorithms A and B

Input Size n   Algorithm A (5,000n)   Algorithm B (1.1^n)
10             50,000                 3
100            500,000                13,781
1,000          5,000,000              2.5×10^41
1,000,000      5×10^9                 4.8×10^41392
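The table values can be reproduced with a short sketch (not from the slides; `cost_A` and `cost_B` are hypothetical names for the two cost models):

```python
import math

def cost_A(n):
    # Algorithm A: linear cost model, 5,000 * n steps
    return 5000 * n

def cost_B(n):
    # Algorithm B: exponential cost model, 1.1 ** n steps
    return 1.1 ** n

for n in (10, 100, 1000):
    print(f"n = {n}: A = {cost_A(n):,}, B = {cost_B(n):,.0f}")

# 1.1 ** 1_000_000 overflows a Python float, so compute the last
# row's order of magnitude via logarithms instead:
exp = 10 ** 6 * math.log10(1.1)   # i.e. B(10^6) is roughly 10^41392
print(exp)
```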

Complexity

• This means that algorithm B cannot be used for large inputs, while algorithm A is still feasible.

• So, what is important is the growth of the complexity functions.

• The growth of time and space complexity with increasing input size n is a suitable measure for the comparison of algorithms.
Asymptotic Efficiency of Algorithms
• We consider input sizes large enough that only the rate of growth (order of growth) of the running time is relevant.

• That is, we are concerned with how the running time of an algorithm increases as the size of the input increases without bound.

• Usually, an algorithm that is asymptotically more efficient will be the best choice.
Asymptotic Notation

The notations we use to describe the asymptotic running time of an algorithm are defined in terms of functions whose domains are the set of natural numbers N = {0, 1, 2, …}.
Asymptotic Notation

Let f(n) and g(n) be two positive functions, representing the number of basic calculations (operations, instructions) that an algorithm takes (or the number of memory words an algorithm needs).
Asymptotic Notation
• O – Big O
• The notation O(g(n)) is the formal way to express an upper bound on an algorithm's running time. It captures the worst case, i.e., the longest amount of time an algorithm can possibly take to complete.
• Θ – Theta
• The notation Θ(g(n)) is the formal way to express both a lower bound and an upper bound on an algorithm's running time.
• Ω – Omega
• The notation Ω(g(n)) is the formal way to express a lower bound on an algorithm's running time. It captures the best case, i.e., the least amount of time an algorithm can possibly take to complete.
Example: Find sum of array elements
Algorithm arraySum(A, n)
  Input: array A of n integers
  Output: sum of elements of A        # operations
  sum ← 0                             1
  for i ← 0 to n − 1 do               n + 1
    sum ← sum + A[i]                  n
  return sum                          1

• Input size: n (number of array elements)

• Total number of steps: f(n) = 1 + (n + 1) + n + 1 = 2n + 3
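The pseudocode above can be mirrored in Python, counting operations exactly as the slide's cost model does (a sketch; the counts are the model's, not measured machine instructions):

```python
def array_sum(A):
    # Cost model from the slide: 1 (init) + (n + 1) (loop tests)
    # + n (additions) + 1 (return) = 2n + 3 operations.
    n = len(A)
    ops = 1                  # sum <- 0
    total = 0
    for i in range(n):
        total += A[i]
        ops += 1             # one addition per element (n total)
    ops += n + 1             # loop condition is checked n + 1 times
    ops += 1                 # return
    return total, ops

print(array_sum([3, 1, 4, 1, 5]))  # (14, 13): 2*5 + 3 = 13 steps
```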
Example: Find max element of an array
Algorithm arrayMax(A, n)
  Input: array A of n integers
  Output: maximum element of A        # operations
  currentMax ← A[0]                   1
  for i ← 1 to n − 1 do               n
    if A[i] > currentMax then         n − 1
      currentMax ← A[i]               n − 1
  return currentMax                   1

▪ Input size: n (number of array elements)

▪ Total number of steps (worst case): f(n) = 1 + n + (n − 1) + (n − 1) + 1 = 3n
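Again as a sketch (not from the slides), the same cost model in Python; the worst case of 3n operations occurs when the input is strictly increasing, so the assignment runs on every iteration:

```python
def array_max(A):
    # Worst-case cost model from the slide (strictly increasing input):
    # 1 + n + (n - 1) + (n - 1) + 1 = 3n operations.
    n = len(A)
    ops = 1                  # currentMax <- A[0]
    current_max = A[0]
    for i in range(1, n):
        ops += 1             # comparison (n - 1 total)
        if A[i] > current_max:
            current_max = A[i]
            ops += 1         # assignment (up to n - 1 times)
    ops += n                 # the for-line itself, counted as n
    ops += 1                 # return
    return current_max, ops

print(array_max([1, 2, 3, 4]))  # (4, 12): worst case, 3*4 = 12 steps
```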
O-Notation
• For a given function g(n),
• O(g(n)) = {f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0}

Intuitively: the set of all functions whose rate of growth is the same as or lower than that of c·g(n).
O-Notation

g(n) is an asymptotic upper bound for f(n):

  f(n) = O(g(n))
Example
• O(g(n)) = {f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0}

• f(n) = 5n + 2; try g(n) = n, c = 6:
• 0 ≤ 5n + 2 ≤ 6n ?
• n = 0 → 0 ≤ 2 ≤ 0 (no)
• n = 1 → 0 ≤ 7 ≤ 6 (no)
• n = 2 → 0 ≤ 12 ≤ 12 (yes)
• n = 3 → 0 ≤ 17 ≤ 18 (yes)
• So n0 = 2, c = 6, g(n) = n, and f(n) = O(g(n))
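The definition can be spot-checked mechanically (a sketch, not a proof — `holds_upper` is a hypothetical helper that tests the inequality over a finite range only):

```python
def holds_upper(f, g, c, n0, upto=1000):
    # Check 0 <= f(n) <= c*g(n) for all n0 <= n <= upto.
    return all(0 <= f(n) <= c * g(n) for n in range(n0, upto + 1))

f = lambda n: 5 * n + 2
g = lambda n: n
print(holds_upper(f, g, c=6, n0=2))   # True
print(holds_upper(f, g, c=6, n0=0))   # False: fails at n = 0 and n = 1
```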
Big-O Notation (Examples)
1. f(n) = n/2 − 3
   0 ≤ n/2 − 3 ≤ n/2 for n ≥ 6; n0 = 6, c = 1/2; f(n) = O(n)

2. f(n) = n^2 − n
   0 ≤ n^2 − n ≤ n^2 for n ≥ 0; n0 = 0, c = 1; f(n) = O(n^2)

3. f(n) = n(n + 1)/2 = O(n^2)  (verify yourself)
Ω-Notation
• For a given function g(n),
• Ω(g(n)) = {f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0}

Intuitively: the set of all functions whose rate of growth is the same as or higher than that of c·g(n).
Ω-Notation

g(n) is an asymptotic lower bound for f(n):

  f(n) = Ω(g(n))
Example
• Ω(g(n)) = {f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0}

• f(n) = 5n + 2; try g(n) = n, c = 5:
• 0 ≤ 5n ≤ 5n + 2 ?
• n = 0 → 0 ≤ 0 ≤ 2 (yes)
• n = 1 → 0 ≤ 5 ≤ 7 (yes)
• n = 2 → 0 ≤ 10 ≤ 12 (yes)
• So n0 = 0, c = 5, g(n) = n, and f(n) = Ω(g(n))
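The lower-bound condition can be spot-checked the same way (a finite-range sketch, not a proof; `holds_lower` is a hypothetical helper):

```python
def holds_lower(f, g, c, n0, upto=1000):
    # Check 0 <= c*g(n) <= f(n) for all n0 <= n <= upto.
    return all(0 <= c * g(n) <= f(n) for n in range(n0, upto + 1))

f = lambda n: 5 * n + 2
print(holds_lower(f, lambda n: n, c=5, n0=0))  # True
```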
Θ-Notation (Big Theta)

For a given function g(n),

• Θ(g(n)) = {f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0}

• f(n) ∈ Θ(g(n)) is usually written f(n) = Θ(g(n))
Θ-Notation

g(n) is an asymptotically tight bound for f(n).

Here f(n) and g(n) are nonnegative for large n.
Example
• Θ(g(n)) = {f(n) : ∃ positive constants c1, c2, and n0 such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)}

Example: show 5n + 2 = Θ(n)
Determine positive constants n0, c1, and c2 such that the above condition is satisfied:
5n ≤ 5n + 2 ≤ 6n
Example Contd.

• Θ(g(n)) = {f(n) : ∃ positive constants c1, c2, and n0 such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)}

• 5n ≤ 5n + 2 is true for all n ≥ 0
• 5n + 2 ≤ 6n is true for all n ≥ 2
• So 5n ≤ 5n + 2 ≤ 6n is true for all n ≥ 2
Thus:
  c1 = 5, c2 = 6, n0 = 2, g(n) = n
  f(n) = Θ(g(n)) = Θ(n)
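Both inequalities together can be spot-checked at once (a finite-range sketch with a hypothetical helper `holds_theta`, not a proof):

```python
def holds_theta(f, g, c1, c2, n0, upto=1000):
    # Check 0 <= c1*g(n) <= f(n) <= c2*g(n) for all n0 <= n <= upto.
    return all(0 <= c1 * g(n) <= f(n) <= c2 * g(n)
               for n in range(n0, upto + 1))

print(holds_theta(lambda n: 5 * n + 2, lambda n: n, c1=5, c2=6, n0=2))  # True
```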
Relations Between Θ, O, Ω
For any two functions f(n) and g(n),

we have f(n) = Θ(g(n)) if and only if

f(n) = O(g(n)) and f(n) = Ω(g(n))

• That is,
  Θ(g(n)) = O(g(n)) ∩ Ω(g(n))
The Growth of Functions
• "Popular" functions g(n) are:
  n·log n, 1, 2^n, n^2, n!, n, n^3, log n

• Listed from slowest to fastest growth:
• 1         Constant
• log n     Logarithmic
• n         Linear
• n log n   Log-linear
• n^2       Quadratic
• n^3       Cubic
• 2^n       Exponential
• n!        Factorial
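The ordering above can be illustrated by evaluating each function at a sample size (a sketch; the ordering is asymptotic, but it already holds at n = 20):

```python
import math

# Evaluate the "popular" growth functions at n = 20, listed in the
# slide's slowest-to-fastest order, and confirm the values ascend.
n = 20
growth = [
    ("1", 1),
    ("log n", math.log2(n)),
    ("n", n),
    ("n log n", n * math.log2(n)),
    ("n^2", n ** 2),
    ("n^3", n ** 3),
    ("2^n", 2 ** n),
    ("n!", math.factorial(n)),
]
values = [v for _, v in growth]
print(values == sorted(values))  # True: slower-growing functions stay smaller
```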
Comparing Growth Rates

[Figure: T(n) versus problem size n, showing the curves log2 n, n log2 n, n^2, n^3, and 2^n]
