Lecture 1


MA6351-PROBABILITY AND

STATISTICS
By
Dr. B. Krishna Kumar
PROFESSOR
DEPARTMENT OF MATHEMATICS
ANNA UNIVERSITY, CHENNAI – 25

UNIT I – RANDOM VARIABLES
• Discrete and Continuous Random
Variables
• Moments – Moment Generating
Functions
• Binomial, Geometric, Poisson and
Negative Binomial Distributions
• Uniform, Exponential, Gamma and
Weibull Distributions
LECTURE 1 - OVERVIEW

PROBABILITY BASICS

RANDOM VARIABLE

DISCRETE DISTRIBUTIONS

Probability Basics
Sample Space and Events
Mutually Exclusive Events
Axioms of Probability
Independent Events
Total Probability Theorem
Bayes’ Theorem
Definition
• Random Experiment
An experiment whose outcome is not predictable
with certainty is said to be random.
• Sample Space (S)
The set of all possible outcomes of an experiment
is called the sample space.
• Events (E)
Any subset of a sample space is known as an
event.
Example 1
If the experiment consists of flipping two
coins, then
S = {(H,H); (H,T); (T,H); (T,T)}
E = Getting a head on the first toss
= {(H,H); (H,T)}
F = Getting a tail on the second toss
= {(H,T); (T,T)}
• E∪F = {(H,H); (H,T); (T,T)}
• E∩F = {(H,T)}
More Examples:
• If the random experiment counts the number of
tosses required until the first head appears,
then S = {1,2,3,4,…}
• If the random experiment consists of
measuring the lifetime of a component,
then S = {t : 0 ≤ t < ∞}.
Mutually Exclusive Events
• Two events E and F are said to be
mutually exclusive if
E∩F = Φ
Example:
Consider an experiment of tossing a die,
S = {1,2,3,4,5,6}
E = Getting an even number = {2,4,6}
F = Getting an odd number = {1,3,5}
Descriptive Definition - Probability
Consider an experiment with sample space S that
is repeatedly performed under exactly the same
conditions. For each event E of the sample
space S, let n(E) denote the number of times
the event E occurs in the first n repetitions of
the experiment. Then

P(E) = Probability of an event E = lim_{n→∞} n(E)/n
Axiomatic Definition - Probability
• Axiom 1: 0 ≤ P(E) ≤ 1
• Axiom 2: P(S) = 1
• Axiom 3: For any sequence of pairwise mutually
exclusive events E1, E2, …
(i.e., Ei ∩ Ej = Φ for i ≠ j)

P(∪_{i=1}^∞ Ei) = Σ_{i=1}^∞ P(Ei)
Proposition

• P(Eᶜ) = 1 - P(E)
• If E ⊆ F, then P(E) ≤ P(F)
• P(E∪F) = P(E) + P(F) - P(E∩F)
If E and F are mutually exclusive,
then P(E∪F) = P(E) + P(F), since
E∩F = Φ implies P(E∩F) = 0
Conditional Probability
• If P(F) > 0, then P(E/F) = P(E∩F)/P(F)
Therefore, P(E∩F) = P(E/F) P(F)
Example:
In an experiment of rolling a die, let
E = Getting the outcome 3 = {3}
F = Getting an odd number = {1,3,5}
P(E) = 1/6, P(F) = 1/2, P(E∩F) = 1/6
P(E/F) = (1/6)/(1/2) = 1/3
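A quick Monte Carlo sketch (added here as a check, not part of the original slides) that estimates P(E/F) for this die example by conditioning simulated rolls on F:

```python
import random

random.seed(1)
trials = 100_000
count_F = count_EF = 0

for _ in range(trials):
    roll = random.randint(1, 6)     # fair die
    if roll % 2 == 1:               # F: odd outcome
        count_F += 1
        if roll == 3:               # E: outcome is 3
            count_EF += 1

print(count_EF / count_F)           # ≈ 1/3, matching P(E/F)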
Independent Events
• Two events E and F are said to be
independent if
P(E∩F) = P(E) P(F)
Example :
Consider an experiment of selecting a
card at random from a deck of 52 cards.
E = Card selected is an ace; P(E)=4/52
F = Card selected is a spade; P(F)=13/52
P(E∩F) = 1/52 = P(E)P(F)
Definition
The events A1, A2, A3, …, An are said to be
mutually exclusive and exhaustive if

∪_{i=1}^n Ai = A1 ∪ A2 ∪ … ∪ An = S

and

Ai ∩ Aj = Φ for i ≠ j
[Figure: the sample space S partitioned into the events A1, A2, A3, …, An]
Total Probability Theorem

[Figure: an event B overlapping each block of the partition A1, A2, …, An of S]

P(B) = Σ_{i=1}^n P(B∩Ai) = Σ_{i=1}^n P(B/Ai) P(Ai)
[Figure: the event B intersecting the partition A1, A2, …, An of S]

BAYES’ THEOREM

P(Ai/B) = P(B/Ai) P(Ai) / Σ_{i=1}^n P(B/Ai) P(Ai)
Proof of Bayes’ Theorem

P(Ai/B) = P(Ai∩B)/P(B) = P(B∩Ai)/P(B)

        = P(B/Ai) P(Ai) / P(B)
(By Conditional Probability)

        = P(B/Ai) P(Ai) / Σ_{i=1}^n P(B/Ai) P(Ai)
(By the Total Probability Theorem)
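To make the theorem concrete, here is a small Python sketch with hypothetical numbers (the three sources, their priors, and defect rates are assumptions for illustration, not from the slides):

```python
# Hypothetical setting: three machines A1, A2, A3 supply 50%, 30%, 20%
# of output with defect rates 1%, 2%, 3%.
# B = "item is defective"; compute each posterior P(Ai/B).
priors = [0.5, 0.3, 0.2]            # P(Ai)
likelihoods = [0.01, 0.02, 0.03]    # P(B/Ai)

# Denominator via the Total Probability Theorem
p_B = sum(l * p for l, p in zip(likelihoods, priors))

posteriors = [l * p / p_B for l, p in zip(likelihoods, priors)]
print(p_B)          # 0.017
print(posteriors)   # ≈ [0.294, 0.353, 0.353]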
Remarks
1. If {Ai, i = 1, 2, …, n} is a sequence of
mutually exclusive events, then

P(∪_{i=1}^n Ai) = Σ_{i=1}^n P(Ai)

2. If {Ai, i = 1, 2, …, n} is a sequence of
independent events, then

P(∩_{i=1}^n Ai) = Π_{i=1}^n P(Ai)
Random Variables
Discrete and Continuous
Distribution Function
Mean and Variance
Moments
Moment Generating Function
Random Variable - Definition
• A random variable X is a real-valued function
defined on a sample space S, i.e.,

X : S → ℝ

[Figure: X maps sample points s1, s2 ∈ S to points on the real line]
Example 2
• Consider an experiment of tossing a coin thrice,
then
S = {(H,H,H); (H,T,H); (H,H,T); (T,H,H);
(H,T,T); (T,H,T); (T,T,H); (T,T,T)}
X = Number of heads obtained
and
X(HHH)=3; X(HTH)=2; X(HHT)=2; X(THH)=2
X(HTT)=1; X(THT)=1; X(TTH)=1; X(TTT)=0
X : S → ℝ

[Figure: X maps the eight outcomes (HHH), (HTH), (HHT), (THH), (THT), (TTH), (HTT), (TTT) in S to the values 0, 1, 2, 3 on the real line]
Random Variable (RV)

• Discrete RV: the range space RX is discrete;
described by a Probability Mass Function (PMF)
• Continuous RV: the range space RX is continuous;
described by a Probability Density Function (PDF)
Distribution Function

Let X be a random variable and x be
a number. The cumulative distribution
function (CDF) of X is defined by

FX(x) = P[X ≤ x]
Properties of CDF

1. FX(x) is a non-decreasing function, i.e.,
if x1 < x2 then FX(x1) ≤ FX(x2)
2. 0 ≤ FX(x) ≤ 1
3. FX(∞) = 1
4. FX(-∞) = 0
5. P(a < X ≤ b) = FX(b) - FX(a)
6. P(X > a) = 1 - FX(a)
Example 3
If X is a random variable with CDF

FX(x) = 0,        x < 0
      = x + 1/2,  0 ≤ x < 1/2
      = 1,        x ≥ 1/2

find P(0 < X ≤ 1/4) and P(X > 1/4).
Solution:
P(0 < X ≤ 1/4) = FX(1/4) - FX(0) = 3/4 - 1/2 = 1/4
P(X > 1/4) = 1 - FX(1/4) = 1 - 3/4 = 1/4
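A minimal Python sketch (added as a check, not from the slides) encoding this piecewise CDF and evaluating both probabilities:

```python
def F(x):
    """CDF of Example 3: 0 below 0, x + 1/2 on [0, 1/2), then 1."""
    if x < 0:
        return 0.0
    if x < 0.5:
        return x + 0.5
    return 1.0

print(F(0.25) - F(0))   # P(0 < X <= 1/4) = 0.25
print(1 - F(0.25))      # P(X > 1/4)      = 0.25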
Probability Mass Function
Let X be a random variable such that
RX = {x1, x2, x3, x4, …}
and
pX(xk) = P(X = xk)
Then the ordered pairs (xk, pX(xk)) form
the PMF provided
pX(xk) ≥ 0 and Σk pX(xk) = 1

NOTE: FX(x) = Σ_{xk ≤ x} pX(xk)
Example 4
• X = Number of heads obtained in tossing a
coin thrice
• RX = {0,1,2,3}
• The corresponding PMF is given by

xk       0    1    2    3
pX(xk)   1/8  3/8  3/8  1/8
• The CDF of X is given by

FX(x) = 0,    x < 0
      = 1/8,  0 ≤ x < 1
      = 4/8,  1 ≤ x < 2
      = 7/8,  2 ≤ x < 3
      = 1,    x ≥ 3
Example 5
If the random variable X takes values 1,2,3
and 4 such that
2P(X=1) = 3P(X=2) = P(X=3) = 5P(X=4)
Find the PMF of X.
Solution:
Let P(X=3) = k, then the PMF of X is
given by

xk       1    2    3  4
pX(xk)   k/2  k/3  k  k/5

Using the property Σk pX(xk) = 1,

pX(1) + pX(2) + pX(3) + pX(4) = 1

k/2 + k/3 + k + k/5 = 1  ⟹  (61/30)k = 1  ⟹  k = 30/61

xk       1      2      3      4
pX(xk)   15/61  10/61  30/61  6/61
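For an exact-arithmetic check of this solution (an added sketch, not in the original deck), Python's fractions module reproduces k and the PMF:

```python
from fractions import Fraction

# Weights of k in p(1)=k/2, p(2)=k/3, p(3)=k, p(4)=k/5.
weights = [Fraction(1, 2), Fraction(1, 3), Fraction(1), Fraction(1, 5)]

k = 1 / sum(weights)            # the weights sum to 61/30, so k = 30/61
pmf = [w * k for w in weights]

print(k)                        # 30/61
print([str(p) for p in pmf])    # ['15/61', '10/61', '30/61', '6/61']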
Example 6
A random variable X has the following
probability distribution

xk       -2   -1  0    1   2    3
pX(xk)   0.1  k   0.2  2k  0.3  3k

Find the CDF of X.
From the property of PMF,
0.1 + k + 0.2 + 2k + 0.3 + 3k = 1
Therefore, k = 1/15. Hence the PMF is

xk       -2    -1    0     1     2     3
pX(xk)   1/10  1/15  2/10  2/15  3/10  3/15

The CDF of X is given by

FX(x) = 0,      x < -2
      = 1/10,  -2 ≤ x < -1
      = 1/6,   -1 ≤ x < 0
      = 11/30,  0 ≤ x < 1
      = 1/2,    1 ≤ x < 2
      = 4/5,    2 ≤ x < 3
      = 1,      x ≥ 3
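The CDF jump values can be double-checked by accumulating the PMF; a short added sketch:

```python
from fractions import Fraction
from itertools import accumulate

# Rebuild the CDF jump values of Example 6 from its PMF.
k = Fraction(1, 15)
pmf = [Fraction(1, 10), k, Fraction(2, 10), 2 * k, Fraction(3, 10), 3 * k]

cdf = list(accumulate(pmf))
print([str(c) for c in cdf])   # ['1/10', '1/6', '11/30', '1/2', '4/5', '1']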
Example 7

• Determine the PMF of the random
variable Y if the CDF is given by

FY(y) = 0,    y < 2
      = 0.3,  2 ≤ y < 4
      = 0.8,  4 ≤ y < 6
      = 1,    y ≥ 6
Solution 7
pY(yk) = FY(yk) - FY(yk-1)

The PMF of Y is given by

yk       2    4    6
pY(yk)   0.3  0.5  0.2
Probability Density Function
Let X be a RV on a sample space S whose
range space RX is an interval on the real line.

A function f(x) is called the PDF of X
provided

f(x) ≥ 0 and ∫_{RX} f(x) dx = 1
Remarks
1. That is, f(x) is nonnegative and the total area
under its graph is 1

[Figure: a PDF curve with the area under f(x) between x = a and x = b shaded]

2. P(a ≤ X ≤ b) = ∫_a^b f(x) dx

3. f(x) = dFX(x)/dx
Example 8
Consider the function

f(x) = 2x,  0 < x < b
     = 0,   otherwise

For what values of b is f(x) a legitimate
PDF?
Solution 8
For f(x) to be a valid PDF in the specified
range,

∫_0^b f(x) dx = 1

∫_0^b 2x dx = [x²]_0^b = b² = 1

Thus b = 1.
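A one-line numerical confirmation (an added sketch using scipy, not part of the slides):

```python
from scipy.integrate import quad

# Check that b = 1 makes f(x) = 2x integrate to 1 over (0, b).
area, _ = quad(lambda x: 2 * x, 0, 1)
print(area)   # 1.0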
Example 9
The PDF of a continuous random variable
is given by

f(x) = 1/3,  0 ≤ x ≤ 1
     = 2/3,  1 < x ≤ 2
     = 0,    otherwise

Find the CDF.
Solution 9
Note that FX(x) = ∫_0^x f(u) du. The CDF is given by

FX(x) = 0,                                    x < 0
      = ∫_0^x (1/3) du = x/3,                 0 ≤ x ≤ 1
      = ∫_0^1 (1/3) du + ∫_1^x (2/3) du
        = (2x - 1)/3,                         1 < x ≤ 2
      = ∫_0^1 (1/3) du + ∫_1^2 (2/3) du = 1,  x > 2
Example 10
The CDF of a random variable Y is given
by

FY(y) = 0,           y < 0
      = 3y² - 2y³,   0 ≤ y ≤ 1
      = 1,           y > 1

Determine the PDF of Y.
Solution 10
f(y) = dFY(y)/dy

Using the above property, the PDF of Y
is given by

f(y) = 6y - 6y²,  0 ≤ y ≤ 1
Expectation of a Random Variable
If X is a random variable, then the
expectation (mean) of X is given by

E[X] = μX = Σk xk pX(xk),          X : discrete
          = ∫_{-∞}^∞ x fX(x) dx,   X : continuous
Proposition
For a nonnegative random variable X
with CDF FX(x), the expected value
is given by

E[X] = ∫_0^∞ P[X > x] dx = ∫_0^∞ [1 - FX(x)] dx
Proof:

Since P(X > x) = ∫_x^∞ fX(u) du,

∫_0^∞ P(X > x) dx = ∫_0^∞ [ ∫_x^∞ fX(u) du ] dx

Interchanging the order of integration over 0 ≤ x ≤ u < ∞,

∫_0^∞ [1 - FX(x)] dx = ∫_0^∞ [ ∫_0^u dx ] fX(u) du
                     = ∫_0^∞ u fX(u) du = E(X)
Moment of a Random Variable
The nth moment of a random variable X is
defined by

E(Xⁿ) = Σk xkⁿ pX(xk),          X : discrete
      = ∫_{-∞}^∞ xⁿ fX(x) dx,   X : continuous

Note that the mean of X is the first moment
of X.
Variance of a Random Variable
The variance of a random variable X, denoted
by σX² or Var(X), is defined by

σX² = Σk (xk - μX)² pX(xk),          X : discrete
    = ∫_{-∞}^∞ (x - μX)² fX(x) dx,   X : continuous

That is, Var(X) = E[(X - μX)²] is the second
moment about the mean.
Remarks:
1. The standard deviation of a
random variable, denoted by
σX, is the positive square root
of Var(X).
2. Var(X) = E(X²) - (E(X))²
Example 11
A test engineer discovered that the
CDF of the lifetime of an equipment in
years is given by

FX(x) = 0,             x < 0
      = 1 - e^{-x/5},  x ≥ 0

• What is the expected lifetime of the
equipment?
• What is the variance of the lifetime of
the equipment?
Solution 11
The expected lifetime of the equipment is
given by

E[X] = ∫_0^∞ P[X > x] dx
     = ∫_0^∞ [1 - FX(x)] dx = ∫_0^∞ e^{-x/5} dx = 5
To evaluate the variance, we first find
the PDF:

fX(x) = dFX(x)/dx = (1/5) e^{-x/5},  x ≥ 0
      = 0,                           otherwise

Thus the second moment of X is given by

E[X²] = ∫_{-∞}^∞ x² fX(x) dx = (1/5) ∫_0^∞ x² e^{-x/5} dx
Using Bernoulli's rule (repeated integration by parts),

E[X²] = (1/5) [ -5x² e^{-x/5} - 50x e^{-x/5} - 250 e^{-x/5} ]_0^∞

      = 250/5 = 50
Finally, the variance of X is given by

σX² = E[X²] - (E[X])²
    = 50 - 25 = 25
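A numerical cross-check of both answers (an added sketch with numpy/scipy, not in the original slides):

```python
import numpy as np
from scipy.integrate import quad

# Check of Example 11 using E[X] = ∫(1 - F(x))dx over (0, ∞).
F = lambda x: 1 - np.exp(-x / 5)      # CDF of the lifetime
f = lambda x: np.exp(-x / 5) / 5      # PDF, dF/dx

mean, _ = quad(lambda x: 1 - F(x), 0, np.inf)
second, _ = quad(lambda x: x**2 * f(x), 0, np.inf)

print(mean)               # ≈ 5.0
print(second - mean**2)   # ≈ 25.0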
Moment Generating Function

The moment generating function of
a random variable X is defined as

MX(t) = E(e^{tX}) = Σk e^{t xk} pX(xk),       X : discrete
                  = ∫_{-∞}^∞ e^{tx} f(x) dx,  X : continuous
Properties of MGF

1. The nth moment of a random variable X is
the coefficient of tⁿ/n! in the power series
expansion of MX(t) in terms of t, i.e.,

MX(t) = Σ_{n=0}^∞ (tⁿ/n!) E[Xⁿ]
Proof:
MX(t) = E[e^{tX}]
      = E[1 + tX/1! + t²X²/2! + t³X³/3! + …]
      = 1 + tE[X]/1! + t²E[X²]/2! + t³E[X³]/3! + …

since E[cX] = cE[X] for any constant c. Hence

MX(t) = Σ_{n=0}^∞ tⁿ E[Xⁿ]/n!
2. Conversely, the nth moment of X can be
obtained by successively differentiating
MX(t):

E[Xⁿ] = [dⁿMX(t)/dtⁿ]_{t=0}
MX(t) = 1 + tE[X]/1! + t²E[X²]/2! + t³E[X³]/3! + …

M'X(t) = E[X] + 2tE[X²]/2! + 3t²E[X³]/3! + …

In general,

dⁿMX(t)/dtⁿ = n!E[Xⁿ]/n! + t(n+1)!E[X^{n+1}]/(n+1)! + …
            = E[Xⁿ] + tE[X^{n+1}] + …

Therefore

[dⁿMX(t)/dtⁿ]_{t=0} = E[Xⁿ]
Example 12
Obtain the moment generating function of
the random variable X having the PDF

f(x) = 1/3,  -1 ≤ x ≤ 2
     = 0,    otherwise
Solution 12

MX(t) = E(e^{tX}) = ∫_{-∞}^∞ e^{tx} f(x) dx

      = (1/3) ∫_{-1}^2 e^{tx} dx = [e^{tx}/(3t)]_{-1}^2

      = (e^{2t} - e^{-t})/(3t),  t ≠ 0
      = 1,                       t = 0
Example 13
Find the MGF of a random variable X
whose moments are E[Xⁿ] = (n+1)! 2ⁿ.
Solution:

MX(t) = Σ_{n=0}^∞ tⁿE[Xⁿ]/n! = Σ_{n=0}^∞ ((n+1)!2ⁿ/n!) tⁿ

      = Σ_{n=0}^∞ (n+1)(2t)ⁿ = Σ_{n=0}^∞ n(2t)ⁿ + Σ_{n=0}^∞ (2t)ⁿ

      = 2t/(1-2t)² + 1/(1-2t) = 1/(1-2t)²
Example 14
If the random variable X has the MGF
MX(t) = 2/(2-t), determine the variance of X.
Solution:

M'X(t) = 2/(2-t)²,  M''X(t) = 4/(2-t)³

E[X] = M'X(0) = 1/2,
E[X²] = M''X(0) = 1/2,  Var(X) = 1/2 - 1/4 = 1/4
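The same moments fall out symbolically; a small sympy sketch (added, not from the deck):

```python
import sympy as sp

# Symbolic check of Example 14: moments by differentiating the MGF.
t = sp.symbols('t')
M = 2 / (2 - t)

EX = sp.diff(M, t).subs(t, 0)       # E[X]  = 1/2
EX2 = sp.diff(M, t, 2).subs(t, 0)   # E[X²] = 1/2

print(EX, EX2, EX2 - EX**2)         # 1/2 1/2 1/4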
Discrete Distributions
Bernoulli and Binomial
Geometric
Poisson
Negative Binomial
Bernoulli Distribution
A random variable X is said to follow the
Bernoulli distribution with parameter p if
its PMF is given by
p(0) = P(X = 0) = 1 - p
p(1) = P(X = 1) = p

where 0 < p < 1.
The CDF FX(x) of a Bernoulli random
variable is given by

FX(x) = 0,      x < 0
      = 1 - p,  0 ≤ x < 1
      = 1,      x ≥ 1

[Figure: the Bernoulli PMF with mass 1-p at x = 0 and p at x = 1, and the
staircase CDF jumping to 1-p at x = 0 and to 1 at x = 1]
The mean and variance are
given by
E(X) = 1·p + 0·q = p
E(X²) = 1²·p + 0²·q = p

Var(X) = p - p² = p(1-p) = pq
The moment generating function
is given by

MX(t) = Σk e^{t xk} p(xk)
      = e^{t·1} p(1) + e^{t·0} p(0)
      = pe^t + q
Binomial Distribution

A discrete random variable X is said
to follow the Binomial distribution if

p(i) = P(X = i) = nCi pⁱ qⁿ⁻ⁱ,  i = 0, 1, 2, …, n

where p + q = 1.
[Figure: the binomial PMF over x = 0, 1, …, n and the corresponding
staircase CDF]
The mean of a Binomial random variable is
given by

E(X) = Σ_{i=0}^n i p(i) = Σ_{i=0}^n i · nCi pⁱ qⁿ⁻ⁱ

     = Σ_{i=1}^n i (n!/(i!(n-i)!)) pⁱ qⁿ⁻ⁱ

     = Σ_{i=1}^n (n!/((i-1)!(n-i)!)) pⁱ qⁿ⁻ⁱ

     = np Σ_{i=1}^n ((n-1)!/((i-1)!(n-i)!)) p^{i-1} qⁿ⁻ⁱ

     = np (p + q)^{n-1} = np
Similarly, the second moment and hence
the variance of X are given by

E(X²) = Σ_{i=0}^n i² p(i)
      = Σ_{i=0}^n i² · nCi pⁱ (1-p)ⁿ⁻ⁱ
      = n(n-1)p² + np

Var(X) = E(X²) - (E(X))²
       = n(n-1)p² + np - n²p² = np(1-p) = npq
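As a sanity check (an added sketch, with n = 9 and p = 2/3 chosen arbitrarily), both moment formulas can be verified directly from the PMF:

```python
from math import comb

# Direct check of E(X) = np and Var(X) = npq from the binomial PMF.
n, p = 9, 2 / 3                 # arbitrary illustrative parameters
q = 1 - p
pmf = [comb(n, i) * p**i * q**(n - i) for i in range(n + 1)]

mean = sum(i * pmf[i] for i in range(n + 1))
second = sum(i * i * pmf[i] for i in range(n + 1))

print(mean, n * p)                      # both ≈ 6.0
print(second - mean**2, n * p * q)      # both ≈ 2.0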
The corresponding moment generating function
is given by

MX(t) = Σ_i e^{ti} p(i) = Σ_i e^{ti} · nCi pⁱ qⁿ⁻ⁱ

      = Σ_i nCi (pe^t)ⁱ qⁿ⁻ⁱ = (pe^t + q)ⁿ

Exercise: Verify the first and second moments from
the MGF.
Example 15

Five fair coins are tossed. If the
outcomes are assumed to be
independent, find the PMF of the
number of heads obtained.
Solution 15

Let X denote the number of heads in
tossing the coin five times. Then X is a
binomial random variable with n = 5 and
p = 1/2. The PMF of X is given by

X = xk    0     1     2      3      4     5
pX(xk)    1/32  5/32  10/32  10/32  5/32  1/32
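The same table can be regenerated from the binomial formula; a quick added check:

```python
from math import comb

# PMF of Example 15: number of heads in 5 independent fair-coin tosses.
n, p = 5, 0.5
for k in range(n + 1):
    print(k, comb(n, k) * p**k * (1 - p)**(n - k))
# prints 1/32, 5/32, 10/32, 10/32, 5/32, 1/32 as decimals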
Example 16
For a binomial distribution with mean 6 and
standard deviation √2, find the first two
terms of the distribution.
Solution:
Given np = 6 and npq = 2, we get p = 2/3, q = 1/3,
and n = 9.

P(X = 0) = (1/3)⁹

P(X = 1) = 9C1 (2/3)(1/3)⁸ = 2/3⁷
Example 17
If the probability of success is 0.09, how
many trials are needed for the probability of
at least one success to be 1/3 or more?

Solution: P(at least one success) ≥ 1/3

1 - P(X = 0) ≥ 1/3  ⟹  1 - (0.91)ⁿ ≥ 1/3

(0.91)ⁿ ≤ 2/3  ⟹  n ≥ 5
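A brute-force confirmation of the smallest such n (an added sketch):

```python
# Find the smallest n with 1 - (0.91)**n >= 1/3 (Example 17).
n = 1
while 1 - 0.91**n < 1 / 3:
    n += 1
print(n)   # 5, since 0.91**4 ≈ 0.686 > 2/3 but 0.91**5 ≈ 0.624 <= 2/3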
Geometric Distribution
• A discrete random variable X that
counts the number of Bernoulli
trials up to and including the first
success is said to follow the
Geometric distribution with parameter
p, where p denotes the probability of
success
• The PMF of X is given by

P(X = i) = (1-p)^{i-1} p,  i = 1, 2, 3, …