
Lecture 13

Joint Probability Distributions

Definition: The function f(x, y) is a joint density function of the continuous random variables X
and Y if

1. f(x, y) ≥ 0, for all (x, y),


2. ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1,

3. P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy, for any region A in the xy-plane.

Definition: The marginal distributions of X alone and of Y alone are


g(x) = ∫_{-∞}^{∞} f(x, y) dy and h(y) = ∫_{-∞}^{∞} f(x, y) dx

for the continuous case.
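Remark: these definitions are easy to check mechanically. A minimal Python/SymPy sketch (not part of the original notes), using a hypothetical density f(x, y) = e^{-x-y} on the first quadrant:

    import sympy as sp

    x, y = sp.symbols('x y', positive=True)
    f = sp.exp(-x - y)  # hypothetical joint density on x > 0, y > 0

    # Marginals: integrate out the other variable over its full range.
    g = sp.integrate(f, (y, 0, sp.oo))   # g(x) = exp(-x)
    h = sp.integrate(f, (x, 0, sp.oo))   # h(y) = exp(-y)

    # Property 2: the joint density integrates to 1 over the whole plane.
    total = sp.integrate(f, (x, 0, sp.oo), (y, 0, sp.oo))
    print(g, h, total)  # exp(-x) exp(-y) 1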

Statistical Independence

Definition: Let X and Y be two random variables, discrete or continuous, with joint probability
distribution f(x, y) and marginal distributions g(x) and h(y), respectively.
The random variables X and Y are said to be statistically independent if and only if

f(x, y) = g(x)h(y)
for all (x, y) within their range.

Definition: Let X_1, X_2, . . . , X_n be n random variables, discrete or continuous, with joint probability distribution f(x_1, x_2, . . . , x_n) and marginal distributions f_1(x_1), f_2(x_2), . . . , f_n(x_n), respectively. The random variables X_1, X_2, . . . , X_n are said to be mutually statistically independent if and only if

f(x_1, x_2, . . . , x_n) = f_1(x_1) f_2(x_2) · · · f_n(x_n)

for all (x_1, x_2, . . . , x_n) within their range.
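Remark: for the hypothetical density e^{-x-y} used above, independence can be verified by checking that the joint density factors exactly into the product of its marginals; a short SymPy sketch:

    import sympy as sp

    x, y = sp.symbols('x y', positive=True)
    f = sp.exp(-x - y)                   # hypothetical joint density
    g = sp.integrate(f, (y, 0, sp.oo))   # marginal of X
    h = sp.integrate(f, (x, 0, sp.oo))   # marginal of Y

    # X and Y are independent iff f(x, y) - g(x)h(y) is identically zero.
    print(sp.simplify(f - g * h) == 0)   # True: X and Y are independent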

Example 12.2: A privately owned business operates both a drive-in facility and a walk-in
facility. On a randomly selected day, let X and Y, respectively, be the proportions
of the time that the drive-in and the walk-in facilities are in use, and suppose that
the joint density function of these random variables is
2
 (2 x + 3 y ), 0 ≤ x ≤ 1,0 ≤ y ≤ 1,
f(x, y) =  5
 0, elsewhere.
(a) Verify that f(x, y) is a joint density function.
(b) Find P[(X, Y) ∈ A], where A = {(x, y) | 0 < x < 1/2, 1/4 < y < 1/2}.
(c) Find the marginal distributions g(x) of X alone and h(y) of Y alone.

Solution: (a) The integration of f(x, y) over the whole region is


∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = ∫_0^1 ∫_0^1 (2/5)(2x + 3y) dx dy

= ∫_0^1 (2x²/5 + 6xy/5) |_{x=0}^{x=1} dy

= ∫_0^1 (2/5 + 6y/5) dy = (2y/5 + 3y²/5) |_0^1 = 2/5 + 3/5 = 1.
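Remark: the normalization can also be confirmed numerically; a small sketch using scipy.integrate.dblquad (whose integrand takes the inner variable first):

    from scipy.integrate import dblquad

    # f(x, y) = (2/5)(2x + 3y) on the unit square; dblquad integrates
    # func(y, x) with y as the inner variable.
    f = lambda y, x: 0.4 * (2 * x + 3 * y)

    total, err = dblquad(f, 0, 1, lambda x: 0, lambda x: 1)
    print(total)  # 1.0, up to quadrature error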

(b) To calculate the probability, we use


P[(X, Y) ∈ A] = P(0 < X < 1/2, 1/4 < Y < 1/2)

= ∫_{1/4}^{1/2} ∫_0^{1/2} (2/5)(2x + 3y) dx dy

= ∫_{1/4}^{1/2} (2x²/5 + 6xy/5) |_{x=0}^{x=1/2} dy = ∫_{1/4}^{1/2} (1/10 + 3y/5) dy

= (y/10 + 3y²/10) |_{1/4}^{1/2} = 13/160.

(c) By definition,
g(x) = ∫_{-∞}^{∞} f(x, y) dy = ∫_0^1 (2/5)(2x + 3y) dy = (4xy/5 + 3y²/5) |_{y=0}^{y=1} = (4x + 3)/5,

for 0 ≤ x ≤ 1, and g(x) = 0 elsewhere. Similarly,

h(y) = ∫_{-∞}^{∞} f(x, y) dx = ∫_0^1 (2/5)(2x + 3y) dx = 2(1 + 3y)/5,

for 0 ≤ y ≤ 1, and h(y) = 0 elsewhere.
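Remark: the marginals fall out of a one-line symbolic integration; a SymPy sketch:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = sp.Rational(2, 5) * (2 * x + 3 * y)

    g = sp.integrate(f, (y, 0, 1))  # (4x + 3)/5
    h = sp.integrate(f, (x, 0, 1))  # 2(1 + 3y)/5
    print(sp.simplify(g), sp.simplify(h))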

Example 12.6: Suppose that the shelf life, in years, of a certain perishable food product packaged
in cardboard containers is a random variable whose probability density function is
given by

f(x) = { e^{-x},  x > 0,
       { 0,       elsewhere.

Let X_1, X_2, and X_3 represent the shelf lives for three of these containers selected independently and find P(X_1 < 2, 1 < X_2 < 3, X_3 > 2).



Solution: Since the containers were selected independently, we can assume that the random
variables X 1 , X 2 , and X 3 are statistically independent, having the joint probability density
f(x_1, x_2, x_3) = f(x_1) f(x_2) f(x_3) = e^{-x_1} e^{-x_2} e^{-x_3} = e^{-x_1 - x_2 - x_3},

for x_1 > 0, x_2 > 0, x_3 > 0, and f(x_1, x_2, x_3) = 0 elsewhere. Hence

P(X_1 < 2, 1 < X_2 < 3, X_3 > 2) = ∫_2^∞ ∫_1^3 ∫_0^2 e^{-x_1 - x_2 - x_3} dx_1 dx_2 dx_3

= (1 − e^{-2})(e^{-1} − e^{-3}) e^{-2} = 0.0372.
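Remark: since independence reduced the triple integral to a product of three one-dimensional integrals, the final number is a one-liner to check:

    import math

    p = (1 - math.exp(-2)) * (math.exp(-1) - math.exp(-3)) * math.exp(-2)
    print(round(p, 4))  # 0.0372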
Definition: Let X and Y be random variables with joint probability distribution f(x, y). The mean,
or expected value, of the random variable g(X, Y) is
μ_{g(X,Y)} = E[g(X, Y)] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} g(x, y) f(x, y) dx dy if X and Y are continuous.

Note that if g(X, Y) = X, we have


E(X) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} x f(x, y) dy dx = ∫_{-∞}^{∞} x g(x) dx (continuous case),

where g(x) is the marginal distribution of X.


Similarly, we define
E(Y) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} y f(x, y) dx dy = ∫_{-∞}^{∞} y h(y) dy (continuous case),

where h(y) is the marginal distribution of the random variable Y.
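Remark: both routes give the same mean. A sketch that reuses the Example 12.2 density and computes E(X) from the joint density and from the marginal g(x):

    import sympy as sp

    x, y = sp.symbols('x y')
    f = sp.Rational(2, 5) * (2 * x + 3 * y)  # Example 12.2 density
    g = sp.integrate(f, (y, 0, 1))           # marginal of X: (4x + 3)/5

    ex_joint = sp.integrate(x * f, (y, 0, 1), (x, 0, 1))
    ex_marg = sp.integrate(x * g, (x, 0, 1))
    print(ex_joint, ex_marg)  # 17/30 17/30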


Definition: Let X and Y be random variables with joint probability distribution f(x, y). The
covariance of X and Y is
σ_XY = E[(X − μ_X)(Y − μ_Y)] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} (x − μ_X)(y − μ_Y) f(x, y) dx dy, if X and Y are continuous.

Theorem: The covariance of two random variables X and Y with means μ_X and μ_Y, respectively, is given by

Cov(X, Y) = σ_XY = E(XY) − μ_X μ_Y = E(XY) − E(X)E(Y).
Theorem: Let X and Y be two independent random variables. Then
E(XY) = E(X)E(Y).
Corollary: Let X and Y be two independent random variables. Then σ_XY = 0.
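Remark: the shortcut formula is easy to verify symbolically. A sketch again using the Example 12.2 density (the notes do not compute this covariance; the value −1/150 below is only an illustration):

    import sympy as sp

    x, y = sp.symbols('x y')
    f = sp.Rational(2, 5) * (2 * x + 3 * y)

    ex = sp.integrate(x * f, (x, 0, 1), (y, 0, 1))       # E(X)  = 17/30
    ey = sp.integrate(y * f, (x, 0, 1), (y, 0, 1))       # E(Y)  = 3/5
    exy = sp.integrate(x * y * f, (x, 0, 1), (y, 0, 1))  # E(XY) = 1/3

    # Definition and shortcut agree.
    direct = sp.integrate((x - ex) * (y - ey) * f, (x, 0, 1), (y, 0, 1))
    print(direct, exy - ex * ey)  # both -1/150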

Definition: Let X and Y be random variables with covariance σ_XY and standard deviations σ_X and σ_Y, respectively. The correlation coefficient of X and Y is

ρ_XY = σ_XY / (σ_X σ_Y), with −1 ≤ ρ_XY ≤ 1.



σ_X = √(E(X²) − E²(X)),  σ_XY = E(XY) − E(X)E(Y),  σ_Y = √(E(Y²) − E²(Y)).
Example 12.10: The fraction X of male runners and the fraction Y of female runners who
compete in marathon races are described by the joint density function
f(x, y) = { 8xy,  0 ≤ y ≤ x ≤ 1,
          { 0,    elsewhere.
Find the correlation coefficient between X and Y.
Solution: We first compute the marginal density functions. They are
g(x) = { 4x³,  0 ≤ x ≤ 1,
       { 0,    elsewhere,

and

h(y) = { 4y(1 − y²),  0 ≤ y ≤ 1,
       { 0,           elsewhere.
From these marginal density functions, we compute
μ_X = E(X) = ∫_0^1 4x⁴ dx = 4/5 and μ_Y = E(Y) = ∫_0^1 4y²(1 − y²) dy = 8/15,

E(X²) = ∫_0^1 4x⁵ dx = 2/3 and E(Y²) = ∫_0^1 4y³(1 − y²) dy = 1 − 2/3 = 1/3,

σ_X² = 2/3 − (4/5)² = 2/75 and σ_Y² = 1/3 − (8/15)² = 11/225.

From the joint density function given above, we have


E(XY) = ∫_0^1 ∫_y^1 8x²y² dx dy = 4/9.

Then σ_XY = E(XY) − μ_X μ_Y = 4/9 − (4/5)(8/15) = 4/225.

Hence, ρ_XY = σ_XY / (σ_X σ_Y) = (4/225) / √((2/75)(11/225)) = 4/√66.
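Remark: the sign of σ_XY is easy to get wrong here (the density puts more weight where x and y are both large, so the covariance is positive). A SymPy sketch of the whole computation, with the inner limits y ≤ x ≤ 1:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = 8 * x * y  # joint density on 0 <= y <= x <= 1

    ex = sp.integrate(x * f, (x, y, 1), (y, 0, 1))       # 4/5
    ey = sp.integrate(y * f, (x, y, 1), (y, 0, 1))       # 8/15
    exy = sp.integrate(x * y * f, (x, y, 1), (y, 0, 1))  # 4/9

    vx = sp.integrate(x**2 * f, (x, y, 1), (y, 0, 1)) - ex**2  # 2/75
    vy = sp.integrate(y**2 * f, (x, y, 1), (y, 0, 1)) - ey**2  # 11/225

    rho = (exy - ex * ey) / sp.sqrt(vx * vy)
    print(sp.simplify(rho))  # 2*sqrt(66)/33, i.e. 4/sqrt(66), about 0.492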
Theorem: If X and Y are random variables with joint probability distribution f(x, y) and a, b, and
c are constants, then
σ²_{aX+bY+c} = a²σ_X² + b²σ_Y² + 2ab σ_XY.
Corollary 1: Setting b = 0, we see that

σ²_{aX+c} = a²σ_X² = a²σ².

Corollary 2: Setting a = 1 and b = 0, we see that

σ²_{X+c} = σ_X² = σ².

Corollary 3: Setting b = 0 and c = 0, we see that

σ²_{aX} = a²σ_X² = a²σ².

Corollary 4: If X and Y are independent random variables, then

σ²_{aX+bY} = a²σ_X² + b²σ_Y².
Corollary 5: If X and Y are independent random variables, then
σ²_{aX−bY} = a²σ_X² + b²σ_Y².
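Remark: the variance theorem and its corollaries lend themselves to a Monte Carlo sanity check. A NumPy sketch with correlated normal pairs (the constants a = 2, b = −1, c = 3 and the covariance matrix are arbitrary choices, not from the notes):

    import numpy as np

    rng = np.random.default_rng(0)
    var_x, var_y, cov_xy = 4.0, 9.0, 2.5  # arbitrary test values
    a, b, c = 2.0, -1.0, 3.0

    cov = [[var_x, cov_xy], [cov_xy, var_y]]
    xy = rng.multivariate_normal([0.0, 0.0], cov, size=1_000_000)
    z = a * xy[:, 0] + b * xy[:, 1] + c

    theory = a**2 * var_x + b**2 * var_y + 2 * a * b * cov_xy
    print(z.var(), theory)  # both close to 15.0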
Example: The continuous random variables X & Y have the joint density function given by:
f(x, y) = { C(6 − x − y),  0 < x < 2, 2 < y < 4,
          { 0,             elsewhere.
(a) Calculate C.
(b) Find P(X + Y > 3).
(c) Find the marginal distributions g(x) of X alone and h(y) of Y alone.
(d) Check independence of X & Y.
(e) Find Cov(2X, 5Y).
(f) Find the correlation coefficient between X and Y.
Solution: (a) ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = ∫_2^4 ∫_0^2 C(6 − x − y) dx dy = 1

⇒ C ∫_2^4 (6x − x²/2 − xy) |_{x=0}^{x=2} dy = 1

⇒ C ∫_2^4 (10 − 2y) dy = 1

⇒ C (10y − y²) |_{y=2}^{y=4} = 1

⇒ C(24 − 16) = 1 ⇒ C = 1/8.
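Remark: a numeric cross-check of (a); the double integral of 6 − x − y over the rectangle equals 8, so C = 1/8:

    from scipy.integrate import dblquad

    # Inner variable x on (0, 2), outer variable y on (2, 4); dblquad's
    # integrand takes the inner variable first.
    total, err = dblquad(lambda x, y: 6 - x - y, 2, 4, lambda y: 0, lambda y: 2)
    print(total, 1 / total)  # 8.0, so C = 1/8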
(b) To calculate the probability, we use
P[(X, Y) ∈ A] = P(X + Y > 3)

= ∫_2^3 ∫_{3−y}^2 (1/8)(6 − x − y) dx dy + ∫_3^4 ∫_0^2 (1/8)(6 − x − y) dx dy

= ∫_2^3 (1/8)(6x − x²/2 − xy) |_{x=3−y}^{x=2} dy + ∫_3^4 (1/8)(6x − x²/2 − xy) |_{x=0}^{x=2} dy

= −(1/16) ∫_2^3 (y² − 8y + 7) dy + (1/8) ∫_3^4 (10 − 2y) dy = 19/24.

Method 2:

P(X + Y > 3) = 1 − ∫_2^3 ∫_0^{3−y} (1/8)(6 − x − y) dx dy

= 1 − ∫_2^3 (1/8)(6x − x²/2 − xy) |_{x=0}^{x=3−y} dy = 19/24.
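Remark: the split region in Method 1 can be folded into a single variable limit; a dblquad sketch where, for each y, x runs from max(0, 3 − y) to 2:

    from scipy.integrate import dblquad

    f = lambda x, y: (6 - x - y) / 8  # integrand, inner variable x first

    p, err = dblquad(f, 2, 4, lambda y: max(0.0, 3 - y), lambda y: 2)
    print(p, 19 / 24)  # both 0.7916...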



(c) By definition,
g(x) = ∫_{-∞}^{∞} f(x, y) dy = ∫_2^4 (1/8)(6 − x − y) dy = (1/4)(3 − x),

for 0 ≤ x ≤ 2, and g(x) = 0 elsewhere. Similarly,

h(y) = ∫_{-∞}^{∞} f(x, y) dx = ∫_0^2 (1/8)(6 − x − y) dx = (1/4)(5 − y),

for 2 ≤ y ≤ 4, and h(y) = 0 elsewhere.

(d) f(x, y) = (1/8)(6 − x − y) ≠ (1/4)(3 − x) · (1/4)(5 − y) = g(x)h(y)

⇒ X & Y are not independent.
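Remark: the failure of independence in (d) can be checked by simplifying f − g·h, which is not identically zero; a SymPy sketch:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = (6 - x - y) / 8
    g = sp.integrate(f, (y, 2, 4))  # (3 - x)/4
    h = sp.integrate(f, (x, 0, 2))  # (5 - y)/4

    print(sp.simplify(f - g * h))   # nonzero, so X and Y are not independent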
(e) Cov(2X, 5Y) = σ_{2X,5Y} = E(2X · 5Y) − μ_{2X} μ_{5Y} = 10[E(XY) − μ_X μ_Y].

From the marginal density functions

g(x) = { (1/4)(3 − x),  0 ≤ x ≤ 2,
       { 0,             elsewhere,

and

h(y) = { (1/4)(5 − y),  2 ≤ y ≤ 4,
       { 0,             elsewhere,

we compute
μ_X = E(X) = ∫_0^2 (x/4)(3 − x) dx = 5/6 and μ_Y = E(Y) = ∫_2^4 (y/4)(5 − y) dy = 17/6,

E(X²) = ∫_0^2 (x²/4)(3 − x) dx = 1 and E(Y²) = ∫_2^4 (y²/4)(5 − y) dy = 25/3,

σ_X² = 1 − (5/6)² = 11/36 and σ_Y² = 25/3 − (17/6)² = 11/36.

From the joint density function given above, we have


E(XY) = ∫_2^4 ∫_0^2 (xy/8)(6 − x − y) dx dy = 56/24 = 7/3.

Then

σ_XY = E(XY) − μ_X μ_Y = 7/3 − (5/6)(17/6) = −1/36.

⇒ σ_{2X,5Y} = 10 σ_XY = −5/18.
Hence,



ρ_XY = σ_XY / (σ_X σ_Y) = (−1/36) / √((11/36)(11/36)) = −1/11. Try ρ_{2X,5Y}!
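Remark: parts (e) and (f) condensed into a symbolic check; a SymPy sketch verifying Cov(2X, 5Y) = −5/18 and ρ_XY = −1/11:

    import sympy as sp

    x, y = sp.symbols('x y')
    f = (6 - x - y) / 8

    ex = sp.integrate(x * f, (x, 0, 2), (y, 2, 4))       # 5/6
    ey = sp.integrate(y * f, (x, 0, 2), (y, 2, 4))       # 17/6
    exy = sp.integrate(x * y * f, (x, 0, 2), (y, 2, 4))  # 7/3

    cov = exy - ex * ey                                         # -1/36
    vx = sp.integrate(x**2 * f, (x, 0, 2), (y, 2, 4)) - ex**2   # 11/36
    vy = sp.integrate(y**2 * f, (x, 0, 2), (y, 2, 4)) - ey**2   # 11/36

    print(10 * cov, cov / sp.sqrt(vx * vy))  # -5/18 and -1/11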
