
DAYANANDA SAGAR COLLEGE OF ENGINEERING

(An Autonomous Institute Affiliated to VTU, Belagavi)


Shavige Malleshwara Hills, Kumaraswamy Layout, Bengaluru-560078
DEPARTMENT OF MATHEMATICS

Course Material

COURSE PROBABILITY THEORY AND OPTIMIZATION

COURSE CODE 22MAT41D

MODULE 4

MODULE NAME JOINT PROBABILITY DISTRIBUTIONS


AND QUEUEING THEORY
STAFF INCHARGE Dr. AMRUTHALAKLSHMI M R


Objectives:

At the end of this Module, the student will be able to:

1. Understand what is meant by a joint pmf, pdf and cdf of two random variables.

2. Compute probabilities and marginals from a joint pmf or pdf.

3. Test whether two random variables are independent.

4. Visualize a Markov chain as a random process moving between different states.

5. Understand the key property of a Markov chain: if one knows the current state of the process, then no additional information about its past states is required to make the best possible prediction of its future.


JOINT PROBABILITY DISTRIBUTION

Introduction: It is relatively easy to understand and compute the probability for a single
variable. In machine learning, however, we often have many random variables that
interact in complex and often unknown ways. There are specific techniques that can be used
to quantify the probability for multiple random variables, such as the joint, marginal, and
conditional probability. These techniques provide the basis for a probabilistic understanding
of fitting a predictive model to data.

Joint Probability and Joint Distribution: If X and Y are two discrete random variables,
we define the joint probability function of X and Y by

P(X = x, Y = y) = f(x, y)

where f(x, y) satisfies the conditions f(x, y) ≥ 0 and ∑x ∑y f(x, y) = 1.

The second condition means that the sum over all the values of x and y is equal to 1.

Suppose X = {x1, x2, x3, …, xm} and Y = {y1, y2, y3, …, yn}; then

P(X = xi, Y = yj) is denoted by Jij.

It should be noted that f(x, y) is a function on the Cartesian product of the sets X and Y, as
we have

X × Y = {(x1, y1), (x1, y2), (x1, y3), …, (xm, yn)}

f is also known as the joint probability function of X and Y in the respective order. The
set of values of this function, f(xi, yj) = Jij for i = 1, 2, 3, …, m and j = 1, 2, 3, …, n, is
called the joint probability distribution of X and Y.


The joint probability distribution table is given as:

       Y:   y1      y2      y3      …     yn      Sum
x1         J11     J12     J13     …     J1n     f(x1)
x2         J21     J22     J23     …     J2n     f(x2)
…          …       …       …       …     …       …
xm         Jm1     Jm2     Jm3     …     Jmn     f(xm)
Sum        g(y1)   g(y2)   g(y3)   …     g(yn)   1

In the joint probability table, f(x1), f(x2), …, f(xm) respectively represent the sums of all
the entries in the first row, second row, …, mth row, and g(y1), g(y2), …, g(yn) respectively
represent the sums of all the entries in the first column, second column, …, nth column.
That is,

f(x1) = J11 + J12 + … + J1n ;  g(y1) = J11 + J21 + … + Jm1

f(x2) = J21 + J22 + … + J2n ;  g(y2) = J12 + J22 + … + Jm2

……………………………………… ;  ………………………………………

f(xm) = Jm1 + Jm2 + … + Jmn ;  g(yn) = J1n + J2n + … + Jmn


{ 𝑓(𝑥1 ), 𝑓(𝑥2 ), … … . . 𝑓(𝑥𝑚 )} and { 𝑔(𝑦1 ), 𝑔(𝑦2 ), … … … 𝑔(𝑦𝑛 )} are called marginal
distributions of 𝑋 𝑎𝑛𝑑 𝑌 respectively.

It should be noted that

f(x1) + f(x2) + … + f(xm) = 1 ,  g(y1) + g(y2) + … + g(yn) = 1  and

∑ᵢ₌₁ᵐ ∑ⱼ₌₁ⁿ f(xi, yj) = ∑ᵢ₌₁ᵐ ∑ⱼ₌₁ⁿ Jij = 1

That is, the sum of all the entries in the joint probability table is equal to 1.

Independent Random Variables: The discrete random variables X and Y are said to be
independent random variables if

P(X = x, Y = y) = P(X = x) P(Y = y)

That is, P(X = xi, Y = yj) = P(X = xi) P(Y = yj).

This is equivalent to f(xi) g(yj) = Jij for every cell in the joint probability table.
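As a sketch, the marginals and the independence check can be computed directly from a joint table; the table below is an illustrative example chosen for this snippet, not one of the worked problems:

```python
from fractions import Fraction as F

# Illustrative joint table J[i][j] = P(X = x_i, Y = y_j); the values are made up
# for the example (a uniform table, which happens to be independent)
x = [0, 1]
y = [0, 1, 2]
J = [[F(1, 6), F(1, 6), F(1, 6)],
     [F(1, 6), F(1, 6), F(1, 6)]]

# Marginals: f(x_i) is the i-th row sum, g(y_j) the j-th column sum
f = [sum(row) for row in J]
g = [sum(col) for col in zip(*J)]
assert sum(f) == 1 and sum(g) == 1   # all entries of the table sum to 1

# X and Y are independent iff J_ij = f(x_i) g(y_j) for every cell
independent = all(J[i][j] == f[i] * g[j]
                  for i in range(len(x)) for j in range(len(y)))
print(f, g, independent)
```

Using exact fractions avoids floating-point noise when comparing cells against products of marginals.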

Expectation, Variance and Covariance:


If X is a discrete random variable taking values x1, x2, …, xn with probability function
f(x), then the expectation of X, denoted by E(X) or μX, is defined by the relation

E(X) = μX = ∑ᵢ₌₁ⁿ xi f(xi), or ∑ x f(x)

The variance of X, denoted by V(X), is defined by the relation

V(X) = ∑ᵢ₌₁ⁿ (xi − μ)² f(xi) = E[(X − μ)²], where μ is the mean of X.

σX = √V(X) is called the standard deviation (S.D.) of X. Equivalently,

σX² = E(X²) − [E(X)]²
σY² = E(Y²) − [E(Y)]²

If X and Y are two discrete random variables having the joint probability function f(x, y),
then the expectations of X and Y are given as follows:

μX = E(X) = ∑x ∑y x f(x, y) = ∑i xi f(xi)


μY = E(Y) = ∑x ∑y y f(x, y) = ∑j yj g(yj)

Further, E(XY) = μXY = ∑ᵢ,ⱼ xi yj Jij


If X and Y are random variables having means μX and μY respectively, then the covariance of
X and Y is given by

COV(X, Y) = E(XY) − E(X)E(Y) = E(XY) − μX μY

The correlation of X and Y, denoted by ρ(X, Y), is defined by the relation

ρ(X, Y) = COV(X, Y) / (σX σY)

Note: If X and Y are independent random variables, then
(i) E(XY) = E(X)E(Y)
(ii) COV(X, Y) = 0 and ρ(X, Y) = 0
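These formulas translate directly into code. The following sketch uses a small made-up 2×2 table (not from the notes) to show the full chain from joint table to ρ(X, Y):

```python
from math import sqrt

# Small illustrative joint table; values chosen for the example only
x = [0, 1]
y = [0, 1]
J = [[0.1, 0.2],
     [0.3, 0.4]]

f = [sum(row) for row in J]          # marginal of X (row sums)
g = [sum(col) for col in zip(*J)]    # marginal of Y (column sums)

E_X  = sum(xi * fi for xi, fi in zip(x, f))
E_Y  = sum(yj * gj for yj, gj in zip(y, g))
E_XY = sum(x[i] * y[j] * J[i][j] for i in range(2) for j in range(2))

cov   = E_XY - E_X * E_Y                                   # COV(X, Y)
var_X = sum(xi**2 * fi for xi, fi in zip(x, f)) - E_X**2   # sigma_X^2
var_Y = sum(yj**2 * gj for yj, gj in zip(y, g)) - E_Y**2   # sigma_Y^2
rho   = cov / sqrt(var_X * var_Y)                          # correlation
print(cov, rho)
```

For this table Cov(X, Y) = 0.4 − (0.7)(0.6) = −0.02, so X and Y are (slightly) negatively correlated.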

Continuous Random Variables: Let X and Y be two continuous random variables. If
f(x, y) is a real-valued function satisfying the conditions

f(x, y) ≥ 0 and ∫₋∞^∞ ∫₋∞^∞ f(x, y) dx dy = 1

then f(x, y) is called the joint probability function or the joint density function of the
random variables X and Y. Further,

P(a ≤ X ≤ b, c ≤ Y ≤ d) = ∫ₐᵇ ∫꜀ᵈ f(x, y) dy dx

Marginal Distributions:

P(X ≤ x) = F1(x) = ∫ᵤ₌₋∞ˣ ∫ᵥ₌₋∞^∞ f(u, v) dv du is called the marginal distribution of X.

P(Y ≤ y) = F2(y) = ∫ᵤ₌₋∞^∞ ∫ᵥ₌₋∞ʸ f(u, v) dv du is called the marginal distribution of Y.

The derivatives of F1(x) with respect to x and of F2(y) with respect to y are denoted by
f1(x) and f2(y) respectively. They are given by

f1(x) = ∫ᵥ₌₋∞^∞ f(x, v) dv, called the marginal density function of X.

f2(y) = ∫ᵤ₌₋∞^∞ f(u, y) du, called the marginal density function of Y.

The variables X and Y are said to be independent if f1(x)·f2(y) = f(x, y).
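As a numerical sketch, a marginal density can be recovered from a joint density by integrating out the other variable; here we use the illustrative density f(x, y) = 4xy on the unit square (the same density as in Problem (7) below), whose marginal is f1(x) = 2x:

```python
# Recover f1(x) = integral over y of f(x, y) dy numerically, using a
# midpoint Riemann sum; f(x, y) = 4xy on [0,1] x [0,1], so f1(x) = 2x
n = 1000
h = 1.0 / n

def f(x, y):
    # joint density, zero outside the unit square
    return 4 * x * y if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def f1(x):
    # midpoint rule over y with n subintervals
    return sum(f(x, (j + 0.5) * h) * h for j in range(n))

print(round(f1(0.25), 4), round(f1(0.5), 4))   # approximately 0.5 and 1.0
```

The midpoint rule is exact here because the integrand is linear in y, so the printed values match 2x to rounding.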


Expectation, Variance and Covariance:

μX = E(X) = ∫₋∞^∞ ∫₋∞^∞ x f(x, y) dx dy

μY = E(Y) = ∫₋∞^∞ ∫₋∞^∞ y f(x, y) dx dy

V(X) = σX² = ∫₋∞^∞ ∫₋∞^∞ (x − μX)² f(x, y) dx dy = E[(X − μX)²]

V(Y) = σY² = ∫₋∞^∞ ∫₋∞^∞ (y − μY)² f(x, y) dx dy = E[(Y − μY)²]

COV(X, Y) = ∫₋∞^∞ ∫₋∞^∞ (x − μX)(y − μY) f(x, y) dx dy = E[(X − μX)(Y − μY)]

= E(XY) − μX μY

Note: If X and Y are independent random variables, then

(i) E(XY) = E(X)E(Y)
(ii) COV(X, Y) = 0 and ρ(X, Y) = 0

Problems:
(1) The joint probability distribution of two random variables X and Y is given as:

       Y:   1      3      9
X = 2      1/8   1/24   1/12
X = 4      1/4   1/4    0
X = 6      1/8   1/24   1/12

Find (i) the marginal distributions of X and Y, and (ii) Cov(X, Y).


Solution:
(i) The marginal distributions of X and Y are as follows:

xi:      2                        4                      6
f(xi):   1/8 + 1/24 + 1/12 = 1/4  1/4 + 1/4 + 0 = 1/2    1/8 + 1/24 + 1/12 = 1/4

yj:      1                      3                         9
g(yj):   1/8 + 1/4 + 1/8 = 1/2  1/24 + 1/4 + 1/24 = 1/3   1/12 + 0 + 1/12 = 1/6

(ii) Cov(X, Y) = E(XY) − E(X)E(Y)

E(X) = ∑ᵢ xi f(xi) = 2(1/4) + 4(1/2) + 6(1/4) = 4

E(Y) = ∑ⱼ yj g(yj) = 1(1/2) + 3(1/3) + 9(1/6) = 3

E(XY) = ∑ᵢ,ⱼ xi yj Jij
= (2)(1)(1/8) + (2)(3)(1/24) + (2)(9)(1/12) + (4)(1)(1/4) + (4)(3)(1/4)
+ (4)(9)(0) + (6)(1)(1/8) + (6)(3)(1/24) + (6)(9)(1/12) = 12

Cov(X, Y) = E(XY) − E(X)E(Y) = 12 − (4)(3) = 0
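The arithmetic above can be checked mechanically; a sketch in Python using exact fractions:

```python
from fractions import Fraction as F

# Check of Problem (1): marginals, E(XY) and Cov(X, Y) for the given table
x = [2, 4, 6]
y = [1, 3, 9]
J = [[F(1, 8), F(1, 24), F(1, 12)],
     [F(1, 4), F(1, 4),  F(0)],
     [F(1, 8), F(1, 24), F(1, 12)]]

f = [sum(row) for row in J]          # marginal of X
g = [sum(col) for col in zip(*J)]    # marginal of Y
E_X  = sum(xi * fi for xi, fi in zip(x, f))
E_Y  = sum(yj * gj for yj, gj in zip(y, g))
E_XY = sum(x[i] * y[j] * J[i][j] for i in range(3) for j in range(3))
cov  = E_XY - E_X * E_Y
print(E_X, E_Y, E_XY, cov)   # prints: 4 3 12 0
```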
(2) The joint probability distribution of two random variables X and Y is given as:

       Y:   −4     2     7
X = 1      1/8   1/4   1/8
X = 5      1/4   1/8   1/8

Compute (i) Cov(X, Y) and (ii) ρ(X, Y).


Solution:
Marginal distribution of X:

xi:      1                        5
f(xi):   1/8 + 1/4 + 1/8 = 1/2    1/4 + 1/8 + 1/8 = 1/2

Marginal distribution of Y:

yj:      −4                 2                  7
g(yj):   1/8 + 1/4 = 3/8    1/4 + 1/8 = 3/8    1/8 + 1/8 = 1/4

(i) Cov(X, Y) = E(XY) − E(X)E(Y)

E(X) = μX = ∑ᵢ xi f(xi) = 1(1/2) + 5(1/2) = 3

E(Y) = μY = ∑ⱼ yj g(yj) = (−4)(3/8) + 2(3/8) + 7(1/4) = 1

E(XY) = ∑ᵢ,ⱼ xi yj Jij
= (1)(−4)(1/8) + (1)(2)(1/4) + (1)(7)(1/8) + (5)(−4)(1/4) + (5)(2)(1/8) + (5)(7)(1/8)
= (−1/2) + (1/2) + (7/8) − 5 + (5/4) + (35/8) = 3/2

Cov(X, Y) = E(XY) − E(X)E(Y) = 3/2 − (3)(1) = −3/2
(ii) ρ(X, Y) = Cov(X, Y) / (σX σY)

σX² = E(X²) − [E(X)]² = ∑ᵢ xi² f(xi) − [E(X)]²

= {(1)(1/2) + (25)(1/2)} − 3² = 13 − 9 = 4


σY² = E(Y²) − [E(Y)]² = ∑ⱼ yj² g(yj) − [E(Y)]² = {(16)(3/8) + (4)(3/8) + (49)(1/4)} − 1²

= (79/4) − 1 = 75/4

σX = 2 and σY = √(75/4) = 4.33

∴ ρ(X, Y) = Cov(X, Y) / (σX σY) = (−3/2) / ((2)(4.33)) = −0.1732

(3) Suppose X and Y are independent random variables with the following respective
distributions. Find the joint distribution of X and Y. Also verify that Cov(X, Y) = 0.

xi:     1     2            yj:     −2    5     8
f(xi):  0.7   0.3          g(yj):  0.3   0.5   0.2

Solution: Since X and Y are independent random variables, the joint probability distribution
is obtained by 𝐽𝑖𝑗 = 𝑓(𝑥𝑖 ) 𝑔(𝑦𝑗 )
       Y:   y1 = −2       y2 = 5        y3 = 8        f(xi)
x1 = 1     J11 = 0.21    J12 = 0.35    J13 = 0.14    0.7
x2 = 2     J21 = 0.09    J22 = 0.15    J23 = 0.06    0.3
g(yj)      0.3           0.5           0.2           1

𝐽11=𝑓(𝑥1)𝑔(𝑦1)=(0.7)(0.3)=0.21 𝐽21=𝑓(𝑥2)𝑔(𝑦1)=(0.3)(0.3)=0.09
𝐽12=𝑓(𝑥1)𝑔(𝑦2)=(0.7)(0.5)=0.35 𝐽22=𝑓(𝑥2)𝑔(𝑦2)=(0.3)(0.5)=0.15
𝐽13=𝑓(𝑥1)𝑔(𝑦3)=(0.7)(0.2)=0.14 𝐽23=𝑓(𝑥2)𝑔(𝑦3)=(0.3)(0.2)=0.06
Cov(X, Y) = E(XY) − E(X)E(Y)

E(X) = μX = ∑ᵢ xi f(xi) = 1(0.7) + 2(0.3) = 1.3

E(Y) = μY = ∑ⱼ yj g(yj) = (−2)(0.3) + 5(0.5) + 8(0.2) = 3.5

E(XY) = ∑ᵢ,ⱼ xi yj Jij = x1y1J11 + x1y2J12 + x1y3J13 + x2y1J21 + x2y2J22 + x2y3J23


= 1(−2)(0.21) + 1(5)(0.35) + 1(8)(0.14) + 2(−2)(0.09) + 2(5)(0.15) + 2(8)(0.06)


= 4.55
𝐶𝑜𝑣(𝑋, 𝑌) = 𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌) = 4.55 − (1.3)(3.5) = 0
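A sketch confirming the table and Cov(X, Y) = 0 numerically (floating point, so the check is to within rounding):

```python
# Problem (3): joint table built from independent marginals; Cov should vanish
fx = {1: 0.7, 2: 0.3}            # distribution of X
gy = {-2: 0.3, 5: 0.5, 8: 0.2}   # distribution of Y

# Independence: J_ij = f(x_i) g(y_j) for every cell
J = {(xi, yj): p * q for xi, p in fx.items() for yj, q in gy.items()}
assert abs(sum(J.values()) - 1) < 1e-12

E_X  = sum(xi * p for xi, p in fx.items())
E_Y  = sum(yj * q for yj, q in gy.items())
E_XY = sum(xi * yj * p for (xi, yj), p in J.items())
cov  = E_XY - E_X * E_Y
print(E_X, E_Y, round(E_XY, 6), round(cov, 9))
```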

(4) Two cards are selected at random from a box which contains five cards numbered
1, 1, 2, 2 and 3. Find the joint distribution of X and Y, where X denotes the sum and Y the
maximum of the two numbers drawn. Also determine Cov(X, Y) and ρ(X, Y).
Solution:
The number of equally likely pairs is 5C2 = 10:
(1,1), (1,2), (1,2), (1,2), (1,2), (1,3), (1,3), (2,2), (2,3), (2,3)
X is the sum of the two numbers and Y is the maximum of the two numbers.
Distribution of X:

xi:      2            3            4            5
f(xi):   1/10 = 0.1   4/10 = 0.4   3/10 = 0.3   2/10 = 0.2

Distribution of Y:

yj:      1            2            3
g(yj):   1/10 = 0.1   5/10 = 0.5   4/10 = 0.4
Joint probability distribution:

       Y:   1     2     3     Sum
X = 2      0.1   0     0     0.1
X = 3      0     0.4   0     0.4
X = 4      0     0.1   0.2   0.3
X = 5      0     0     0.2   0.2
Sum        0.1   0.5   0.4   1

The nonzero cells are P(X = 2, Y = 1) = 1/10 = 0.1 (the pair (1,1)),
P(X = 3, Y = 2) = 4/10 = 0.4 (the pairs (1,2)), P(X = 4, Y = 2) = 1/10 = 0.1 (the pair (2,2)),
P(X = 4, Y = 3) = 2/10 = 0.2 (the pairs (1,3)), and P(X = 5, Y = 3) = 2/10 = 0.2 (the pairs
(2,3)); all other cells are 0.


Cov(X, Y) = E(XY) − E(X)E(Y)

E(X) = μX = ∑ᵢ xi f(xi) = 2(0.1) + 3(0.4) + 4(0.3) + 5(0.2) = 3.6

E(Y) = μY = ∑ⱼ yj g(yj) = 1(0.1) + 2(0.5) + 3(0.4) = 2.3

E(XY) = ∑ᵢ,ⱼ xi yj Jij
= 2(1)(0.1) + 2(2)(0) + 2(3)(0) + 3(1)(0) + 3(2)(0.4) + 3(3)(0) + 4(1)(0)
+ 4(2)(0.1) + 4(3)(0.2) + 5(1)(0) + 5(2)(0) + 5(3)(0.2) = 8.8
Cov(X, Y) = E(XY) − E(X)E(Y) = 8.8 − (3.6)(2.3) = 0.52 ≠ 0
X and Y are not independent.

σX² = E(X²) − [E(X)]² = ∑ᵢ xi² f(xi) − [E(X)]²
= {4(0.1) + 9(0.4) + 16(0.3) + 25(0.2)} − (3.6)² = 13.8 − 12.96 = 0.84

σY² = E(Y²) − [E(Y)]² = ∑ⱼ yj² g(yj) − [E(Y)]²
= {1(0.1) + 4(0.5) + 9(0.4)} − (2.3)² = 5.7 − 5.29 = 0.41

σX = 0.9165 and σY = 0.6403

ρ(X, Y) = Cov(X, Y) / (σX σY) = 0.52 / ((0.9165)(0.6403)) = 0.886
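Problem (4) can also be verified by brute-force enumeration of the 10 equally likely draws; a sketch:

```python
from itertools import combinations
from fractions import Fraction as F
from math import sqrt

# Enumerate all C(5,2) = 10 equally likely draws of two cards by position
cards = [1, 1, 2, 2, 3]
outcomes = [(cards[i], cards[j]) for i, j in combinations(range(5), 2)]

p = F(1, len(outcomes))
E_X = E_Y = E_XY = E_X2 = E_Y2 = F(0)
for a, b in outcomes:
    X, Y = a + b, max(a, b)            # sum and maximum of the two numbers
    E_X  += p * X;     E_Y  += p * Y
    E_XY += p * X * Y
    E_X2 += p * X * X; E_Y2 += p * Y * Y

cov = E_XY - E_X * E_Y
rho = float(cov) / sqrt(float(E_X2 - E_X**2) * float(E_Y2 - E_Y**2))
print(E_X, E_Y, cov, round(rho, 3))
```

This reproduces E(X) = 3.6, E(Y) = 2.3, Cov(X, Y) = 0.52 and ρ ≈ 0.886 without writing out the joint table by hand.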
(5) A coin is tossed three times. Let X denote 0 or 1 according as a tail or a head occurs on
the first toss, and let Y denote the total number of tails in the three tosses. Determine
(i) the marginal distributions of X and Y, (ii) the joint probability distribution of X and Y.
Also find the expected values of X + Y and XY.
Solution:
For the given random experiment the sample space is
S = {HHH, HHT, HTH, THH, HTT, THT, TTH, TTT}
X = {0, 1} and Y = {0, 1, 2, 3}


P(X = 0) = 4/8 = 1/2 and P(X = 1) = 4/8 = 1/2
P(Y = 0) = 1/8, P(Y = 1) = 3/8, P(Y = 2) = 3/8, P(Y = 3) = 1/8

Jij = P(X = xi, Y = yj), where xi = 0, 1 and yj = 0, 1, 2, 3:

J11 = P(X = 0, Y = 0) = 0, because no outcome has X = 0 and Y = 0
J12 = P(X = 0, Y = 1) = P(THH) = 1/8
J13 = P(X = 0, Y = 2) = P({TTH, THT}) = 2/8 = 1/4
J14 = P(X = 0, Y = 3) = P(TTT) = 1/8
J21 = P(X = 1, Y = 0) = P(HHH) = 1/8
J22 = P(X = 1, Y = 1) = P({HHT, HTH}) = 2/8
J23 = P(X = 1, Y = 2) = P(HTT) = 1/8
J24 = P(X = 1, Y = 3) = 0, because the first toss cannot be a head when all three tosses are tails

       Y:   0     1     2     3
X = 0      0    1/8   2/8   1/8
X = 1     1/8   2/8   1/8    0

E(X + Y) = ∑ᵢ ∑ⱼ Jij (xi + yj) = ∑ⱼ {J1j (0 + yj) + J2j (1 + yj)}

= J11 y1 + J12 y2 + J13 y3 + J14 y4 + J21(1 + y1) + J22(1 + y2) + J23(1 + y3) + J24(1 + y4)

= (0)(0) + (1)(1/8) + (2)(2/8) + (3)(1/8) + (1/8)(1 + 0) + (2/8)(1 + 1) + (1/8)(1 + 2) + (0)(1 + 3)

= 0 + 1/8 + 4/8 + 3/8 + 1/8 + 4/8 + 3/8 + 0 = 16/8 = 2

E(XY) = ∑ᵢ,ⱼ xi yj Jij

= x1y1J11 + x1y2J12 + x1y3J13 + x1y4J14 + x2y1J21 + x2y2J22 + x2y3J23 + x2y4J24

= 0 + 0 + 0 + 0 + 0 + 2/8 + 2/8 + 0 = 1/2
(6) The joint probability distribution of two random variables X and Y is given by
f(x, y) = k(2x + y), where x and y are integers such that 0 ≤ x ≤ 2 and 0 ≤ y ≤ 3.
(a) Find the value of the constant k.
(b) Find the marginal probability distributions of X and Y.


(c) Show that the random variables X and Y are dependent.

Solution:
X = {xi} = {0, 1, 2}, Y = {yj} = {0, 1, 2, 3} and f(x, y) = k(2x + y).
The joint probability distribution table:

       Y:   0    1    2    3    Sum
X = 0      0    k    2k   3k   6k
X = 1      2k   3k   4k   5k   14k
X = 2      4k   5k   6k   7k   22k
Sum        6k   9k   12k  15k  42k

(a) 42k = 1, so k = 1/42.

(b) Distribution of X:

xi:      0                  1                    2
f(xi):   6k = 6/42 = 1/7    14k = 14/42 = 1/3    22k = 22/42 = 11/21

Distribution of Y:

yj:      0                 1                  2                   3
g(yj):   6k = 6/42 = 1/7   9k = 9/42 = 3/14   12k = 12/42 = 2/7   15k = 15/42 = 5/14

(c) It can be seen from the table that f(xi)·g(yj) ≠ Jij in general (for example, the cell
(0, 0) has J11 = 0 while f(x1)g(y1) = (1/7)(1/7) ≠ 0).
Hence the random variables are dependent.
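A short check of (a), (b) and (c) in code:

```python
from fractions import Fraction as F

# Problem (6): f(x, y) = k(2x + y) on x in {0,1,2}, y in {0,1,2,3}
xs, ys = range(3), range(4)
total = sum(2 * x + y for x in xs for y in ys)   # = 42, so k = 1/42
k = F(1, total)

f = [sum(k * (2 * x + y) for y in ys) for x in xs]   # marginal of X
g = [sum(k * (2 * x + y) for x in xs) for y in ys]   # marginal of Y

# Dependent iff some cell violates J_ij = f(x_i) g(y_j)
dependent = any(k * (2 * x + y) != f[x] * g[y] for x in xs for y in ys)
print(k, f, g, dependent)
```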
(7) If X and Y are random variables having the joint probability density function

f(x, y) = { 4xy,  0 ≤ x ≤ 1, 0 ≤ y ≤ 1
          { 0,    otherwise

verify that (i) E(X + Y) = E(X) + E(Y)
(ii) E(XY) = E(X)E(Y)
Solution:

μX = E(X) = ∫₋∞^∞ ∫₋∞^∞ x f(x, y) dx dy = ∫₀¹ ∫₀¹ x(4xy) dx dy = 4 ∫₀¹ x² dx × ∫₀¹ y dy

= 4 (x³/3)|₀¹ (y²/2)|₀¹ = 2/3


μY = E(Y) = ∫₋∞^∞ ∫₋∞^∞ y f(x, y) dx dy = ∫₀¹ ∫₀¹ y(4xy) dx dy = 4 ∫₀¹ y² dy × ∫₀¹ x dx

= 4 (y³/3)|₀¹ (x²/2)|₀¹ = 2/3

E(X + Y) = ∫₋∞^∞ ∫₋∞^∞ (x + y) f(x, y) dx dy = ∫₀¹ ∫₀¹ (x + y)(4xy) dx dy

= 4 ∫₀¹ ∫₀¹ (x²y + y²x) dx dy = 4 ∫₀¹ {(x³/3)|₀¹ y + y² (x²/2)|₀¹} dy

= 4 ∫₀¹ (y/3 + y²/2) dy = 4 [y²/6 + y³/6]₀¹ = 4(1/6 + 1/6) = 4(2/6) = 4/3, which equals E(X) + E(Y).

E(XY) = ∫₋∞^∞ ∫₋∞^∞ xy f(x, y) dx dy = ∫₀¹ ∫₀¹ xy(4xy) dx dy = ∫₀¹ ∫₀¹ 4x²y² dx dy

= 4 (∫₀¹ x² dx × ∫₀¹ y² dy) = 4 (x³/3)|₀¹ (y³/3)|₀¹ = 4/9, which equals E(X)E(Y).
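The integrals above can be sanity-checked numerically with a midpoint Riemann sum over the unit square:

```python
# Problem (7): check E(X) = E(Y) = 2/3 and E(XY) = 4/9 for f(x, y) = 4xy
n = 400
h = 1.0 / n
pts = [(i + 0.5) * h for i in range(n)]   # midpoints of the grid cells

E_X = E_Y = E_XY = 0.0
for xv in pts:
    for yv in pts:
        w = 4 * xv * yv * h * h           # f(x, y) dx dy at this cell
        E_X  += xv * w
        E_Y  += yv * w
        E_XY += xv * yv * w
print(round(E_X, 4), round(E_Y, 4), round(E_XY, 4))
```

With n = 400 the sums agree with 2/3 and 4/9 to well within 10⁻³, confirming E(XY) = E(X)E(Y) for this density.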

(8) The joint probability density function of two continuous random variables X and Y is given by

f(x, y) = { kxy,  0 ≤ x ≤ 4, 1 ≤ y ≤ 5
          { 0,    otherwise

Find (a) the value of k, (b) E(2X + 3Y), (c) E(XY).
Solution:
(a) ∫₋∞^∞ ∫₋∞^∞ f(x, y) dx dy = 1

∫ₓ₌₀⁴ ∫ᵧ₌₁⁵ kxy dy dx = 1

k (∫₀⁴ x dx × ∫₁⁵ y dy) = 1


k (x²/2)|₀⁴ (y²/2)|₁⁵ = 1

k (16/2)(25/2 − 1/2) = k(8)(12) = 96k = 1

k = 1/96
(b) E(2X + 3Y) = ∫₋∞^∞ ∫₋∞^∞ (2x + 3y) f(x, y) dx dy = ∫₀⁴ ∫₁⁵ (2x + 3y)(xy/96) dy dx

= ∫₀⁴ ∫₁⁵ (x²y/48 + xy²/32) dy dx

= ∫₁⁵ {(y/48)(x³/3)|₀⁴ + (y²/32)(x²/2)|₀⁴} dy = ∫₁⁵ (4y/9 + y²/4) dy

= (4/9)(y²/2)|₁⁵ + (1/4)(y³/3)|₁⁵ = (2/9)(25 − 1) + (1/12)(125 − 1) = 16/3 + 31/3 = 47/3
(c) E(XY) = ∫₋∞^∞ ∫₋∞^∞ xy f(x, y) dx dy = ∫₀⁴ ∫₁⁵ xy(kxy) dy dx

= (1/96) ∫₀⁴ x² dx × ∫₁⁵ y² dy = (1/96)(x³/3)|₀⁴ (y³/3)|₁⁵ = (1/864)(64)(124) = 248/27
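A numerical cross-check of (a), (b) and (c) with a midpoint Riemann sum over the rectangle [0, 4] × [1, 5]:

```python
# Problem (8): f(x, y) = xy/96 on 0 <= x <= 4, 1 <= y <= 5; check that the
# density integrates to 1 and that E(2X + 3Y) = 47/3, E(XY) = 248/27
n = 400
hx, hy = 4.0 / n, 4.0 / n
xs = [(i + 0.5) * hx for i in range(n)]        # midpoints in [0, 4]
ys = [1 + (j + 0.5) * hy for j in range(n)]    # midpoints in [1, 5]

total = E_lin = E_XY = 0.0
for xv in xs:
    for yv in ys:
        w = (xv * yv / 96) * hx * hy           # f(x, y) dx dy
        total += w
        E_lin += (2 * xv + 3 * yv) * w
        E_XY  += xv * yv * w
print(round(total, 4), round(E_lin, 4), round(E_XY, 4))
```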

(9) The joint probability density function of two continuous random variables X and Y is given by

f(x, y) = { c(x² + y²),  0 ≤ x ≤ 1, 0 ≤ y ≤ 1
          { 0,           otherwise

(a) Find the value of c
(b) P(X < 1/2, Y > 1/2)
(c) P(1/4 < X < 3/4)
(d) P(Y < 1/2)

Solution:
(a) ∫₋∞^∞ ∫₋∞^∞ f(x, y) dx dy = 1

∫₀¹ ∫₀¹ c(x² + y²) dx dy = 1

c ∫ᵧ₌₀¹ (x³/3 + y²x)|₀¹ dy = 1

c ∫ᵧ₌₀¹ (1/3 + y²) dy = 1


c (y/3 + y³/3)|₀¹ = 1

c (1/3 + 1/3) = 1

c = 3/2
(b) P(X < 1/2, Y > 1/2) = ∫₀^{1/2} ∫_{1/2}^{1} (3/2)(x² + y²) dy dx

= (3/2) ∫ᵧ₌₁/₂¹ (x³/3 + y²x)|₀^{1/2} dy

= (3/2) ∫ᵧ₌₁/₂¹ (1/24 + y²/2) dy

= (3/2) (y/24 + y³/6)|_{1/2}^{1}

= (3/2) [(1/24)(1 − 1/2) + (1/6)(1 − 1/8)] = (3/2)(1/48 + 7/48) = (3/2)(8/48) = 1/4
(c) P(1/4 < X < 3/4) = ∫ₓ₌₁/₄^{3/4} ∫ᵧ₌₀¹ (3/2)(x² + y²) dy dx

= (3/2) ∫ᵧ₌₀¹ (x³/3 + y²x)|_{1/4}^{3/4} dy

= (3/2) ∫ᵧ₌₀¹ {(1/3)(27/64 − 1/64) + y²(3/4 − 1/4)} dy

= (3/2) (13y/96 + y³/6)|₀¹ = (3/2)(13/96 + 1/6) = (3/2)(29/96) = 29/64
(d) P(Y < 1/2) = ∫ₓ₌₀¹ ∫ᵧ₌₀^{1/2} (3/2)(x² + y²) dy dx

= (3/2) ∫ᵧ₌₀^{1/2} (x³/3 + y²x)|₀¹ dy

= (3/2) ∫ᵧ₌₀^{1/2} (1/3 + y²) dy = (3/2) (y/3 + y³/3)|₀^{1/2} = (3/2)(1/6 + 1/24) = 5/16
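All three probabilities can be confirmed with one small midpoint-rule helper (the cut points 1/4, 1/2, 3/4 fall exactly on grid-cell boundaries, so the sum tiles each region cleanly):

```python
# Problem (9): f(x, y) = (3/2)(x^2 + y^2) on the unit square; check the three
# probabilities computed above with a midpoint Riemann sum
n = 400
h = 1.0 / n
pts = [(i + 0.5) * h for i in range(n)]   # cell midpoints in (0, 1)

def prob(xlo, xhi, ylo, yhi):
    # P(xlo < X < xhi, ylo < Y < yhi), summing density over cells in the box
    return sum(1.5 * (x * x + y * y) * h * h
               for x in pts if xlo < x < xhi
               for y in pts if ylo < y < yhi)

print(round(prob(0, 0.5, 0.5, 1), 3),     # P(X < 1/2, Y > 1/2) = 1/4
      round(prob(0.25, 0.75, 0, 1), 3),   # P(1/4 < X < 3/4)   = 29/64
      round(prob(0, 1, 0, 0.5), 3))       # P(Y < 1/2)         = 5/16
```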

(10) The joint probability density function of two continuous random variables X and Y is given by

f(x, y) = { xy/96,  0 ≤ x ≤ 4, 1 ≤ y ≤ 5
          { 0,      otherwise

Find P(X + Y < 3).


Solution:
The region bounded by the lines x = 0, x = 4, y = 1, y = 5 is a square region as shown in
the figure. The line x + y = 3 cuts off the part of it with 0 ≤ x ≤ 2 and 1 ≤ y ≤ 3 − x.

P(X + Y < 3) = ∬_R f(x, y) dx dy = ∫ₓ₌₀² ∫ᵧ₌₁^{3−x} (xy/96) dy dx = (1/96) ∫₀² x (y²/2)|₁^{3−x} dx

= (1/192) ∫₀² x((3 − x)² − 1) dx = (1/192) ∫₀² (8x − 6x² + x³) dx

= (1/192) (8x²/2 − 6x³/3 + x⁴/4)|₀² = (1/192)(16 − 16 + 4) = (1/192)(4) = 1/48
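A midpoint Riemann sum restricted to the triangle x + y < 3 reproduces the answer numerically:

```python
# Problem (10): P(X + Y < 3) = 1/48 for f(x, y) = xy/96 on [0, 4] x [1, 5],
# summing the density only over grid cells whose midpoint satisfies x + y < 3
n = 400
hx, hy = 4.0 / n, 4.0 / n
prob = sum((x * y / 96) * hx * hy
           for i in range(n) for j in range(n)
           for x in [(i + 0.5) * hx]
           for y in [1 + (j + 0.5) * hy]
           if x + y < 3)
print(round(prob, 5))   # close to 1/48 = 0.02083...
```

The diagonal boundary is approximated cell by cell, so the sum is accurate only to the order of the grid spacing, which is ample here.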

Exercises
(1) X and Y are independent random variables. X takes the values 2, 5, 7 with probabilities
1/2, 1/4, 1/4 respectively; Y takes the values 3, 4, 5 with probabilities 1/3, 1/3, 1/3.
(i) Find the joint distribution of X and Y.
(ii) Show that COV(X, Y) = 0.
(iii) Find the probability distribution of Z = X + Y.
(2) The joint probability distribution of two random variables X and Y is given as:

       Y:   1      3      9
X = 2      1/8   1/24   1/12
X = 4      1/4   1/4    0
X = 6      1/8   1/24   1/12

Find (i) the marginal distributions of X and Y, and (ii) COV(X, Y).
(3) If X and Y are independent random variables, prove the following results.
(i) E (XY) =E(X).E(Y) (ii) COV(X, Y) =0 and (iii) 𝜎 2𝑋+𝑌 = 𝜎 2𝑋 + 𝜎 2 𝑌 .


(4) Two marbles are drawn from a box containing 3 blue, 2 red and 3 green marbles. If X is
the number of blue marbles and Y is the number of red marbles drawn,
(i) form the joint probability distribution of X and Y, and (ii) find E(X) and E(Y).
(5) If the joint probability density function of two continuous random variables X and Y is given by

f(x, y) = { (1/8)(6 − x − y),  0 < x < 2, 2 < y < 4
          { 0,                 otherwise

find (i) P(X + Y < 3) (ii) P(X < 1, Y < 3)

(6) Verify that

f(x, y) = { e^(−(x+y)),  x ≥ 0, y ≥ 0
          { 0,           otherwise

is the density function of a joint probability distribution. Also evaluate
(i) P(X < 1) (ii) P(X > Y) (iii) P(X + Y ≤ 1)

Queueing Theory
A common situation in everyday life is that of waiting in a line, whether at bus stops,
petrol pumps, restaurants, ticket booths, doctors' clinics, bank counters, traffic lights and so
on. Queues (waiting lines) are also found in workshops, where machines wait to be
repaired; at a tool crib, where mechanics wait to receive tools; in a warehouse, where items
wait to be used; in a telephone exchange, where incoming calls wait to mature; at loading
docks, where trucks wait to be unloaded; at airports, where airplanes wait either to take off
or to land; and so on. In general, a queue is formed at a production/operation system when
either customers (human beings or physical entities) requiring service wait because the
number of customers exceeds the number of service facilities, or service facilities do not
work efficiently or take more time than prescribed to serve a customer.

Queuing theory can be applied to a variety of situations where it is not possible to
accurately predict the arrival rate (or time) of customers and the service rate (or time) of the
service facility or facilities. In particular, it can be used to determine the level of service
(either the service rate or the number of service facilities) that balances the following two
conflicting costs:
(i) the cost of offering the service
(ii) the cost incurred due to delay in offering the service

The first cost is associated with the service facilities and their operation, and the second
represents the cost of customers waiting for service. Obviously, an increase in the existing
service facilities would reduce the customers' waiting time. Conversely, decreasing the level
of service would result in long queue(s). This means an increase in the level of service
increases the cost of operating the service facilities but decreases the cost of customers
waiting for service.


Figure 16.1 illustrates both types of costs as a function of the level of service. The combined
curve of service cost and customer waiting cost is U-shaped because of their trade-off
relationship, and the total cost is minimized at the lowest point of the total cost curve. The
optimal service level is the one that minimizes the sum of the two costs.

Since the customer's waiting cost for service is difficult to estimate, it is usually measured in
terms of loss of sales or goodwill when the customer is a human being who has no sympathy
with the service system. But if the customer is a machine waiting for repair, then the cost of
waiting is measured in terms of the cost of lost production.

THE STRUCTURE OF A QUEUING SYSTEM


The major components (parts or elements) of any waiting-line (queuing) system are shown in
Fig.

Each of these components is discussed below:
1. Calling population (or input source)
2. Queuing process
3. Queue discipline
4. Service process (or mechanism)

Potential customers who arrive at the queuing system are referred to as the calling
population, also known as the customer (input) source. The manner in which customers
arrive at the service facility, individually or in batches, at scheduled or unscheduled times,
is called the arrival process. The customer's

entry into the queuing system depends upon the queue conditions. Customers are selected
from a queue for service according to certain rules known as the queue discipline. A service
facility may be without a server (self-service), or may include one or more servers operating
either in series (as a team) or in parallel (multiple service channels). The rate (constant or
random) at which service is rendered is known as the service process. After the service is
rendered, the customer leaves the system. If the server is idle at the time of the customer's
arrival, the customer is served immediately; otherwise the customer is asked to join a
queue or waiting line, which may have single, multiple or even priority lines.

Queuing Process
The queuing process refers to the number of queues (single, multiple or priority queues) and
their lengths. The type of queue depends on the layout of the service mechanism, and the
length (or size) of a queue depends upon operational situations such as physical space, legal
restrictions, and the attitude of the customers. In certain cases, a service system is unable to
accommodate more than the required number of customers at a time, and no further
customers are allowed to enter until more space is made available to accommodate new
customers. Such situations are referred to as finite (or limited) source queues. Examples of
finite source queues are cinema halls, restaurants, etc. On the other hand, if a service system
is able to accommodate any number of customers at a time, then it is referred to as an
infinite (or unlimited) source queue. For example, in a sales department where customer
orders are received, there is no restriction on the number of orders that can come in.

On arriving at a service system, if customers find long queue(s) in front of a service facility,
they often do not enter the service system even though additional waiting space is available.
The queue length in such cases depends upon the attitude of the customers. For example,
when a motorist finds many vehicles waiting at a petrol station, in most cases he does not
stop at this station and seeks service elsewhere. In some finite source queuing systems, no
queue is allowed to form at all. For example, when a parking space (service facility) cannot
accommodate additional incoming vehicles (customers), the motorists are diverted
elsewhere. Multiple queues at a service facility can also be finite or infinite. But this
arrangement has certain advantages such as:
• Division of


manpower is possible.
• The customer has the option of joining any queue.
• Balking behaviour of the customers can be controlled.

SINGLE-SERVER QUEUING MODELS

Model I: {(M/M/1) : (∞/FCFS)} Exponential Service – Unlimited Queue

This model is based on certain assumptions about the queuing system:
(i) Arrivals are described by the Poisson probability distribution and come from an infinite
calling population.
(ii) There is a single waiting line and each arrival waits to be served regardless of the length
of the queue (i.e. no limit on queue length – infinite capacity), and there is no balking or
reneging.
(iii) The queue discipline is 'first-come, first-served'.
(iv) There is a single server or channel, and service times follow the exponential distribution.
(v) Customer arrivals are independent, and the arrival rate (average number of arrivals) does
not change over time.
(vi) The average service rate is greater than the average arrival rate.

The following events (possibilities) may occur during a small interval of time ∆t just before
time t:
1. The system is in state n (number of customers) at time t, and there is no arrival and no
departure.
2. The system is in state n + 1, and there is no arrival and one departure.
3. The system is in state n − 1, and there is one arrival and no departure.

The figure illustrates the process of determining Pn (the probability of n customers in the
system at time t) when customers are either waiting or receiving service in each state. A
customer may arrive, or may leave on the completion of the leading customer's service.
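The notes stop here before deriving the steady-state results for Model I. For orientation, the standard M/M/1 measures (which follow from the balance equations sketched above, with Pn = (1 − ρ)ρⁿ) can be computed for assumed rates λ and μ with λ < μ; the rates below are made-up example values:

```python
# Standard steady-state measures for an (M/M/1):(infinity/FCFS) queue, for
# assumed example rates lam (arrivals/hour) and mu (services/hour), lam < mu
lam, mu = 2.0, 3.0
rho = lam / mu                 # traffic intensity (server utilization)

P0 = 1 - rho                   # probability the system is empty
Ls = rho / (1 - rho)           # expected number of customers in the system
Lq = rho**2 / (1 - rho)        # expected number waiting in the queue
Ws = 1 / (mu - lam)            # expected time in the system
Wq = rho / (mu - lam)          # expected waiting time in the queue
print(rho, P0, Ls, Lq, Ws, Wq)
```

Note that Ls = λWs and Lq = λWq (Little's law), and Ls = Lq + ρ, which provides a quick internal consistency check on the five measures.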

