Chapter 4: Pairs of Random Variables
This chapter and the next one analyze experiments in which an outcome is a collection
of numbers.
The probability model for such an experiment contains the properties of the individual
random variables and also the relationships among those random variables.
This chapter treats discrete and continuous random variables together because most of
the definitions and theorems apply to both.
However, just as with individual random variables, the details of numerical calculations
depend on whether the random variables are discrete or continuous: the formulas for
discrete random variables contain sums, while the corresponding formulas for continuous
random variables contain integrals.
This chapter analyzes experiments that produce two random variables, X and Y.
Introduction
Pairs of random variables appear in a wide variety of practical situations.
An example of two random variables that we encounter all the time in our research is the
signal (X) emitted by a radio transmitter and the corresponding signal (Y) that
eventually arrives at a receiver.
Noise and distortion prevent us from observing X directly, so we use the probability
model f_{X,Y}(x, y) to estimate X.
Another example is the strength of the signal at a cellular telephone base station
receiver (Y) and the distance (X) of the telephone from the base station.
This chapter establishes the mathematical models for studying multiple random variables.
Joint Cumulative Distribution Function
In an experiment that produces one random variable, events are points or intervals on a
line.
In an experiment that leads to two random variables X and Y, each outcome (x, y) is a
point in a plane and events are points or areas in the plane.
Just as the CDF of one random variable, F_X(x), is the probability of the interval to the
left of x, the joint CDF F_{X,Y}(x, y) of two random variables is the probability of the
region of the plane below and to the left of (x, y).
This is the infinite region that includes the shaded area in Figure 4.1 and everything
below and to the left of it.
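In the standard notation used for joint CDFs, this region probability is written as

F_{X,Y}(x, y) = P[X \le x, \; Y \le y].

This expression is the two-dimensional counterpart of F_X(x) = P[X \le x] for a single random variable.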
Theorem 4.1
Joint Probability Mass Function
Definition 4.2
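In standard notation, and consistent with the section title, the joint probability mass function of discrete random variables X and Y is

P_{X,Y}(x, y) = P[X = x, \; Y = y],

defined for every pair (x, y).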
Example 4.1
Test two integrated circuits one after the other.
On each test, the possible outcomes are a (accept) and r (reject).
Assume that all circuits are acceptable with probability 0.9 and that the outcomes of
successive tests are independent.
Count the number of acceptable circuits X and count the number of successful tests
Y before you observe the first reject. (If both tests are successful, let Y = 2.)
Draw a tree diagram for the experiment and find the joint PMF of X and Y.
The tree diagram for the experiment has four leaf outcomes: aa, ar, ra, and rr.
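As a cross-check on the tree, the following minimal Python sketch (not part of the original example; names such as p_accept are illustrative) enumerates the four leaves and tabulates the joint PMF of X and Y:

from itertools import product

p_accept = 0.9  # probability that a single circuit is acceptable

pmf = {}
for first, second in product('ar', repeat=2):
    # Probability of this leaf: the two tests are independent.
    prob = (p_accept if first == 'a' else 1 - p_accept) * \
           (p_accept if second == 'a' else 1 - p_accept)
    x = (first == 'a') + (second == 'a')   # number of acceptable circuits
    if first == 'r':
        y = 0        # the first test is already a reject
    elif second == 'r':
        y = 1        # one successful test before the first reject
    else:
        y = 2        # both tests successful
    pmf[(x, y)] = pmf.get((x, y), 0.0) + prob

print(pmf)  # approximately {(2, 2): 0.81, (1, 1): 0.09, (1, 0): 0.09, (0, 0): 0.01}

The four leaves have probabilities 0.81, 0.09, 0.09, and 0.01, and the joint PMF is zero for every other pair (x, y).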
Theorem 4.2
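A standard result of this kind, and the one Example 4.2 relies on, expresses the probability of an event B in terms of the joint PMF:

P[B] = \sum_{(x, y) \in B} P_{X,Y}(x, y).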
Example 4.2
Continuing Example 4.1, find the probability of the event B that X, the number of
acceptable circuits, equals Y, the number of successful tests before observing the first
reject.
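Using the joint PMF from Example 4.1, the event B = {X = Y} contains the pairs (0, 0), (1, 1), and (2, 2), so

P[B] = P_{X,Y}(0, 0) + P_{X,Y}(1, 1) + P_{X,Y}(2, 2) = 0.01 + 0.09 + 0.81 = 0.91.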
Marginal PMF
Theorem 4.3
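In standard form, the relationship referred to in the Marginal PDF section below is that each marginal PMF is obtained by summing the joint PMF over the other variable:

P_X(x) = \sum_{y} P_{X,Y}(x, y), \qquad P_Y(y) = \sum_{x} P_{X,Y}(x, y).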
Example 4.3
Joint Probability Density Function
Definition 4.3
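In standard form, the joint PDF of continuous random variables X and Y is the function f_{X,Y}(x, y) with the property

F_{X,Y}(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f_{X,Y}(u, v) \, dv \, du.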
Theorem 4.4
Theorem 4.5
Theorem 4.6
Theorem 4.7
Example 4.4
Example 4.5
The difficulty with this integral is that the nature of the region of integration depends
critically on x and y.
Example 4.6
Marginal PDF
Suppose we perform an experiment that produces a pair of random variables X and Y
with joint PDF f_{X,Y}(x, y). Often we are interested in the probability model of X or of
Y by itself, that is, in the marginal PDFs f_X(x) and f_Y(y).
The situation parallels (with integrals replacing sums) the relationship in Theorem 4.3
between the joint PMF P_{X,Y}(x, y) and the marginal PMFs P_X(x) and P_Y(y).
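In standard form, each marginal PDF is obtained by integrating the joint PDF over the other variable:

f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, y) \, dy, \qquad f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x, y) \, dx.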
Example 4.7
Functions of Two Random Variables
There are many situations in which we observe two random variables and use their
values to compute a new random variable.
For example, we can describe the amplitude of the signal transmitted by a radio station
as a random variable, X.
We can describe the attenuation of the signal as it travels to the antenna of a moving car
as another random variable, Y.
In this case the amplitude of the signal at the radio receiver in the car is the random
variable W = X / Y.
Other practical examples appear in cellular telephone base stations with two antennas.
The amplitudes of the signals arriving at the two antennas are modeled as random
variables X and Y.
The radio receiver connected to the two antennas can use the received signals in a
variety of ways.
It can choose the signal with the larger amplitude and ignore the other one.
In this case, the receiver produces the random variable W = X if X > Y, and W = Y
otherwise. This is an example of selection diversity combining.
Another alternative is to combine the two signals unequally in order to give less weight
to the signal considered to be more distorted.
In this case, W = aX + bY.
For discrete random variables X and Y, the derived random variable W = g(X, Y) has a
PMF that can be computed directly from the joint PMF, as sketched below.
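In standard form, for discrete X and Y the PMF of W collects the joint probabilities of all pairs (x, y) that map to the value w:

P_W(w) = \sum_{(x, y): \, g(x, y) = w} P_{X,Y}(x, y).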
Example 4.8
A firm sends out two kinds of promotional facsimiles.
One kind contains only text and requires 40 seconds to transmit each page.
The other kind contains grayscale pictures that take 60 seconds per page. Faxes can be
1, 2, or 3 pages long.
Let the random variable L represent the length of a fax in pages, with S_L = {1, 2, 3}.
Let the random variable T represent the time to send each page, with S_T = {40, 60}.
After observing many fax transmissions, the firm derives the following probability model:
Theorem 4.10
Theorem 4.11
Example 4.9
Expected Values
There are many situations in which we are interested only in the expected value of a
derived random variable W = g(X, Y), not the entire probability model.
In these situations, we can obtain the expected value directly from P_{X,Y}(x, y) or
f_{X,Y}(x, y) without taking the trouble to compute P_W(w) or f_W(w).
Corresponding to Theorems 2.10 and 3.4, we have:
Theorem 4.12
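A standard statement of this result computes E[W] for W = g(X, Y) directly from the joint probability model:

Discrete:    E[W] = \sum_{x} \sum_{y} g(x, y) \, P_{X,Y}(x, y)
Continuous:  E[W] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) \, f_{X,Y}(x, y) \, dx \, dy.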
Example 4.11
Theorem 4.13
Theorem 4.14
Theorem 4.15
Definition 4.4
Covariance
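In standard form, with \mu_X = E[X] and \mu_Y = E[Y], the covariance of X and Y is

Cov[X, Y] = E[(X - \mu_X)(Y - \mu_Y)].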
Definition 4.5
Correlation
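A common convention, and the one consistent with the surrounding definitions, is that the correlation of X and Y is the expected value of their product:

r_{X,Y} = E[XY].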
Theorem 4.16
Example 4.12
Definition 4.6
Definition 4.7
Definition 4.8
Correlation Coefficient
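In standard form, the correlation coefficient normalizes the covariance by the two standard deviations:

\rho_{X,Y} = \frac{Cov[X, Y]}{\sigma_X \sigma_Y}.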
Theorem 4.17
Theorem 4.18
Theorem 4.18 Proof
Conditioning by an Event
If X and Y are discrete, the new model is a conditional joint PMF, the ratio of the joint
PMF to P[B].
If X and Y are continuous, the new model is a conditional joint PDF, defined as the ratio
of the joint PDF to P[B].
The definitions of these functions follow from the same intuition as Definition 1.6 for the
conditional probability of an event.
Section 4.9 considers the special case of an event that corresponds to an observation of
one of the two random variables: either B = {X = x} or B = {Y = y}.
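In standard form, for an event B with P[B] > 0 consisting of pairs (x, y),

P_{X,Y|B}(x, y) = P_{X,Y}(x, y) / P[B]  for (x, y) \in B, and 0 otherwise;
f_{X,Y|B}(x, y) = f_{X,Y}(x, y) / P[B]  for (x, y) \in B, and 0 otherwise.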
Definition 4.9
Theorem 4.19
Example 4.13
Definition 4.10
Example 4.14
Theorem 4.20
Conditional Expected Value
For random variables X and Y and an event B of nonzero probability, the conditional
expected value of W = g(X, Y) given B is defined as follows.
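A standard form of this definition, consistent with the conditional models above, is

Discrete:    E[W | B] = \sum_{x} \sum_{y} g(x, y) \, P_{X,Y|B}(x, y)
Continuous:  E[W | B] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x, y) \, f_{X,Y|B}(x, y) \, dx \, dy.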
Definition 4.11
Conditional Variance
Theorem 4.21
Example 4.15
Example 4.16
Conditioning by a Random Variable
In Section 4.8, we use partial knowledge of the outcome of an experiment in order to
derive a new probability model for the experiment.
Now we turn our attention to the special case in which the partial knowledge consists of
the value of one of the random variables: either {X = x} or {Y = y}.
Learning {Y = y} changes our knowledge of the random variables X and Y.
We now have complete knowledge of Y and modified knowledge of X.
From this information, we derive a modified probability model for X.
The new model is either a conditional PMF of X given Y or a conditional PDF of X given
Y.
When X and Y are discrete, the conditional PMF and the associated expected values
represent a specialized notation for their counterparts in Section 4.8.
By contrast, when X and Y are continuous, we cannot apply Section 4.8 directly because
the conditioning event {Y = y} has probability zero, as discussed in Chapter 3.
Instead, we define a conditional PDF as the ratio of the joint PDF to the marginal PDF.
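In standard form, assuming P_Y(y) > 0 in the discrete case and f_Y(y) > 0 in the continuous case,

P_{X|Y}(x | y) = \frac{P_{X,Y}(x, y)}{P_Y(y)}, \qquad f_{X|Y}(x | y) = \frac{f_{X,Y}(x, y)}{f_Y(y)}.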
Definition 4.12
Conditional PMF
Theorem 4.22
Example 4.17
Theorem 4.23
Example 4.18
Definition 4.13
Conditional PDF
Example 4.19
Theorem 4.24
Definition 4.14
Definition 4.15
Example 4.20
Theorem 4.25
Iterated Expectation
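In standard form, iterated expectation states that averaging the conditional expected value E[X | Y] over Y recovers the unconditional expected value:

E[E[X | Y]] = E[X].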
Theorem 4.26
Example 4.21
At noon on a weekday, we begin recording new call attempts at a telephone switch.
Let X denote the arrival time of the first call, as measured by the number of seconds
after noon.
In the most common model used in the telephone industry, X and Y are continuous
random variables with joint PDF
Example 4.22
Independent Random Variables
Applying the idea of independence to random variables, we say that X and Y are
independent random variables when the joint probability model is the product of the two
individual probability models, as stated formally in Definition 4.16.
Definition 4.16
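In standard form, X and Y are independent if and only if the joint probability model factors into the product of the marginals:

Discrete:    P_{X,Y}(x, y) = P_X(x) \, P_Y(y)  for all x, y
Continuous:  f_{X,Y}(x, y) = f_X(x) \, f_Y(y)  for all x, y.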
Example 4.23
Example 4.24
Theorem 4.27
Example 4.25
Bivariate Gaussian Random Variables
The bivariate Gaussian distribution is a probability model for X and Y with the property
that X and Y are each Gaussian random variables.
Definition 4.17
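In standard form, with parameters \mu_X, \mu_Y, \sigma_X > 0, \sigma_Y > 0, and correlation coefficient -1 < \rho < 1, the bivariate Gaussian PDF is

f_{X,Y}(x, y) = \frac{1}{2\pi\sigma_X\sigma_Y\sqrt{1 - \rho^2}} \exp\!\left( -\frac{1}{2(1 - \rho^2)} \left[ \left(\frac{x - \mu_X}{\sigma_X}\right)^2 - \frac{2\rho (x - \mu_X)(y - \mu_Y)}{\sigma_X \sigma_Y} + \left(\frac{y - \mu_Y}{\sigma_Y}\right)^2 \right] \right).

With this parameterization, X is Gaussian with expected value \mu_X and standard deviation \sigma_X, and Y is Gaussian with expected value \mu_Y and standard deviation \sigma_Y.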
Theorem 4.28
Theorem 4.29
Theorem 4.30
Theorem 4.31
Theorem 4.32
Summary