Chapter 8
EXPECTATION
Content
8.1 Expectation of a random variable
8.2 Expectation of a function of a random variable
8.3 Properties of expectation
8.4 Variance of a random variable and its Properties
8.5 Moments and moment generating function
8.6 Chebyshev's inequality
8.7 Covariance and correlation coefficient
Most probability distributions are characterized by their mean and variance.
The mean of a random variable is referred to as its expectation, provided the
defining sum or integral converges to a constant.
If we want to summarize a random variable by a single number, then this
number should undoubtedly be its expected value.
The expected value, also called the expectation or mean, gives the center in
the sense of average value of the distribution of the random variable.
If we allow a second number to describe the random variable, then we look
at its variance, which is a measure of spread of the distribution of the
random variable.
Expectation of a random variable
Definition: Let X be a random variable with possible values x_1, x_2, …, x_n (if discrete) or with pdf f(x) (if continuous), and let g be a real-valued function. Then

E[g(X)] = \sum_i g(x_i) P(X = x_i)   if X is discrete
E[g(X)] = \int_{-\infty}^{\infty} g(x) f(x) dx   if X is continuous

In particular, taking g(x) = x gives the mean E(X).
Exercise: Let X have a pdf given as follows:
f(x) = 2(1 − x), 0 < x < 1
f(x) = 0, otherwise
What is E(X^2)?
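Integrals of this kind are easy to sanity-check numerically. The sketch below uses a midpoint Riemann sum (standard library only; the helper name `expect` is illustrative, not from the text):

```python
# Numerically approximate E[g(X)] = integral of g(x) * f(x) over (a, b)
# with a midpoint Riemann sum, for the exercise's pdf f(x) = 2(1 - x) on (0, 1).

def expect(g, f, a, b, n=100_000):
    """Midpoint-rule approximation of the integral of g(x) * f(x) on (a, b)."""
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) * f(a + (k + 0.5) * h) for k in range(n)) * h

f = lambda x: 2 * (1 - x)

print(expect(lambda x: 1.0, f, 0, 1))   # total probability, should be ~1
print(expect(lambda x: x**2, f, 0, 1))  # numerical value of E(X^2)
```

Checking that the pdf integrates to 1 first is a cheap guard against a mis-copied density.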
Expectation of a two-dimensional random variable
• Definition: Let (X, Y) be a two-dimensional random variable
and Z = H(X, Y) be real-valued function of (X, Y) .The expected
value of the random variable Z is defined as follows:
a. If Z is a discrete random variable with possible values z1, z2, … and p(zi) = P(Z = zi), then
E(Z) = \sum_i z_i p(z_i)
b. If Z is a continuous random variable and (X, Y) has joint pdf f(x, y), then
E(Z) = \int \int H(x, y) f(x, y) dx dy
Example: Let I and R have the joint pdf f(i, r) = 2ir^2/9 for 0 ≤ i ≤ 1, 0 ≤ r ≤ 3, and let Z = IR. Then
E(Z) = \int_0^1 \int_0^3 ir f(i, r) dr di = (2/9) \int_0^1 i^2 di \int_0^3 r^3 dr = (2/9)(1/3)(3^4/4) = 3/2
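A double integral of this type can be checked with a two-dimensional midpoint sum. The sketch below assumes the joint pdf f(i, r) = 2ir^2/9 on 0 ≤ i ≤ 1, 0 ≤ r ≤ 3 and Z = IR (as read from the worked example):

```python
# Two-dimensional midpoint rule for E(Z), Z = I*R,
# assuming the joint pdf f(i, r) = 2*i*r**2/9 on 0 <= i <= 1, 0 <= r <= 3.

def e_of_z(n=500):
    """Approximate E(IR) = double integral of i*r*f(i, r) over the rectangle."""
    step_i, step_r = 1.0 / n, 3.0 / n
    total = 0.0
    for a in range(n):
        i = (a + 0.5) * step_i          # midpoint in the i-direction
        for b in range(n):
            r = (b + 0.5) * step_r      # midpoint in the r-direction
            total += i * r * (2 * i * r**2 / 9) * step_i * step_r
    return total

print(e_of_z())  # should be close to 3/2
```

Because the integrand factors into an i-part and an r-part, the same value could also be obtained as a product of two one-dimensional integrals.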
Properties of Expectation
1. If C is any constant value, then E(C) = C
2. E(CX) = C ∗ E(X)
3. If X and Y are any two random variables, then E(X + Y) =
E(X) + E(Y)
4. If X and Y are two independent random variables, then E(XY)
= E(X) ∗ E(Y)
5. If a and b are any constant numbers, then E(a + bX)= a + bE(X)
6. E(X) > 0 if X > 0
7. |E(X)| ≤ E(|X|)
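Properties 4 and 5 can be verified exactly on small discrete distributions. The pmfs below are made up purely for illustration:

```python
from fractions import Fraction as F

# Two small, independent pmfs (value -> probability); illustrative only.
px = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}
py = {1: F(1, 3), 3: F(2, 3)}

def E(pmf):
    """Expected value of a discrete pmf."""
    return sum(x * p for x, p in pmf.items())

# Property 5: E(a + bX) = a + b E(X)
a, b = F(5), F(3)
lhs = sum((a + b * x) * p for x, p in px.items())
print(lhs == a + b * E(px))            # True

# Property 4: under independence the joint pmf is the product p(x)q(y),
# so E(XY) = E(X) E(Y)
exy = sum(x * y * p * q for x, p in px.items() for y, q in py.items())
print(E(px), E(py), exy)
```

Using `Fraction` keeps every step exact, so the identities hold with `==` rather than a floating-point tolerance.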
Definition: The variance of X is V(X) = E[(X − E(X))^2].
That is, the mean value of the square of the deviations of X from
its mean is called the variance of X or the variance of the
distribution.
The positive square root of V(X) is called the standard deviation
of X and it is denoted by s.d.(X) (or σ).
It is worthwhile to observe that, for X taking the values 0, 1, 2, 3 with probabilities 1/8, 3/8, 3/8, 1/8:

E(X) = \sum x p(x) = 0(1/8) + 1(3/8) + 2(3/8) + 3(1/8) = 1.5
E(X^2) = \sum x^2 p(x) = 0^2(1/8) + 1^2(3/8) + 2^2(3/8) + 3^2(1/8) = 3
V(X) = E(X^2) − [E(X)]^2 = 3 − (1.5)^2 = 0.75
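The same computation can be done exactly from the pmf (this distribution happens to be that of the number of heads in three fair coin tosses):

```python
from fractions import Fraction as F

# pmf of X taking values 0..3 with probabilities 1/8, 3/8, 3/8, 1/8
p = {0: F(1, 8), 1: F(3, 8), 2: F(3, 8), 3: F(1, 8)}

ex  = sum(x * q for x, q in p.items())       # E(X)
ex2 = sum(x**2 * q for x, q in p.items())    # E(X^2)
var = ex2 - ex**2                            # V(X) = E(X^2) - [E(X)]^2

print(ex, ex2, var)  # 3/2, 3, 3/4
```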
Example 2: Let X have the pdf

f(x) = (x + 1)/2, −1 < x < 1;  f(x) = 0, otherwise.

E(X) = \int_{-1}^{1} x (x + 1)/2 dx = (1/2)[x^3/3 + x^2/2]_{-1}^{1}
     = (1/2)(1^3/3 + 1^2/2) − (1/2)((−1)^3/3 + (−1)^2/2) = 1/3

E(X^2) = \int_{-1}^{1} x^2 (x + 1)/2 dx = (1/2)[x^4/4 + x^3/3]_{-1}^{1}
       = (1/2)(1^4/4 + 1^3/3) − (1/2)((−1)^4/4 + (−1)^3/3) = 1/3

V(X) = E(X^2) − [E(X)]^2 = 1/3 − (1/3)^2 = 1/3 − 1/9 = 2/9
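A numerical check of this example, taking the pdf as f(x) = (x + 1)/2 on (−1, 1), can be sketched with the same midpoint rule used earlier:

```python
# Midpoint-rule check of E(X), E(X^2), and V(X)
# for the pdf f(x) = (x + 1)/2 on (-1, 1).

def integrate(g, a, b, n=200_000):
    """Midpoint-rule approximation of the integral of g on (a, b)."""
    h = (b - a) / n
    return sum(g(a + (k + 0.5) * h) for k in range(n)) * h

f = lambda x: (x + 1) / 2

ex  = integrate(lambda x: x * f(x), -1, 1)       # ~1/3
ex2 = integrate(lambda x: x**2 * f(x), -1, 1)    # ~1/3
var = ex2 - ex**2                                # ~2/9

print(ex, ex2, var)
```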
Properties of Variance
Let X be a random variable
The expression E[(X − k)^2] assumes its minimum value when k = E(X).
Chebyshev's inequality: for any constant C and any ε > 0, P(|X − C| ≥ ε) ≤ E[(X − C)^2]/ε^2.
b) Choosing C = µ we obtain P(|X − µ| ≥ ε) ≤ Var(X)/ε^2.
c) Choosing C = µ and ε = kσ, where σ^2 = Var(X) > 0, we obtain P(|X − µ| ≥ kσ) ≤ 1/k^2.
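The bound P(|X − µ| ≥ kσ) ≤ 1/k^2 can be illustrated by simulation. The uniform distribution below is just a convenient test case, not taken from the text:

```python
import random
import statistics

def chebyshev_demo(n=100_000, seed=1):
    """Empirical tail probability P(|X - mu| >= k*sigma) vs. the 1/k^2 bound,
    for X ~ Uniform(0, 1)."""
    rng = random.Random(seed)
    xs = [rng.uniform(0, 1) for _ in range(n)]
    mu, sigma = statistics.fmean(xs), statistics.pstdev(xs)
    out = {}
    for k in (1.5, 2, 3):
        tail = sum(abs(x - mu) >= k * sigma for x in xs) / n
        out[k] = (tail, 1 / k**2)
    return out

for k, (tail, bound) in chebyshev_demo().items():
    print(f"k={k}: empirical tail {tail:.4f} <= Chebyshev bound {bound:.4f}")
```

For well-behaved distributions the empirical tail is usually far below the bound, which is what makes Chebyshev's inequality universal but conservative.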
• Remarks:
• −1 ≤ ρ ≤ 1
• If ρ = −1 there is perfect inverse (negative) relationship or
dependence between the two variables X and Y.
• If -1 < ρ < 0, then there is inverse relationship (correlation)
between X and Y. The strength of the relationship will be
determined based on the magnitude of ρ. If - 0.5 < ρ < 0, then the
relationship is inverse-weak. On the other hand, if - 1 < ρ < - 0.5,
then the relationship is inverse-strong.
• If ρ = 0, then the two variables X and Y are uncorrelated: there is no linear
correlation or relationship. It has to be stressed that absence of
correlation is not a guarantee of independence; the converse, however,
is true: independent variables are always uncorrelated.
• If 0 < ρ < 1, then there is direct (positive) relationship between X
and Y. The strength of the relationship will be determined from
the magnitude of ρ. If 0 < ρ < 0.5, the relationship is direct-weak.
If, on the other hand, 0.5 < ρ < 1, the relationship is direct-strong.
• Finally if ρ =1 there is perfect direct relationship between the two
random variables X and Y.
• Theorem: Suppose that X and Y are independent random variables.
Then Cov(X, Y) = 0, and hence ρ = 0.
Remarks:
• The 1st central moment about the mean is zero.
• The second central moment about the mean is the variance of the
random variable.
• All odd central moments of X about μ_X are 0 if the density function of X is
symmetric about μ_X, provided that such moments exist.
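These remarks can be checked on a symmetric discrete distribution (the pmf below is made up for illustration):

```python
from fractions import Fraction as F

# A pmf symmetric about 0: mirrored values carry equal probabilities.
p = {-2: F(1, 8), -1: F(1, 4), 0: F(1, 4), 1: F(1, 4), 2: F(1, 8)}

mu = sum(x * q for x, q in p.items())               # mean, 0 by symmetry
m1 = sum((x - mu) * q for x, q in p.items())        # 1st central moment -> 0
m2 = sum((x - mu)**2 * q for x, q in p.items())     # 2nd central moment = variance
m3 = sum((x - mu)**3 * q for x, q in p.items())     # odd central moment -> 0

print(mu, m1, m2, m3)
```

The cubed deviations at −x and +x cancel in pairs, which is exactly why every odd central moment of a symmetric distribution vanishes.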