L05 Final


LECTURE 5: Discrete random variables:

probability mass functions and expectations

• Random variables: the idea and the definition


- Discrete: take values in finite or countable set

• Probability mass function (PMF)

• Random variable examples


Bernoulli
Uniform
Binomial
Geometric

• Expectation (mean) and its properties


The expected value rule
Linearity
CHAPTER 3

Discrete Random Variables
In most random experiments we are interested in a numerical attribute of the outcome of the experiment. A random variable is defined as a function that assigns a numerical value to the outcome of the experiment. In this chapter we introduce the concept of a random variable and methods for calculating probabilities of events involving a random variable. We focus on the simplest case, that of discrete random variables, and introduce the probability mass function. We define the expected value of a random variable and relate it to our intuitive notion of an average. We also introduce the conditional probability mass function for the case where we are given partial information about the random variable. These concepts and their extension in Chapter 4 provide us with the tools to evaluate the probabilities and averages of interest in the design of systems involving randomness.

Throughout the chapter we introduce important random variables and discuss typical applications where they arise. We also present methods for generating random variables. These methods are used in computer simulation models that predict the behavior and performance of complex modern systems.

3.1 THE NOTION OF A RANDOM VARIABLE


The outcome of a random experiment need not be a number. However, we are usually interested not in the outcome itself, but rather in some measurement or numerical attribute of the outcome. For example, in n tosses of a coin, we may be interested in the total number of heads and not in the specific order in which heads and tails occur. In a randomly selected Web document, we may be interested only in the length of the document. In each of these examples, a measurement assigns a numerical value to the outcome of the random experiment. Since the outcomes are random, the results of the measurements will also be random. Hence it makes sense to talk about the probabilities of the resulting numerical values. The concept of a random variable formalizes this notion.

A random variable X is a function that assigns a real number, X(z), to each outcome z in the sample space of a random experiment. Recall that a function is simply a rule for assigning a numerical value to each element of a set, as shown pictorially in

[Figure: the mapping X(z) = x from an outcome z in S to a point x on the real line, in the range SX.]

FIGURE 3.1
A random variable assigns a number X(z) to each outcome z in the
sample space S of a random experiment.

Fig. 3.1. The specification of a measurement on the outcome of a random experiment


defines a function on the sample space, and hence a random variable. The sample space
S is the domain of the random variable, and the set SX of all values taken on by X is the
range of the random variable. Thus SX is a subset of the set of all real numbers. We will
use the following notation: capital letters denote random variables, e.g., X or Y, and
lower case letters denote possible values of the random variables, e.g., x or y.

Example 3.1 Coin Tosses

A coin is tossed three times and the sequence of heads and tails is noted. The sample space for this experiment is S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}. Let X be the number of heads in the three tosses. X assigns each outcome z in S a number from the set SX = {0, 1, 2, 3}. The table below lists the eight outcomes of S and the corresponding values of X.

z:    HHH  HHT  HTH  THH  HTT  THT  TTH  TTT
X(z):  3    2    2    2    1    1    1    0

X is then a random variable taking on values in the set SX = {0, 1, 2, 3}.

Example 3.2 A Betting Game (an r.v. and a function of an r.v.)

A player pays $1.50 to play the following game: A coin is tossed three times and the number of heads X is counted. The player receives $1 if X = 2 and $8 if X = 3, but nothing otherwise. Let Y be the reward to the player. Y is a function of the random variable X and its outcomes can be related back to the sample space of the underlying random experiment as follows:

z:    HHH  HHT  HTH  THH  HTT  THT  TTH  TTT
X(z):  3    2    2    2    1    1    1    0
Y(z):  8    1    1    1    0    0    0    0

Y is then a random variable taking on values in the set SY = {0, 1, 8}.
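The table in Example 3.2 can be reproduced by brute-force enumeration of the sample space. A minimal sketch (the function and variable names are my own, not from the text):

```python
from itertools import product

# enumerate the sample space of three coin tosses: HHH, HHT, ..., TTT
outcomes = ["".join(seq) for seq in product("HT", repeat=3)]

def X(z):
    # number of Heads in outcome z
    return z.count("H")

def Y(z):
    # reward: $8 if X = 3, $1 if X = 2, nothing otherwise
    return {3: 8, 2: 1}.get(X(z), 0)

# map each outcome back to its (X, Y) pair, as in the table above
table = {z: (X(z), Y(z)) for z in outcomes}
```

The set of values taken by Y is {0, 1, 8}, matching SY above.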


Random variables: the formalism

• A random variable ("r.v.") associates a value (a number) to every possible outcome

• Mathematically: a function from the sample space Ω to the real numbers

• It can take discrete or continuous values

• Notation: random variable X; numerical value x

• We can have several random variables defined on the same sample space

• A function of one or several random variables is also a random variable

  - meaning of X + Y: to each outcome ω, it assigns the value X(ω) + Y(ω)
Probability mass function (PMF) of a discrete r.v. X

• It is the "probability law" or "probability distribution" of X

• If we fix some x, then "X = x" is an event

    pX(x) = P(X = x) = P({ω ∈ Ω s.t. X(ω) = x})

• Properties:  pX(x) ≥ 0,   Σx pX(x) = 1
PMF calculation

• Two rolls of a tetrahedral die; let every possible outcome have probability 1/16

• Let X be the first roll, Y the second roll, and Z = X + Y. Find pZ(z) for all z.

• Repeat for all z:

  - collect all possible outcomes for which Z is equal to z
  - add their probabilities

    pZ(2) = P(Z = 2) = 1/16
    pZ(3) = P(Z = 3) = 2/16
    pZ(4) = P(Z = 4) = 3/16
    ...
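The collect-and-add recipe above can be carried out mechanically. A sketch for Z = X + Y on two tetrahedral-die rolls (exact arithmetic via fractions):

```python
from itertools import product
from fractions import Fraction

# two rolls of a tetrahedral die: 16 equally likely outcomes (x, y)
outcomes = list(product(range(1, 5), repeat=2))

# for each z, collect the outcomes with x + y == z and add their probabilities
pZ = {}
for x, y in outcomes:
    pZ[x + y] = pZ.get(x + y, Fraction(0)) + Fraction(1, 16)
```

The resulting dictionary matches the values computed by hand: pZ[2] = 1/16, pZ[3] = 2/16, pZ[4] = 3/16, and the probabilities sum to 1.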
The simplest random variable: Bernoulli with parameter p ∈ [0, 1]

    X = 1, w.p. p
        0, w.p. 1 - p

i.e., pX(1) = p and pX(0) = 1 - p.

• Models a trial that results in success/failure, Heads/Tails, etc.

• Indicator r.v. of an event A: IA = 1 iff A occurs
Discrete uniform random variable; parameters a, b

• Parameters: integers a, b with a ≤ b

• Experiment: Pick one of a, a + 1, ..., b at random; all equally likely

• Sample space: {a, a + 1, ..., b}, which has b - a + 1 elements

• Random variable X: X(ω) = ω

• PMF: pX(x) = 1/(b - a + 1) for x = a, a + 1, ..., b

• Model of: complete ignorance

• Special case a = b: constant/deterministic r.v., with pX(a) = 1
Binomial random variable; parameters: positive integer n; p ∈ [0, 1]

• Experiment: n independent tosses of a coin with P(Heads) = p

• Sample space: set of sequences of H and T, of length n

• Random variable X: number of Heads observed

• Model of: number of successes in a given number of independent trials

• Example (n = 3), from the tree of outcomes:

    pX(2) = P(X = 2) = P(HHT) + P(HTH) + P(THH) = 3p²(1 - p)

• In general:  pX(k) = (n choose k) pᵏ(1 - p)ⁿ⁻ᵏ,  k = 0, 1, ..., n
[Figure: plots of the binomial PMF for n = 3, n = 10, and n = 100, with p = 0.5, p = 0.2, and p = 0.1.]
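The binomial PMF formula can be cross-checked against direct enumeration of coin-toss sequences, mirroring the tree computation of pX(2) above. A sketch with illustrative parameter values (the helper names are my own):

```python
from math import comb
from itertools import product

def binom_pmf(k, n, p):
    # number of length-n H/T sequences with exactly k Heads, times
    # the probability of any one such sequence
    return comb(n, k) * p**k * (1 - p)**(n - k)

def pmf_by_enumeration(k, n, p):
    # add P(sequence) over all sequences with exactly k Heads
    total = 0.0
    for seq in product("HT", repeat=n):
        h = seq.count("H")
        if h == k:
            total += p**h * (1 - p)**(n - h)
    return total
```

For n = 3 and p = 0.5, both routes give pX(2) = 3p²(1 - p) = 0.375, and the PMF sums to 1 over k = 0, ..., n.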
Geometric random variable; parameter p: 0 < p < 1

• Experiment: infinitely many independent tosses of a coin; P(Heads) = p

• Sample space: set of infinite sequences of H and T, e.g., TTTTHHT...

• Random variable X: number of tosses until the first Heads

• Model of: waiting times; number of trials until a success

    pX(k) = P(X = k) = P(TT...TH) = (1 - p)ᵏ⁻¹ p,  k = 1, 2, ...
            (k - 1 Tails followed by one Heads)

• P(no Heads ever) = lim as k → ∞ of (1 - p)ᵏ = 0

[Figure: plot of pX(k) for p = 1/3; the PMF decays geometrically in k.]
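A quick numerical sanity check of the geometric PMF, using the slide's plotted value p = 1/3: the probabilities sum to 1, and P(no Heads in the first k tosses) = (1 - p)ᵏ vanishes as k grows.

```python
def geom_pmf(k, p):
    # P(first Heads on toss k) = P(k - 1 Tails, then Heads)
    return (1 - p)**(k - 1) * p

p = 1 / 3  # the value plotted on the slide

# partial sum over k = 1, ..., 100; the missing tail is exactly (1 - p)**100,
# the probability of no Heads in the first 100 tosses
partial = sum(geom_pmf(k, p) for k in range(1, 101))
```

With p = 1/3 the tail (2/3)^100 is astronomically small, so the partial sum is 1 to within floating-point precision.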
Expectation/mean of a random variable

• Motivation: Play a game 1000 times, with the random gain at each play described by a PMF pX. Out of 1000 plays, a gain of x occurs roughly 1000·pX(x) times, so the total gain is about Σx 1000·pX(x)·x, and the "average" gain per play is about Σx x pX(x).

• Interpretation: average in a large number of independent repetitions of the experiment

• Definition: E[X] = Σx x pX(x)

• Caution: If we have an infinite sum, it needs to be well-defined.

  We assume Σx |x| pX(x) < ∞
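The "average gain in many plays" interpretation can be illustrated by simulation. The slide's exact gain distribution is not recoverable from the scan, so the PMF below is a made-up example:

```python
import random

# hypothetical gain PMF (illustrative values, not the slide's):
# gain 1 w.p. 0.5, gain 2 w.p. 0.3, gain 4 w.p. 0.2
pmf = {1: 0.5, 2: 0.3, 4: 0.2}

# definition: E[X] = sum over x of x * p_X(x)  -> 1*0.5 + 2*0.3 + 4*0.2 = 1.9
expectation = sum(x * px for x, px in pmf.items())

# simulate many independent plays; the empirical average approaches E[X]
random.seed(0)
values, weights = zip(*pmf.items())
plays = random.choices(values, weights=weights, k=100_000)
average = sum(plays) / len(plays)
```

With 100,000 plays the empirical average lands within a few hundredths of E[X] = 1.9, matching the interpretation above.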
Expectation of a Bernoulli r.v.

    X = 1, w.p. p
        0, w.p. 1 - p

    E[X] = 1·p + 0·(1 - p) = p

• If X is the indicator of an event A, X = IA (so X = 1 if A occurs, X = 0 otherwise):

    E[IA] = 1·P(A) + 0·(1 - P(A)) = P(A)
Expectation of a uniform r.v.

• Uniform on 0, 1, ..., n:  pX(x) = 1/(n + 1) for x = 0, 1, ..., n

• Definition: E[X] = Σx x pX(x)

    E[X] = 0·1/(n + 1) + 1·1/(n + 1) + ... + n·1/(n + 1)
         = 1/(n + 1) · (0 + 1 + ... + n)
         = 1/(n + 1) · n(n + 1)/2
         = n/2
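The closed form E[X] = n/2 can be verified directly from the definition, in exact arithmetic (the helper name is my own):

```python
from fractions import Fraction

def uniform_mean(n):
    # E[X] = sum over x of x * p_X(x), with p_X(x) = 1/(n + 1) on {0, 1, ..., n}
    p = Fraction(1, n + 1)
    return sum(x * p for x in range(n + 1))
```

For every n the sum collapses to n/2 exactly, as in the derivation above.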
Expectation as a population average

• n students

• Weight of ith student: x_i

• Experiment: pick a student at random, all equally likely

• Random variable X: weight of selected student

  - assume the x_i are distinct

• PMF: pX(x_i) = 1/n

• E[X] = Σi x_i · (1/n) = (x_1 + x_2 + ... + x_n)/n
Elementary properties of expectations

• Definition: E[X] = Σx x pX(x)

• If X ≥ 0, then E[X] ≥ 0

  (for all ω: X(ω) ≥ 0, so every term x pX(x) in the sum is ≥ 0)

• If a ≤ X ≤ b, then a ≤ E[X] ≤ b

  (for all ω: a ≤ X(ω) ≤ b, so E[X] = Σx x pX(x) ≥ Σx a pX(x) = a·1 = a; the upper bound is similar)

• If c is a constant, E[c] = c
The expected value rule, for calculating E[g(X)]

• Let X be a r.v. and let Y = g(X)

• Averaging over y: E[Y] = Σy y pY(y)

• Averaging over x: E[Y] = E[g(X)] = Σx g(x) pX(x)

• Example from the figure: g maps two x-values, with probabilities 0.1 and 0.2, to y = 3, and two x-values, with probabilities 0.3 and 0.4, to y = 4:

    averaging over x: E[Y] = 3·0.1 + 3·0.2 + 4·0.3 + 4·0.4
    averaging over y: E[Y] = 3·(0.1 + 0.2) + 4·(0.3 + 0.4)

Proof:
    Σx g(x) pX(x) = Σy Σ{x: g(x)=y} g(x) pX(x)
                  = Σy y Σ{x: g(x)=y} pX(x)
                  = Σy y pY(y) = E[Y]

• Example: E[X²] = Σx x² pX(x)

• Caution: In general, E[g(X)] ≠ g(E[X]); in particular, E[X²] ≠ (E[X])²
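Both routes to E[g(X)] can be checked numerically, along with the caution that E[X²] ≠ (E[X])². The PMF below is a hypothetical example, since the slide's worked numbers are only partly legible:

```python
# hypothetical PMF (illustrative, not from the slide)
pX = {1: 0.1, 2: 0.2, 3: 0.3, 4: 0.4}

def g(x):
    return x * x  # Y = X**2

# averaging over x: E[g(X)] = sum over x of g(x) * p_X(x)
E_gX = sum(g(x) * p for x, p in pX.items())

# averaging over y: first collect p_Y(y) = sum over {x : g(x) = y} of p_X(x)
pY = {}
for x, p in pX.items():
    pY[g(x)] = pY.get(g(x), 0.0) + p
E_Y = sum(y * p for y, p in pY.items())

E_X = sum(x * p for x, p in pX.items())
```

Here both routes give E[X²] = 10, while (E[X])² = 9, confirming that E[g(X)] and g(E[X]) differ in general.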
Linearity of expectation: E[aX + b] = aE[X] + b

• Example: if X is a salary and the new salary is Y = 2X + 100, then E[Y] = E[2X + 100] = 2E[X] + 100

• Intuitive

• Derivation, based on the expected value rule with Y = g(X), g(x) = ax + b:

    E[Y] = Σx (ax + b) pX(x) = a Σx x pX(x) + b Σx pX(x) = aE[X] + b·1 = aE[X] + b
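A direct check of linearity, echoing the salary example Y = 2X + 100 (the PMF values are a hypothetical example of my own):

```python
# hypothetical PMF for the "salary" X
pX = {0: 0.25, 1: 0.5, 2: 0.25}
a, b = 2, 100  # new salary Y = 2X + 100

E_X = sum(x * p for x, p in pX.items())

# compute E[aX + b] directly via the expected value rule ...
E_Y = sum((a * x + b) * p for x, p in pX.items())
# ... and it coincides with a * E[X] + b, as linearity asserts
```

Here E[X] = 1, so both computations give E[Y] = 102.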
MIT OpenCourseWare
https://ocw.mit.edu

Resource: Introduction to Probability


John Tsitsiklis and Patrick Jaillet

The following may not correspond to a particular course on MIT OpenCourseWare, but has been provided by the author as an individual learning resource.

For information about citing these materials or our Terms of Use, visit: https://ocw.mit.edu/terms.
