Article 6
Distributions
Go to “BACKGROUND COURSE NOTES” at the end of my web page and
download the file distributions.
Today we say goodbye to the elementary theory of probability and start Chapter
3. We will open the door to the application of algebra to probability theory by
introducing the concept of a "random variable".
Intuitive Idea
A random variable is a function, whose values have probabilities attached.
Remark
To go from the mathematical definition to the “intuitive idea” is tricky and not
really that important at this stage.
Basic Example
Flip a fair coin three times and let X = the number of heads.
What are
P(X = 0), P(X = 1), P(X = 2), P(X = 3)?
P(X = 0) = P(TTT) = 1/8
P(X = 1) = P(HTT) + P(THT) + P(TTH) = 3/8
P(X = 2) = P(HHT) + P(HTH) + P(THH) = 3/8
P(X = 3) = P(HHH) = 1/8
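These probabilities can be checked by brute force. Here is a short Python sketch (not part of the original notes) that enumerates the eight equally likely outcomes and tallies the number of heads:

```python
from fractions import Fraction
from itertools import product

# Enumerate all 8 equally likely outcomes of three coin flips.
outcomes = list(product("HT", repeat=3))

# pmf of X = number of heads
pmf = {k: Fraction(0) for k in range(4)}
for outcome in outcomes:
    heads = outcome.count("H")
    pmf[heads] += Fraction(1, len(outcomes))

print(pmf)  # {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}
```

Using exact fractions avoids any floating-point rounding in the comparison.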
We will tabulate this:

Value x                           0    1    2    3
Probability of the value P(X = x) 1/8  3/8  3/8  1/8

Get used to such tabular presentations.
Value x   0  1
P(X = x)  q  p
Definition
A Bernoulli experiment is an experiment which has two outcomes, which we call
(by convention) "success" S and "failure" F.
Example
Flipping a coin. We will call a head a success and a tail a failure.
⚠ Often we call a "success" something that is in fact far from an actual success,
e.g., a machine breaking down.
The sample space is S = {S, F}. Define
X(S) = 1 and X(F) = 0,
so
P(X = 1) = P(S) = p and P(X = 0) = P(F) = q.
A subset of R is discrete if each of its points can be enclosed in an interval
containing no other point of the set. So a finite subset of R is discrete, but so
is the set of integers Z.
Definition
A random variable is said to be discrete if its set of possible values is a discrete
set.
The probability mass function (pmf) of a discrete random variable X is
pX(x) = P(X = x).
Value x   1  0
P(X = x)  p  q

[Table, line graph, and histogram of the Bernoulli pmf over x = 0, 1.]
Value x   0    1    2    3
P(X = x)  1/8  3/8  3/8  1/8

[Table, line graph, and histogram of this pmf over x = 0, 1, 2, 3.]
[Line graph of the pmf over x = 0, 1, 2, 3.]
So we start accumulating probability at X = 0.
[Ordinary graph of F over x = 0, 1, 2, 3.]
You can see you have to be careful about the inequalities on the right-hand side.
We will need the relation between the probability mass function p (x ) and the
cumulative distribution function F (x ). Recall that if F (x ) is a function of an
integer variable x then the backward difference function (discrete derivative)
∆F of F is defined by
∆F (x ) = F (x ) − F (x − 1).
Theorem 1
p (x ) = ∆F (x )
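Theorem 1 is easy to verify numerically. The following Python sketch (an addition, not from the original notes) builds F from the three-coin pmf and checks that the backward difference recovers p(x):

```python
from fractions import Fraction

# pmf of X = number of heads in three fair coin flips
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

# cumulative distribution function F(x) = P(X <= x)
def F(x):
    return sum(p for value, p in pmf.items() if value <= x)

# backward difference recovers the pmf: p(x) = F(x) - F(x - 1)
for x in range(4):
    assert F(x) - F(x - 1) == pmf[x]

print(F(3))  # 1
```

Note that F(-1) = 0 (an empty sum), so the check works even at the smallest value x = 0.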
Definition
Let X be a discrete random variable with set of possible values D and pmf p(x).
The expected value or mean value of X, denoted E(X) or µ (Greek letter mu), is
defined by
E(X) = Σ_{x∈D} x · P(X = x) = Σ_{x∈D} x · p(x).
Remark
E (X ) is the whole point for monetary games of chance e.g., lotteries, blackjack,
slot machines.
If X = your payoff, the operators of these games make sure E(X) < 0. Thorp's
card-counting strategy in blackjack changed E(X) < 0 (because ties went to the
dealer) to E(X) > 0, to the dismay of the casinos. See "Beat the Dealer"
by Edward Thorp (a math professor at UC Irvine).
The expected value for the basic example (so the expected number of heads):
E(X) = (0)(1/8) + (1)(3/8) + (2)(3/8) + (3)(1/8)
     = 3/2
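The same sum can be checked in a couple of lines of Python (a sketch added here, not part of the original notes):

```python
from fractions import Fraction

# pmf of X = number of heads in three fair coin flips
pmf = {0: Fraction(1, 8), 1: Fraction(3, 8), 2: Fraction(3, 8), 3: Fraction(1, 8)}

# E(X) = sum of x * p(x) over all possible values
mean = sum(x * p for x, p in pmf.items())
print(mean)  # 3/2
```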
Rolling a Die
E(X) = (1)(1/6) + (2)(1/6) + (3)(1/6) + (4)(1/6) + (5)(1/6) + (6)(1/6)
     = (1/6)[1 + 2 + 3 + 4 + 5 + 6] = (1/6) · (6)(7)/2
     = 7/2.
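As a quick sanity check (an addition, not from the original notes), the die's expected value can be computed exactly:

```python
from fractions import Fraction

# E(X) for one roll of a fair die: each face 1..6 has probability 1/6
mean = sum(Fraction(x, 6) for x in range(1, 7))
print(mean)  # 7/2
```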
Tossing a fair coin with payoff X(H) = +1, X(T) = −1:
E(X) = (+1)(1/2) + (−1)(1/2) = 0
(i) V(X) = E(X²) − E(X)²
or
(ii) V(X) = Σ_{x∈D} x² p(x) − µ²
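The shortcut formula (ii) can be verified against the definition V(X) = E[(X − µ)²] on the die example. A Python sketch (added here, not from the original notes):

```python
from fractions import Fraction

# pmf of one roll of a fair die
pmf = {x: Fraction(1, 6) for x in range(1, 7)}
mu = sum(x * p for x, p in pmf.items())

# definition: V(X) = E[(X - mu)^2]
var_def = sum((x - mu) ** 2 * p for x, p in pmf.items())

# shortcut (ii): V(X) = sum of x^2 p(x), minus mu^2
var_shortcut = sum(x * x * p for x, p in pmf.items()) - mu ** 2

assert var_def == var_shortcut
print(var_def)  # 35/12
```

Both forms agree, as the algebra promises; the shortcut just avoids subtracting µ inside every term.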
Remark
Logically, version (i) of the shortcut formula is not correct because we haven’t yet
defined the random variable X 2 .
We will do this soon - “change of random variable”.
E(X) = µ = 7/2

E(X²) = (1)²(1/6) + (2)²(1/6) + (3)²(1/6)
      + (4)²(1/6) + (5)²(1/6) + (6)²(1/6)
      = (1/6)[1² + 2² + 3² + 4² + 5² + 6²]
      = (1/6)(91)  ←− later

So
E(X²) = 91/6.
(Here, don't forget to square the values.)
Now plug in n = 6.
(2) In the formula for E(X²), square the values but not the probabilities.
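Both the direct sum and the sum-of-squares formula Σk² = n(n+1)(2n+1)/6 can be checked with a short Python sketch (an addition, not part of the original notes):

```python
from fractions import Fraction

# E(X^2) for one roll of a fair die: square the values, not the probabilities
ex2 = sum(Fraction(x * x, 6) for x in range(1, 7))
print(ex2)  # 91/6

# the sum-of-squares formula n(n+1)(2n+1)/6 gives the same 91 for n = 6
n = 6
assert n * (n + 1) * (2 * n + 1) // 6 == 91
```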