Discrete Random Variable
A rule that assigns a real number to each outcome of a random experiment is called a random variable.
The rule is simply a function X that assigns a unique real value to each outcome of the random experiment.
When a variable X takes the value xi with probability pi (i = 1, 2, 3, …, n), then X is called a random variable, stochastic variable, or variate.
There are two types of random variables: discrete random variables and continuous random variables.
DISCRETE RANDOM VARIABLE
A random variable X which can take only a finite number of values in an interval of the domain is called a discrete random variable.
If a random variable X can assume a discrete set of values x1, x2, …, xn with respective probabilities p1, p2, …, pn such that p1 + p2 + … + pn = 1, then the set of values xi together with their respective probabilities pi is called the discrete probability distribution of X.
Example:

X    : 2     3     4     5     6     7     8     9     10    11    12
P(X) : 1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
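These probabilities are exactly those of the sum of the numbers shown when two fair dice are thrown. As a quick illustration (a minimal Python sketch, not part of the original notes), the table can be reproduced by enumerating the 36 equally likely outcomes:

```python
from collections import Counter
from fractions import Fraction

# Enumerate the 36 equally likely outcomes of throwing two fair dice
# and count how often each sum X = d1 + d2 occurs.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))

# P(X = x) = (number of outcomes giving x) / 36, kept as exact fractions.
pmf = {x: Fraction(c, 36) for x, c in sorted(counts.items())}

for x, p in pmf.items():
    print(x, p)   # 2 1/36, 3 1/18 (= 2/36), ..., 7 1/6 (= 6/36), ..., 12 1/36
```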
Probability Function or Probability Mass Function (pmf)
A function p(xi) = P(X = xi), giving the probability that X takes the value xi, is called the probability mass function of X if
(i)  p(xi) ≥ 0 for all i
(ii) Σ p(xi) = 1, where the sum runs over all values xi of X
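Continuing the dice-sum illustration (again a sketch, not part of the notes), both conditions can be checked directly:

```python
from collections import Counter
from fractions import Fraction

# Rebuild the dice-sum pmf from the previous sketch.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
pmf = {x: Fraction(c, 36) for x, c in counts.items()}

assert all(p >= 0 for p in pmf.values())   # condition (i): p(xi) >= 0
assert sum(pmf.values()) == 1              # condition (ii): probabilities sum to 1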
Cumulative Distribution Function (Distribution Function)
If X is a random variable, then P(X ≤ x) is called the cumulative distribution function (cdf) or distribution function of X and is denoted by F(x).
So, F(x) = P(X ≤ x)
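For a discrete random variable, F(x) is just a running sum of the pmf values up to x. A small sketch, assuming the same dice-sum pmf used above:

```python
from collections import Counter
from fractions import Fraction
from itertools import accumulate

# Dice-sum pmf, as in the earlier sketches.
counts = Counter(d1 + d2 for d1 in range(1, 7) for d2 in range(1, 7))
xs = sorted(counts)
pmf = {x: Fraction(counts[x], 36) for x in xs}

# F(x) = P(X <= x): cumulative sum of the pmf over values up to x.
cdf = dict(zip(xs, accumulate(pmf[x] for x in xs)))

print(cdf[7])    # P(X <= 7) = 21/36 = 7/12
print(cdf[12])   # 1, since X is certain to be at most 12
```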
Expectation of a Discrete Random Variable
If X is a discrete random variable which assumes the values x1, x2, …, xn with respective probabilities p1, p2, …, pn, then the expectation or expected value of X is denoted by E(X) and defined as
E(X) = p1x1 + p2x2 + … + pnxn = Σ pixi, the sum running over i = 1 to n.
Properties of expectation (a is a constant and μ = E(X)):
i.   E(a) = a
ii.  E(aX) = aE(X)
iii. E(X − μ) = 0
Example:

X    : 0    1    2
P(X) : 1/4  1/2  1/4

E(X) = (1/4 × 0) + (1/2 × 1) + (1/4 × 2) = 1
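A short sketch (an illustrative check, not part of the notes) that computes E(X) for this table and also confirms property (iii), E(X − μ) = 0:

```python
from fractions import Fraction

# The table above: X takes 0, 1, 2 with probabilities 1/4, 1/2, 1/4.
pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}

def expectation(pmf):
    """E(X) = sum of pi * xi over all values xi of X."""
    return sum(p * x for x, p in pmf.items())

mu = expectation(pmf)
print(mu)   # 1, matching the hand calculation above

# Property (iii): E(X - mu) = 0.
print(sum(p * (x - mu) for x, p in pmf.items()))   # 0
```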