Introduction To Probability: 2.1 Random Variable
Exercise
1. Consider an experiment of tossing a coin and identify S, X(s), {s_1, s_2, ..., s_k}
and {x_1, x_2, ..., x_k}.
4. Consider an experiment of rolling two dice and identify S, X(s), {s_1, s_2, ..., s_k}
and {x_1, x_2, ..., x_k}.
Cumulative Distribution Function
The cumulative distribution function (CDF) of a random variable X gives the probability that X does not exceed a value x:

F_X(x) = P(X ≤ x)    (2.1)
Example
As a proof of concept, consider an example of the probability distribution of a discrete random variable. Assume an experiment of “rolling two dice” and define the events corresponding to the possible values of the sum of the two faces. The sample space S comprises eleven such events. Map the outcome of each event to the real line and draw F_X(x). [Caution: for the CDF, also consider events with sum < 2 and sum > 12.]
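A minimal Octave sketch of this example (Octave is the tool named in this chapter's exercises): it enumerates the 36 equally likely outcomes of the two dice, maps each outcome to its sum, and accumulates the probabilities into F_X(x); the plotting calls are optional.

% Two-dice example: map each outcome to the sum X(s) and build the CDF.
[d1, d2] = meshgrid(1:6, 1:6);            % all 36 equally likely outcomes
sums = d1(:) + d2(:);                     % X(s): sum of the two faces
x  = 2:12;                                % the eleven possible values of X
pX = arrayfun(@(v) mean(sums == v), x);   % P(X = x) for each value
FX = cumsum(pX);                          % F_X(x) = P(X <= x)
stairs([1 x], [0 FX]);                    % prepend a point so the plot shows F_X = 0 below x = 2
xlabel('x'); ylabel('F_X(x)');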
The CDF F_X(x) satisfies the following properties:
2. F_X(−∞) = 0
3. F_X(+∞) = 1
4. F_X(x_2) ≥ F_X(x_1) if x_2 ≥ x_1
Exercise
Find the probabilities of the following events in terms of F_X(x):
1. P(a < X ≤ b)
2. P(a < X < b)
3. P(a ≤ X < b)
4. P(X > b)
Since the transformation is z = g(x) for any x, it follows that x = g^{-1}(z), and p_Z(z) can be expressed as

p_Z(z) = p_X(g^{-1}(z)) |dg^{-1}(z)/dz|.    (2.3)
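As a hedged numerical check of Eq. (2.3) in Octave, assume (purely for illustration) that X is uniform on (0, 1) and z = g(x) = x^2, so that g^{-1}(z) = sqrt(z) and |dg^{-1}(z)/dz| = 1/(2 sqrt(z)); the histogram of the transformed samples should then follow p_Z(z) = 1/(2 sqrt(z)) on (0, 1).

% Numerical check of Eq. (2.3) for the assumed case X ~ Uniform(0,1), Z = X^2.
X = rand(1, 1e6);                         % samples of X
Z = X .^ 2;                               % transformed samples z = g(x)
[counts, centers] = hist(Z, 50);          % empirical histogram of Z
binw = centers(2) - centers(1);
pZ_emp = counts / (numel(Z) * binw);      % normalise counts to a density
pZ_theory = 1 ./ (2 * sqrt(centers));     % p_X(g^{-1}(z)) |dg^{-1}(z)/dz|
plot(centers, pZ_emp, 'o', centers, pZ_theory, '-');
legend('empirical', 'Eq. (2.3)');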
Example
Consider the linear transformation Z = aX + b, where a and b are constants. It is evident that Z = g(X), hence

z = g(x) = ax + b
x = g^{-1}(z) = (z − b)/a
⇒ dg^{-1}(z)/dz = 1/a
⇒ p_Z(z) = (1/|a|) p_X((z − b)/a)
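The same kind of check can be sketched in Octave for this linear case, assuming (arbitrarily) a = 2, b = 3 and a standard normal X; the empirical density of Z should match p_X((z − b)/a)/|a|.

% Check of p_Z(z) = p_X((z-b)/a)/|a| for Z = aX + b, with assumed a = 2, b = 3.
a = 2; b = 3;
X = randn(1, 1e6);                        % X assumed standard normal
Z = a * X + b;
[counts, centers] = hist(Z, 60);
binw = centers(2) - centers(1);
pZ_emp = counts / (numel(Z) * binw);      % empirical density of Z
pX = @(x) exp(-x.^2 / 2) / sqrt(2*pi);    % density of X
pZ_theory = pX((centers - b) / a) / abs(a);
plot(centers, pZ_emp, 'o', centers, pZ_theory, '-');
legend('empirical', 'p_X((z-b)/a)/|a|');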
It is shown in the subsequent chapters that a noisy received signal is of the form
Z = aX + b.
Exercise
Use OCTAVE to solve the following parts.
Mean
The mean of a random variable X, denoted by m_x, is the sum of the values of X weighted by their probabilities.
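Written out for a discrete random variable taking values x_i with probabilities P(X = x_i), this weighted sum is

m_x = E[X] = ∑_i x_i P(X = x_i).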
Exercise
Use the ‘randint’ function to generate random integers and compute the following sum:

∑_i n_i x_i = n_1 x_1 + n_2 x_2 + ···    (2.4)
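A possible Octave sketch for this exercise: note that randint comes from the older communications package, so the sketch below substitutes the built-in randi (an assumption about the available setup); the die-valued integers and sample size are illustrative.

% Generate random integers, tally counts n_i of each value x_i, and form
% the weighted sum of Eq. (2.4); dividing by the total count gives the mean.
vals = randi(6, 1, 1000);                 % e.g. 1000 random integers in 1..6
x = unique(vals);                         % distinct values x_i
n = arrayfun(@(v) sum(vals == v), x);     % counts n_i of each value
weighted_sum = sum(n .* x);               % sum_i n_i x_i, as in Eq. (2.4)
sample_mean = weighted_sum / numel(vals); % weighted sum over total count
disp(sample_mean)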
The expectation of g(X), i.e., E[g(X)], can be obtained from p_X(x) using the relation

E[g(X)] = ∫_{−∞}^{∞} g(x) p_X(x) dx    (2.7)
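As an illustration, Eq. (2.7) can be evaluated numerically in Octave; the sketch below assumes g(x) = x^2 and a standard normal p_X(x), for which E[g(X)] = 1.

% Numerical evaluation of Eq. (2.7) for the assumed case g(x) = x^2,
% X standard normal; the result should be close to 1.
x  = linspace(-10, 10, 1e5);              % grid standing in for (-inf, inf)
pX = exp(-x.^2 / 2) / sqrt(2*pi);         % standard normal density
g  = x .^ 2;                              % the function g(x)
Eg = trapz(x, g .* pX);                   % integral of g(x) p_X(x) dx
disp(Eg)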
Moments
The n-th moment of a random variable X is defined as

E[X^n] = ∫_{−∞}^{∞} x^n p_X(x) dx    (2.8)
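A small Octave sketch estimating the first few moments of Eq. (2.8) from samples, again assuming a standard normal X purely for illustration; the exact values are 0, 1, 0 and 3.

% Sample-based estimates of E[X^n] for n = 1..4, with X assumed standard normal.
samples = randn(1, 1e6);                  % draws of X
for n = 1:4
  printf("E[X^%d] ~ %.3f\n", n, mean(samples .^ n));   % n-th sample moment
end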