Basic Probability Theory: Lect04.ppt S-38.145 - Introduction To Teletraffic Theory - Spring 2005
Contents
• Basic concepts
• Discrete random variables
• Discrete distributions (nbr distributions)
• Continuous random variables
• Continuous distributions (time distributions)
• Other random variables
4. Basic probability theory
Combination of events
• Union “A or B”: A ∪ B = {ω ∈ Ω | ω ∈ A or ω ∈ B}
• Intersection “A and B”: A ∩ B = {ω ∈ Ω | ω ∈ A and ω ∈ B}
• Complement “not A”: Ac = {ω ∈ Ω | ω ∉ A}
• Events A and B are disjoint if
– A∩B=∅
• A set of events {B1, B2, …} is a partition of event A if
– (i) Bi ∩ Bj = ∅ for all i ≠ j
– (ii) ∪i Bi = A
– Figure: event A partitioned into disjoint events B1, B2, B3
Probability
Conditional probability
• Definition: the conditional probability of A given B (with P(B) > 0) is
P(A | B) = P(A ∩ B) / P(B)
• It follows that
P(A ∩ B) = P(B) P(A | B) = P(A) P(B | A)
Bayes’ theorem
• Recall the law of total probability: for a partition {B1, B2, …} of Ω,
P(A) = ∑i P(Bi) P(A | Bi)
P(Bi | A) = P(A ∩ Bi) / P(A) = P(Bi) P(A | Bi) / P(A)
• Substituting the law of total probability for P(A) gives
P(Bi | A) = P(Bi) P(A | Bi) / ∑j P(Bj) P(A | Bj)
• This is Bayes’ theorem
– Probabilities P(Bi) are called a priori probabilities of events Bi
– Probabilities P(Bi | A) are called a posteriori probabilities of events Bi
(given that the event A occurred)
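As a numeric sketch of Bayes' theorem (the partition {B1, B2} and all probability values below are invented for illustration):

```python
# Bayes' theorem on a two-event partition {B1, B2} of the sample space.
# All probability values here are illustrative assumptions.
prior = {"B1": 0.3, "B2": 0.7}        # a priori probabilities P(Bi)
likelihood = {"B1": 0.9, "B2": 0.2}   # conditional probabilities P(A | Bi)

# Law of total probability: P(A) = sum_i P(Bi) P(A | Bi)
p_A = sum(prior[b] * likelihood[b] for b in prior)

# A posteriori probabilities: P(Bi | A) = P(Bi) P(A | Bi) / P(A)
posterior = {b: prior[b] * likelihood[b] / p_A for b in prior}
```

Here p_A = 0.41, and observing A raises the probability of B1 from 0.3 to about 0.66.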
• Definition: events A and B are independent if
P(A ∩ B) = P(A) P(B)
• It follows that
P(A | B) = P(A ∩ B) / P(B) = P(A) P(B) / P(B) = P(A)
• Correspondingly:
P(B | A) = P(A ∩ B) / P(A) = P(A) P(B) / P(A) = P(B)
Random variables
• Definition: a real-valued random variable X is a function X: Ω → ℜ such that, for every x ∈ ℜ, the sets
{X ≤ x} := {ω ∈ Ω | X(ω) ≤ x} ⊂ Ω
belong to the set of events ℱ, that is,
{X ≤ x} ∈ ℱ
Example
– Figure: a random variable X on a sample space of eight points ω, taking the values X(ω) = 0, 1, 1, 1, 2, 2, 2, 3
Indicators of events
1A(ω) = 1, if ω ∈ A
1A(ω) = 0, if ω ∉ A
• Clearly:
P{1A = 1} = P( A)
P{1A = 0} = P ( Ac ) = 1 − P( A)
• The cumulative distribution function (cdf) of a random variable X is
FX(x) = P{X ≤ x}
• Random variables X and Y are independent if, for all x and y,
P{X ≤ x, Y ≤ y} = P{X ≤ x} P{Y ≤ y}
• More generally, random variables X1, …, Xn are independent if, for all x1, …, xn,
P{X1 ≤ x1, …, Xn ≤ xn} = P{X1 ≤ x1} ⋯ P{Xn ≤ xn}
• Let X1, …, Xn be independent random variables and let Xmax := max{X1, …, Xn}. Then
P{Xmax ≤ x} = P{X1 ≤ x, …, Xn ≤ x}
= P{X1 ≤ x} ⋯ P{Xn ≤ x}
Discrete random variables
• Definition: a random variable X is discrete if there is a countable set SX = {x1, x2, …} such that
P{X ∈ SX} = 1
• It follows that
– P{X = x} ≥ 0 for all x ∈ SX
– P{X = x} = 0 for all x ∉ SX
• The set SX is called the value set
Point probabilities
pi := P{X = xi}, i = 1, 2, …
• Cumulative distribution function (cdf):
FX(x) = P{X ≤ x} = ∑i: xi ≤ x pi
Example
– Figure: probability mass function (pmf) pX(x) and cumulative distribution function (cdf) FX(x) of a discrete random variable with value set {x1, x2, x3, x4}
• Discrete random variables X and Y are independent if, for all xi and yj,
P{X = xi, Y = yj} = P{X = xi} P{Y = yj}
Expectation
µX := E[X] := ∑x∈SX P{X = x}·x = ∑x∈SX pX(x) x = ∑i pi xi
– Note 1: The expectation exists only if Σi pi|xi| < ∞
– Note 2: If Σi pi xi = ∞, then we may denote E[X] = ∞
• Properties:
– (i) c ∈ ℜ ⇒ E[cX] = cE[X]
– (ii) E[X + Y] = E[X] + E[Y]
– (iii) X and Y independent ⇒ E[XY] = E[X]E[Y]
Variance
D2[X] := Var[X] := E[(X − E[X])2] = E[X2] − E[X]2
• Properties:
– (i) c ∈ ℜ ⇒ D2[cX] = c2D2[X]
– (ii) X and Y independent ⇒ D2[X + Y] = D2[X] + D2[Y]
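The expectation and variance formulas can be checked directly from the point probabilities; the value set and probabilities below are illustrative assumptions:

```python
# E[X] and D2[X] computed from the point probabilities p_i = P{X = x_i}.
# The value set and probabilities are illustrative assumptions.
xs = [0, 1, 2, 3]
ps = [0.1, 0.4, 0.3, 0.2]                        # must sum to 1

mean = sum(p * x for p, x in zip(ps, xs))        # E[X] = sum_i p_i x_i
second = sum(p * x * x for p, x in zip(ps, xs))  # E[X^2]
variance = second - mean ** 2                    # D2[X] = E[X^2] - E[X]^2
```

For this pmf, E[X] = 1.6 and D2[X] = 0.84.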
Covariance
Cov[ X , Y ] = E[ XY ] − E[ X ]E[Y ]
• Properties:
– (i) Cov[X,X] = Var[X]
– (ii) Cov[X,Y] = Cov[Y,X]
– (iii) Cov[X+Y,Z] = Cov[X,Z] + Cov[Y,Z]
– (iv) X and Y independent ⇒ Cov[X,Y] = 0
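A small check of the covariance formula, on an assumed joint distribution with four equally likely (X, Y) pairs:

```python
# Cov[X, Y] = E[XY] - E[X]E[Y] on a finite sample space with equally
# likely outcomes; the joint values below are illustrative assumptions.
pairs = [(0, 0), (1, 1), (1, 2), (2, 3)]        # equally likely (X, Y)
n = len(pairs)

e_x = sum(x for x, _ in pairs) / n              # E[X]
e_y = sum(y for _, y in pairs) / n              # E[Y]
e_xy = sum(x * y for x, y in pairs) / n         # E[XY]
cov = e_xy - e_x * e_y
```

Here E[X] = 1.0, E[Y] = 1.5, E[XY] = 2.25, so Cov[X, Y] = 0.75.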
• Standard deviation: σX := D[X] := √(D2[X]) = √(Var[X])
• Coefficient of variation: cX := C[X] := D[X]/E[X]
• k-th moment: µX(k) := E[Xk]
• Let X1, …, Xn be IID random variables with mean µ and variance σ2, and let X̄n := (1/n) ∑i Xi denote their sample mean. Then
E[X̄n] = µ
D2[X̄n] = σ2/n
D[X̄n] = σ/√n
• Law of large numbers (LLN): for any ε > 0, as n → ∞,
P{|X̄n − µ| > ε} → 0
that is, the sample mean converges to µ in probability:
X̄n → µ
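The law of large numbers can be illustrated by simulation; the fair-die model and sample size below are illustrative choices:

```python
import random

# Sample mean of IID fair-die throws (mu = 3.5): by the LLN it
# concentrates around 3.5 as the number of throws grows.
random.seed(1)                                  # reproducible run
throws = [random.randint(1, 6) for _ in range(100_000)]
sample_mean = sum(throws) / len(throws)         # close to 3.5
```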
Discrete distributions
Bernoulli distribution
X ∼ Bernoulli(p), p ∈ (0,1)
– outcome of a simple random experiment with success probability p
• Value set: SX = {0,1}
• Point probabilities: P{X = 1} = p, P{X = 0} = 1 − p
• Mean value: E[X] = p
• Variance: D2[X] = E[X2] − E[X]2 = p(1 − p)
Binomial distribution
X ∼ Bin(n, p), n ∈ {1,2,…}, p ∈ (0,1)
– number of successes X = X1 + … + Xn in n independent Bernoulli(p) experiments
• Value set: SX = {0,1,…,n}
• Point probabilities:
P{X = i} = C(n, i) p^i (1 − p)^(n−i), where C(n, i) = n!/(i!(n − i)!)
• Mean value: E[X] = E[X1] + … + E[Xn] = np
• Variance: D2[X] = D2[X1] + … + D2[Xn] = np(1 − p) (independence!)
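The binomial point probabilities and the mean np can be checked numerically (n and p below are illustrative choices):

```python
from math import comb

# Binomial point probabilities P{X = i} = C(n, i) p^i (1 - p)^(n - i).
n, p = 10, 0.3
pmf = [comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(n + 1)]

total = sum(pmf)                              # equals 1
mean = sum(i * q for i, q in enumerate(pmf))  # equals n * p = 3.0
```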
Geometric distribution
X ∼ Geom( p ), p ∈ (0,1)
– number of successes until the first failure in an independent series of simple
random experiments (of Bernoulli type)
– p = probability of success in any single experiment
• Value set: SX = {0,1,…}
• Point probabilities:
P{X = i} = p^i (1 − p)
• Mean value: E[X] = ∑i i p^i (1 − p) = p/(1 − p)
• Second moment: E[X2] = ∑i i^2 p^i (1 − p) = 2(p/(1 − p))^2 + p/(1 − p)
• Variance: D2[X] = E[X2] − E[X]2 = p/(1 − p)^2
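The geometric mean and variance formulas can be checked by truncating the defining series (p and the truncation bound are illustrative choices):

```python
# Truncated series for the Geom(p) moments; the tail beyond i = 1000 is
# negligible for p = 0.4.
p = 0.4
mean = sum(i * p ** i * (1 - p) for i in range(1000))        # p / (1 - p)
second = sum(i * i * p ** i * (1 - p) for i in range(1000))  # E[X^2]
variance = second - mean ** 2                                # p / (1 - p)^2
```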
• Memoryless property: for all i, j ∈ {0,1,…},
P{X ≥ i + j | X ≥ i} = P{X ≥ j}
• Prove!
– Tip: Prove first that P{X ≥ i} = pi
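Using P{X ≥ i} = p^i, the memoryless property reduces to an identity of powers; a quick numeric check (the values of p, i and j are arbitrary illustrative choices):

```python
# P{X >= i + j | X >= i} = P{X >= i + j} / P{X >= i} = p^j = P{X >= j},
# using the identity P{X >= i} = p^i for X ~ Geom(p).
p, i, j = 0.35, 4, 7
lhs = p ** (i + j) / p ** i   # conditional probability
rhs = p ** j                  # unconditional probability
```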
• Let X1 ∼ Geom(p1) and X2 ∼ Geom(p2) be independent, and let Xmin := min{X1, X2}. Then
P{Xmin = Xi} = (1 − pi)/(1 − p1p2), i ∈ {1,2}
• Prove!
– Tip: See slide 15
Poisson distribution
X ∼ Poisson (a ), a > 0
– limit of binomial distribution as n → ∞ and p → 0 in such a way that np → a
• Value set: SX = {0,1,…}
• Point probabilities:
P{X = i} = (a^i / i!) e^(−a)
• Mean value: E[X] = a
• Second moment: E[X(X −1)] = a2 ⇒ E[X2] = a2 + a
• Variance: D2[X] = E[X2] − E[X]2 = a
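The Poisson mean and variance can be checked from the point probabilities (the value a = 2.0 and the truncation at i = 50 are illustrative choices):

```python
from math import exp, factorial

# Poisson(a) point probabilities P{X = i} = (a^i / i!) e^(-a); terms
# beyond i = 50 are negligible for a = 2.0.
a = 2.0
pmf = [a ** i * exp(-a) / factorial(i) for i in range(50)]
mean = sum(i * q for i, q in enumerate(pmf))                      # equals a
variance = sum(i * i * q for i, q in enumerate(pmf)) - mean ** 2  # also a
```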
Example
• Assume that
– 200 subscribers are connected to a local exchange
– each subscriber’s characteristic traffic is 0.01 erlang
– subscribers behave independently
• Then the number of active calls X ∼ Bin(200,0.01)
• Corresponding Poisson-approximation X ≈ Poisson(2.0)
• Point probabilities (figure: bar chart comparing the Bin(200, 0.01) and Poisson(2.0) point probabilities for i = 0, 1, …, 5)
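The quality of the approximation in this example can be checked by computing both sets of point probabilities:

```python
from math import comb, exp, factorial

# Exact Bin(200, 0.01) point probabilities versus the Poisson(2.0)
# approximation for i = 0, ..., 5 (the range shown in the figure above).
n, p, a = 200, 0.01, 2.0
binom = [comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(6)]
poisson = [a ** i * exp(-a) / factorial(i) for i in range(6)]
max_gap = max(abs(b - q) for b, q in zip(binom, poisson))   # below 0.01
```

The largest pointwise difference is of the order 0.001, which is why the approximation is useful here.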
Properties
• (i) Sum: let X1 ∼ Poisson(a1) and X2 ∼ Poisson(a2) be independent. Then
X1 + X2 ∼ Poisson(a1 + a2)
• (ii) Random sample: Let X ∼ Poisson(a) denote the number of
elements in a set, and Y denote the size of a random sample of this set
(each element taken independently with probability p). Then
Y ∼ Poisson ( pa )
• (iii) Random sorting: let X and Y be as in (ii), and Z = X − Y. Then Y and Z are independent (given that X is unknown), with Y ∼ Poisson(pa) and Z ∼ Poisson((1 − p)a)
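Property (ii) can be sketched by simulation; the Poisson sampler below uses Knuth's product method, and the values of a, p and the number of runs are illustrative choices:

```python
import math
import random

# Thinning a Poisson(a) count: keep each of the X elements independently
# with probability p; the sample size Y then behaves like Poisson(p*a).
random.seed(2)
a, p, runs = 5.0, 0.4, 20_000

def poisson_sample(a, rng):
    # Knuth's method: multiply uniforms until the product drops below e^-a.
    limit, k, prod = math.exp(-a), 0, rng.random()
    while prod > limit:
        prod *= rng.random()
        k += 1
    return k

ys = []
for _ in range(runs):
    x = poisson_sample(a, random)
    ys.append(sum(random.random() < p for _ in range(x)))

mean_y = sum(ys) / runs                          # close to p * a = 2.0
var_y = sum((y - mean_y) ** 2 for y in ys) / runs  # also close to 2.0
```

That the empirical mean and variance of Y agree (both near pa) is consistent with Y being Poisson distributed.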
Continuous random variables
Example
– Figure: probability density function (pdf) fX(x) and cumulative distribution function (cdf) FX(x) of a continuous random variable with value set SX = [x1, x3]
Continuous distributions
Uniform distribution
X ∼ U(a, b), a < b
• Value set: SX = (a, b)
• Probability density function (pdf):
fX(x) = 1/(b − a), x ∈ (a, b)
• Cumulative distribution function (cdf):
FX(x) := P{X ≤ x} = (x − a)/(b − a), x ∈ (a, b)
• Mean value: E[X] = ∫ab x/(b − a) dx = (a + b)/2
• Second moment: E[X2] = ∫ab x2/(b − a) dx = (a2 + ab + b2)/3
• Variance: D2[X] = E[X2] − E[X]2 = (b − a)2/12
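The moment integrals of U(a, b) can be checked numerically with a midpoint rule (the interval and grid size are illustrative choices):

```python
# Midpoint-rule check of E[X] = (a + b)/2 and D2[X] = (b - a)^2 / 12
# for X ~ U(a, b).
a, b = 1.0, 4.0
n = 100_000
dx = (b - a) / n

mean = 0.0
second = 0.0
for i in range(n):
    x = a + (i + 0.5) * dx            # midpoint of the i-th subinterval
    mean += x * dx / (b - a)          # integral of x / (b - a)
    second += x * x * dx / (b - a)    # integral of x^2 / (b - a)
variance = second - mean ** 2         # (b - a)^2 / 12 = 0.75
```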
Exponential distribution
X ∼ Exp(λ ), λ > 0
– continuous counterpart of geometric distribution (“failure” prob. ≈ λdt)
• Value set: SX = (0,∞)
• Probability density function (pdf):
f X ( x) = λe −λx , x > 0
• Cumulative distribution function (cdf):
FX ( x) = P{ X ≤ x} = 1 − e −λx , x > 0
• Mean value: E[X] = ∫0∞ λx exp(−λx) dx = 1/λ
• Second moment: E[X2] = ∫0∞ λx2 exp(−λx) dx = 2/λ2
• Variance: D2[X] = E[X2] − E[X]2 = 1/λ2
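The Exp(λ) mean and second moment can be checked by numerically integrating the pdf (the value of λ, the grid and the truncation point are illustrative choices):

```python
from math import exp

# Midpoint-rule check of E[X] = 1/lambda and E[X^2] = 2/lambda^2 for
# X ~ Exp(lambda); the integral is truncated at x = 60.
lam = 0.5
dx, upper = 1e-3, 60.0

mean = 0.0
second = 0.0
for i in range(int(upper / dx)):
    x = (i + 0.5) * dx
    w = lam * exp(-lam * x) * dx      # f_X(x) dx
    mean += x * w
    second += x * x * w
```

For λ = 0.5 this gives E[X] ≈ 2 and E[X2] ≈ 8, matching 1/λ and 2/λ2.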
• Memoryless property: P{X > x + y | X > x} = P{X > y} for all x, y > 0
– Prove!
• Tip: Prove first that P{X > x} = e−λx
• Application:
– Assume that the call holding time is exponentially distributed with mean h (min).
– Consider a call that has already lasted for x minutes. Due to the memoryless property, this gives no information about the length of the remaining holding time: it is distributed as the original holding time and, on average, lasts another h minutes!
• Let X1 ∼ Exp(λ1) and X2 ∼ Exp(λ2) be independent, and let Xmin := min{X1, X2}. Then
Xmin ∼ Exp(λ1 + λ2)
• Prove!
– Tip: See slide 15
Standard normal distribution
X ∼ N(0,1)
– limit of the “normalized” sum of IID r.v.s with mean 0 and variance 1 (cf.
slide 48)
• Value set: SX = (−∞,∞)
• Probability density function (pdf):
fX(x) = ϕ(x) := (1/√(2π)) e^(−x^2/2)
• Cumulative distribution function (cdf):
FX(x) := P{X ≤ x} = Φ(x) := ∫_−∞^x ϕ(y) dy
• Mean value: E[X] = 0 (symmetric pdf)
• Variance: D2[X] = 1
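The cdf Φ has no closed form, but it can be evaluated with the error function from the standard library, via the standard relation Φ(x) = (1 + erf(x/√2))/2:

```python
from math import erf, exp, pi, sqrt

# Standard normal pdf and cdf: phi(x) = e^(-x^2/2) / sqrt(2*pi) and
# Phi(x) = (1 + erf(x / sqrt(2))) / 2.
def phi(x):
    return exp(-0.5 * x * x) / sqrt(2 * pi)

def Phi(x):
    return (1 + erf(x / sqrt(2))) / 2
```

For example, Phi(0) = 0.5 by symmetry, and Phi(1.96) ≈ 0.975.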
Normal distribution
X ∼ N(µ, σ2), µ ∈ ℜ, σ > 0
– if (X − µ)/σ ∼ N(0,1)
• Value set: SX = (−∞,∞)
• Probability density function (pdf):
fX(x) = FX′(x) = (1/σ) ϕ((x − µ)/σ)
• Cumulative distribution function (cdf):
FX(x) := P{X ≤ x} = P{(X − µ)/σ ≤ (x − µ)/σ} = Φ((x − µ)/σ)
• Mean value: E[X] = µ + σE[(X − µ)/σ] = µ (symmetric pdf around µ)
• Variance: D2[X] = σ2D2[(X − µ)/σ] = σ2
• (ii) Sum: let X1 ∼ N(µ1, σ1^2) and X2 ∼ N(µ2, σ2^2) be independent. Then
X1 + X2 ∼ N(µ1 + µ2, σ1^2 + σ2^2)
• (iii) Sample mean: Let Xi ∼ N(µ,σ2), i = 1,…n, be independent and
identically distributed (IID). Then
X̄n := (1/n) ∑i=1^n Xi ∼ N(µ, (1/n)σ2)
• Central limit theorem (CLT): let X1, …, Xn be IID random variables with mean µ and variance σ2. Then, for large n,
X̄n ≈ N(µ, (1/n)σ2)
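The normal approximation of the sample mean can be sketched by simulation; the Uniform(0,1) summands, n = 48 and the run count are illustrative choices:

```python
import random

# If the approximation holds, about 95% of sample means of n IID
# Uniform(0,1) variables fall within 1.96 * sigma / sqrt(n) of mu = 0.5.
random.seed(3)
n, runs = 48, 10_000
mu, sigma = 0.5, (1 / 12) ** 0.5          # Uniform(0,1): variance 1/12
half_width = 1.96 * sigma / n ** 0.5

hits = sum(
    abs(sum(random.random() for _ in range(n)) / n - mu) < half_width
    for _ in range(runs)
)
coverage = hits / runs                     # close to 0.95
```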
Other random variables
– Figure: cdf FW(x) of a random variable W that is neither discrete nor continuous: a jump of height 1 − ρ at x = 0, followed by a continuous increase towards 1 as x grows