Probability II HW9 Solutions
Enrique Areyan
March 27, 2014
Chapter 5
Exercises
3.7 Customers arrive at a service facility according to a Poisson process of rate λ customers/hour. Let X(t) be the number of
customers that have arrived up to time t. Let W1 , W2 , . . . be the successive arrival times of the customers. Determine the
conditional mean E[W5 |X(t) = 3].
Solution: We know that up to time t we had 3 customers. Let us compute the density function of W5 |X(t) = 3 by
first computing its cumulative distribution. Let u > t. Then Pr{W5 ≤ u | X(t) = 3} is the probability that 5 or more customers have arrived by time u, given that 3 had arrived by time t. This is the same as having 2 or more arrivals between times t and u, i.e., Pr{W5 ≤ u | X(t) = 3} = Pr{X(u) − X(t) ≥ 2}.
Since X(t) is a Poisson process with independent increments, X(u) − X(t) ∼ Pois((u − t)λ), so we can compute this probability:
$$
\Pr\{W_5 \le u \mid X(t) = 3\} = 1 - \frac{e^{-(u-t)\lambda}\bigl((u-t)\lambda\bigr)^0}{0!} - \frac{e^{-(u-t)\lambda}\bigl((u-t)\lambda\bigr)^1}{1!} \qquad \text{(Poisson p.m.f.)}
$$
Hence, the c.d.f. of W5 | X(t) = 3 is F_{W_5}(u) = 1 − e^{−(u−t)λ} − e^{−(u−t)λ}(u − t)λ for u > t, which means that the p.d.f. is the derivative of this function with respect to u:
$$
f_{W_5}(u) = \frac{d}{du} F_{W_5}(u) = \lambda e^{-(u-t)\lambda} - \lambda e^{-(u-t)\lambda} + (u-t)\lambda^2 e^{-(u-t)\lambda} = (u-t)\lambda^2 e^{-(u-t)\lambda}
$$
Finally, we can compute the expectation of this random variable. By definition:
$$
\begin{aligned}
E[W_5 \mid X(t) = 3] &= \int_t^{\infty} u\, f_{W_5}(u)\, du && \text{definition of expectation} \\
&= \int_t^{\infty} u\,(u-t)\lambda^2 e^{-(u-t)\lambda}\, du && \text{replacing the p.d.f. computed above} \\
&= \int_0^{\infty} (v+t)\, v\,\lambda^2 e^{-v\lambda}\, dv && \text{change of variables } v = u - t \\
&= \lambda^2 \left( \int_0^{\infty} v^2 e^{-v\lambda}\, dv + t \int_0^{\infty} v\, e^{-v\lambda}\, dv \right) && \text{linearity of the integral} \\
&= \lambda^2 \left( \left[ \frac{e^{-v\lambda}\bigl(-\lambda v(\lambda v + 2) - 2\bigr)}{\lambda^3} \right]_0^{\infty} + t \left[ \frac{-e^{-v\lambda}(\lambda v + 1)}{\lambda^2} \right]_0^{\infty} \right) && \text{antiderivatives}
\end{aligned}
$$
Since $\lim_{v \to \infty} e^{-v\lambda} = 0$ and $e^{-\lambda \cdot 0} = 1$, we have:
$$
E[W_5 \mid X(t) = 3] = \lambda^2 \left( \frac{2}{\lambda^3} + \frac{t}{\lambda^2} \right) = \frac{2}{\lambda} + t = t + \frac{2}{\lambda}
$$
Note that this result makes intuitive sense: if λ → 0, then we would have to wait arbitrarily long past t for the 5th customer to arrive. Likewise, if λ → ∞, then we would wait only an infinitesimal amount of time past t. Lastly, if λ = 2, then we would expect to wait exactly one unit of time past t for the 2 remaining customers to arrive and thus receive the 5th customer.
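As a quick sanity check of the result E[W5 | X(t) = 3] = t + 2/λ, here is a small Monte Carlo sketch in Python; the values λ = 1.5 and t = 2 are arbitrary illustration choices, not part of the problem.

```python
import numpy as np

# Simulate Poisson arrivals, keep only trials with exactly 3 arrivals in [0, t],
# and average the 5th arrival time over those trials.
rng = np.random.default_rng(0)
lam, t, n_trials = 1.5, 2.0, 200_000

gaps = rng.exponential(1 / lam, size=(n_trials, 50))   # interarrival times
arrivals = np.cumsum(gaps, axis=1)                     # W1, W2, ..., W50 for each trial
keep = np.sum(arrivals <= t, axis=1) == 3              # condition on X(t) = 3
w5 = arrivals[keep, 4]                                 # 5th arrival time in the kept trials

print("simulated E[W5 | X(t)=3]:", w5.mean())
print("formula   t + 2/lambda  :", t + 2 / lam)
```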
4.1 Let {X(t); t ≥ 0} be a Poisson process of rate λ. Suppose it is known that X(1) = n. For n = 1, 2, . . . , determine the
mean of the first arrival time W1 .
Solution: Let U1, U2, . . . , Un ∼ Uniform((0, 1]). Given that X(1) = n, the Ui's represent the Wi's but ignoring order. Now we want to find the mean of W1, i.e., the first arrival time. In terms of the Ui's we have W1 = U(1), where U(1) = min{U1, U2, . . . , Un}. Finding the distribution of U(1) is relatively easy: for v ∈ (0, 1], by independence of the Ui's,
$$
\Pr\{U_{(1)} > v\} = \Pr\{U_1 > v, \ldots, U_n > v\} = (1 - v)^n.
$$
Hence the c.d.f. of W1 is F_{W_1}(v) = 1 − (1 − v)^n, and thus the p.d.f. is f_{W_1}(v) = \frac{d}{dv} F_{W_1}(v) = n(1 − v)^{n−1}. The mean is:
$$
E[W_1 \mid X(1) = n] = \int_0^1 v\, f_{W_1}(v)\, dv = \int_0^1 v \cdot n(1 - v)^{n-1}\, dv = \frac{1}{n+1}
$$
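A quick Monte Carlo sketch of the identity E[W1 | X(1) = n] = 1/(n + 1): given X(1) = n, W1 is distributed as the minimum of n independent Uniform(0, 1] variables, so we can simulate that minimum directly.

```python
import numpy as np

# Compare the simulated mean of the minimum of n uniforms with 1/(n+1).
rng = np.random.default_rng(1)
for n in (1, 2, 5, 10):
    w1 = rng.uniform(size=(200_000, n)).min(axis=1)
    print(f"n={n}: simulated {w1.mean():.4f}, formula {1 / (n + 1):.4f}")
```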
4.2 Let {X(t); t ≥ 0} be a Poisson process of rate λ. Suppose it is known that X(1) = 2. Determine the mean of W1 W2 , the
product of the first two arrival times.
Solution: Let U1, U2 ∼ Uniform((0, 1]). Since we know that X(1) = 2, the Ui's represent the Wi's but ignoring order. Since the product of two real numbers is commutative, we have that U1 U2 = W1 W2, i.e., it does not matter whether we multiply the ordered or the unordered random variables; the result is the same. Computing expected values of the Ui's is easy:
$$
E[U_1] = E[U_2] = \int_0^1 u\, du = \left.\frac{u^2}{2}\right|_0^1 = \frac{1}{2}
$$
Finally, since U1 is independent of U2:
$$
E[W_1 W_2 \mid X(1) = 2] = E[U_1 U_2] = E[U_1]\, E[U_2] = \frac{1}{2} \cdot \frac{1}{2} = \frac{1}{4}
$$
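A short simulation sketch of this result: given X(1) = 2, the two arrival times are two i.i.d. Uniform(0, 1] points, and ordering does not affect their product.

```python
import numpy as np

# Average the product of two independent uniforms; should be close to 1/4.
rng = np.random.default_rng(2)
u = rng.uniform(size=(500_000, 2))
print("simulated:", u.prod(axis=1).mean(), " formula:", 0.25)
```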
4.5 Customers arrive at a certain facility according to a Poisson process of rate λ. Suppose that it is known that five customers
arrived in the first hour. Each customer spends a time in the store that is a random variable, exponentially distributed
with parameter α and independent of the other customer times, and then departs. What is the probability that the store
is empty at the end of this first hour?
Solution: Mathematically, this problem models the same situation as that of decaying particles worked in class. In
this case let us interpret a particle being alive at time t as a customer being in the store at time t. Hence, the probability
of a customer being at the store at time t is given by:
$$
p = 1 - \frac{1}{t} \int_0^t G(v)\, dv
$$
In this case the time each customer spends in the store is exponentially distributed, i.e., G(v) = 1 − e^{−αv}. Evaluating p at time t = 1:
$$
p = 1 - \frac{1}{1} \int_0^1 \bigl(1 - e^{-\alpha v}\bigr)\, dv = 1 - \left[ v + \frac{e^{-\alpha v}}{\alpha} \right]_0^1 = 1 - \left( 1 + \frac{e^{-\alpha}}{\alpha} - \frac{1}{\alpha} \right) = \frac{1 - e^{-\alpha}}{\alpha}
$$
The complement of this probability is the probability that the customer is not at the store at time t:
$$
1 - p = 1 - \frac{1 - e^{-\alpha}}{\alpha}
$$
Since each customer spends a time in the store that is independent of the other customers, the probability that the store is empty at the end of the first hour is the product of these probabilities, i.e.,
$$
\Pr\{\text{store empty at end of first hour}\} = \prod_{i=1}^{5} \Pr\{\text{customer } i \text{ leaves before the end of the first hour}\} = \left( 1 - \frac{1 - e^{-\alpha}}{\alpha} \right)^5
$$
Problems
3.7 A critical component on a submarine has an operating lifetime that is exponentially distributed with mean 0.50 years.
As soon as a component fails, it is replaced by a new one having statistically identical properties. What is the smallest
number of spare components that the submarine should stock if it is leaving for a one-year tour and wishes the probability
of having an inoperable unit caused by failures exceeding the spare inventory to be less than 0.02?
Solution: Let X(t) = number of component failures up to time t (equivalently, the number of component replacements, since a component is replaced as soon as it fails). By Theorem 3.2, X(t) is a Poisson process with rate
$$
\lambda = \frac{1 \text{ failure}}{1/2 \text{ year}} = 2 \text{ failures per year}, \qquad \text{so } X(t) \sim \text{Pois}(2t).
$$
We are interested in one year, so we use X(1) ∼ Pois(2). With n spare components on board, the submarine is left with an inoperable unit only if the number of failures during the year exceeds n, so we want the smallest n such that
$$
\Pr\{X(1) > n\} = 1 - \sum_{k=0}^{n} \frac{e^{-2}\, 2^k}{k!} < 0.02.
$$
Computing this tail probability for successive values of n:
$$
n = 4 \implies \Pr\{X(1) > 4\} \approx 0.0527 > 0.02, \qquad n = 5 \implies \Pr\{X(1) > 5\} \approx 0.0166 < 0.02.
$$
Hence the submarine should stock 5 spare components.
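A small sketch to verify the tail probabilities used above (X(1) ∼ Pois(2)):

```python
from math import exp, factorial

# Pr{X(1) > n} for X(1) ~ Pois(2); the smallest n with tail < 0.02 is n = 5.
lam = 2.0

def tail(n):
    return 1 - sum(exp(-lam) * lam**k / factorial(k) for k in range(n + 1))

for n in range(3, 8):
    print(n, round(tail(n), 5))
```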
That is, the amplitude impressed on the detector when the pulse arrives is ξk , and its effect thereafter decays exponentially
at rate α. Assume that the detector is additive, so that if N (t) pulses arrive during the time interval [0, t], then the output
at time t is
$$
Z(t) = \sum_{k=1}^{N(t)} \theta_k(t)
$$
Determine the mean output E[Z(t)] assuming N (0) = 0. Assume that the amplitudes ξ1 , ξ2 , . . . are independent of the
arrival times W1 , W2 , . . .
Solution:
$$
\begin{aligned}
E[Z(t)] &= E\left[ \sum_{k=1}^{N(t)} \theta_k(t) \right] && \text{by definition of } Z(t) \\
&= \sum_{n=1}^{\infty} E\left[ \sum_{k=1}^{n} \theta_k(t) \,\middle|\, N(t) = n \right] \Pr\{N(t) = n\} && \text{law of total expectation}
\end{aligned}
$$
Let us compute, for a fixed n, the conditional expectation inside the sum. Here U1, . . . , Un denote independent random variables uniformly distributed on [0, t]:
$$
\begin{aligned}
E\left[ \sum_{k=1}^{n} \theta_k(t) \,\middle|\, N(t) = n \right]
&= E\left[ \sum_{k=1}^{n} \xi_k e^{-\alpha(t - W_k)} \,\middle|\, N(t) = n \right] && \text{definition of } \theta_k(t) \\
&= E\left[ \sum_{k=1}^{n} \xi_k e^{-\alpha(t - U_k)} \right] && \text{order does not matter in the sum, and Theorem 4.1} \\
&= n\, E[\xi_1]\, \frac{1}{t} \int_0^t e^{-\alpha(t - u)}\, du && \text{law of the unconscious statistician} \\
&= n\, E[\xi_1]\, \frac{e^{-\alpha t}}{t} \int_0^t e^{\alpha u}\, du && \text{taking constants out of the integral} \\
&= n\, E[\xi_1]\, \frac{e^{-\alpha t}}{t} \left[ \frac{e^{\alpha u}}{\alpha} \right]_0^t && \text{integrating} \\
&= n\, E[\xi_1]\, \frac{e^{-\alpha t}}{t} \cdot \frac{e^{\alpha t} - 1}{\alpha} && \text{evaluating the limits} \\
&= \frac{n}{t}\, E[\xi_1]\, \frac{1 - e^{-\alpha t}}{\alpha} && \text{algebra}
\end{aligned}
$$
Substituting this back into the sum over n:
$$
\begin{aligned}
E[Z(t)] &= \frac{E[\xi_1]}{t} \cdot \frac{1 - e^{-\alpha t}}{\alpha} \sum_{n=1}^{\infty} n \Pr\{N(t) = n\} && \text{taking constants out of the sum} \\
&= \frac{E[\xi_1]}{t} \cdot \frac{1 - e^{-\alpha t}}{\alpha}\, E[N(t)] && \text{by definition of the expectation of a discrete r.v.} \\
&= \frac{E[\xi_1]}{t} \cdot \frac{1 - e^{-\alpha t}}{\alpha}\, \lambda t && \text{since } N(t) \sim \text{Pois}(\lambda t) \\
&= \lambda\, E[\xi_1]\, \frac{1 - e^{-\alpha t}}{\alpha}
\end{aligned}
$$
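A Monte Carlo sketch of the result E[Z(t)] = λ E[ξ1](1 − e^{−αt})/α. The values of λ, α, t and the amplitude distribution (exponential with mean 0.7) are arbitrary illustration choices, not specified by the problem.

```python
import numpy as np

rng = np.random.default_rng(4)
lam, alpha, t, mean_xi, n_trials = 3.0, 1.5, 2.0, 0.7, 200_000
z = np.zeros(n_trials)

for i in range(n_trials):
    n = rng.poisson(lam * t)                      # N(t)
    if n > 0:
        w = rng.uniform(0, t, size=n)             # arrival times; order is irrelevant for the sum
        xi = rng.exponential(mean_xi, size=n)     # amplitudes, independent of the arrival times
        z[i] = np.sum(xi * np.exp(-alpha * (t - w)))

print("simulated E[Z(t)]:", z.mean())
print("formula          :", lam * mean_xi * (1 - np.exp(-alpha * t)) / alpha)
```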