HW 3 Solutions
(Prepared by Yu Xiang)
1. Time until the n-th arrival. Let the random variable N (t) be the number of packets arriving
during time (0, t]. Suppose N (t) is Poisson with pmf
pN(n) = ((λt)^n / n!) e^{−λt},   n = 0, 1, 2, . . . .
Let the random variable Y be the time to get the n-th packet. Find the pdf of Y .
Solution: To find the pdf fY (t) of the random variable Y , note that the event {Y ≤ t}
occurs iff the time of the nth packet is in [0, t], that is, iff the number N (t) of packets arriving
in [0, t] is at least n. Alternatively, {Y > t} occurs iff {N (t) < n}. Hence, the cdf FY (t) of Y
is given by
FY(t) = P{Y ≤ t} = P{N(t) ≥ n} = ∑_{k=n}^{∞} ((λt)^k / k!) e^{−λt},   t ≥ 0.

Differentiating with respect to t, the sum telescopes and we obtain the pdf

fY(t) = (λ^n t^{n−1} / (n−1)!) e^{−λt},   t ≥ 0,

which is the Erlang (gamma) pdf of order n.
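As a quick numerical check of the Erlang pdf derived above, one can simulate the n-th arrival time as a sum of n independent Exp(λ) interarrival times; the values n = 3, λ = 2 and t = 1.2 below are arbitrary illustrative choices.

```python
import numpy as np
from math import exp, factorial

n, lam, t, trials = 3, 2.0, 1.2, 200_000   # arbitrary illustrative parameters
rng = np.random.default_rng(0)

# The n-th arrival time is the sum of n independent Exp(lam) interarrival times.
y = rng.exponential(scale=1 / lam, size=(trials, n)).sum(axis=1)

# Empirical P{Y <= t} versus the cdf from the solution, P{N(t) >= n}.
cdf_formula = 1.0 - sum(exp(-lam * t) * (lam * t) ** k / factorial(k) for k in range(n))
print(np.mean(y <= t), cdf_formula)

# The Erlang pdf obtained by differentiating: lam^n t^(n-1) e^(-lam t) / (n-1)!
pdf_at_t = lam**n * t ** (n - 1) * exp(-lam * t) / factorial(n - 1)
print(pdf_at_t)  # printed for reference; it is the derivative of the cdf at t
```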
2. Diamond distribution. Consider the random variables X and Y with the joint pdf
fX,Y(x, y) =  c    if |x| + |y| ≤ 1/√2,
              0    otherwise,
where c is a constant.
(a) Find c.
(b) Find fX (x) and fX|Y (x|y).
(c) Are X and Y independent random variables? Justify your answer.
Solution:
(a) The integral of the pdf fX,Y(x, y) over −∞ < x < ∞, −∞ < y < ∞ equals c times the area of the diamond {|x| + |y| ≤ 1/√2}, which is a square of side length 1 (rotated by 45°) and hence has area 1. Therefore, by the definition of a joint density, c = 1.
(b) The marginal pdf is obtained by integrating the joint pdf with respect to y. For 0 ≤ x ≤ 1/√2,

fX(x) = c ∫_{−1/√2 + x}^{1/√2 − x} dy = 2(1/√2 − x),

and for −1/√2 ≤ x ≤ 0,

fX(x) = c ∫_{−1/√2 − x}^{1/√2 + x} dy = 2(1/√2 + x).

Combining the two cases, fX(x) = 2(1/√2 − |x|) for |x| ≤ 1/√2 and fX(x) = 0 otherwise; by symmetry fY(y) has the same form. The conditional pdf is

fX|Y(x|y) = fX,Y(x, y) / fY(y)
          =  1/(√2 − 2|y|)   if |x| + |y| ≤ 1/√2 and |y| ≤ 1/√2,
             0               otherwise.
(c) No, X and Y are not independent: the conditional pdf fX|Y(x|y) depends on the value of y, so fX|Y(x|y) ≠ fX(x) and hence fX,Y(x, y) ≠ fX(x) fY(y).
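A short numerical sketch can confirm that the diamond has unit area (so c = 1) and that the marginal 2(1/√2 − |x|) integrates to 1.

```python
import numpy as np

rng = np.random.default_rng(1)
r = 1 / np.sqrt(2)
N = 1_000_000

# Monte Carlo estimate of the area of the diamond {|x| + |y| <= 1/sqrt(2)}.
x = rng.uniform(-r, r, N)
y = rng.uniform(-r, r, N)
area = (2 * r) ** 2 * np.mean(np.abs(x) + np.abs(y) <= r)
print(area)  # close to 1, so c = 1

# The marginal f_X(x) = 2(1/sqrt(2) - |x|) should integrate to 1 over [-r, r].
xs = np.linspace(-r, r, 10_001)
print(np.sum(2 * (r - np.abs(xs))) * (xs[1] - xs[0]))  # close to 1
```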
3. First available teller. Consider a bank with two tellers. The service times for the tellers are
independent exponentially distributed random variables X1 ∼ Exp(λ1 ) and X2 ∼ Exp(λ2 )
respectively. You arrive at the bank and find that both tellers are busy but that nobody else
is waiting to be served. You are served by the first available teller once he/she is free. What
is the probability that you are served by the first teller?
Solution: From the memoryless property of the exponential distribution, the remaining
services for the tellers are also independent exponentially distributed random variables with
parameters λ1 and λ2 , respectively. The probability that you will be served by the first teller
is the probability that the first teller finishes the service before the second teller does. Thus,
P{X1 < X2} = ∫∫_{x2 > x1} fX1,X2(x1, x2) dx2 dx1
           = ∫_{x1=0}^{∞} ∫_{x2=x1}^{∞} λ1 e^{−λ1 x1} λ2 e^{−λ2 x2} dx2 dx1
           = ∫_{x1=0}^{∞} λ1 e^{−λ1 x1} e^{−λ2 x1} dx1
           = λ1 / (λ1 + λ2).
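For illustration, a small simulation reproduces P{X1 < X2} = λ1/(λ1 + λ2); the rates 1.0 and 2.5 are arbitrary choices.

```python
import numpy as np

lam1, lam2, trials = 1.0, 2.5, 1_000_000  # arbitrary example rates
rng = np.random.default_rng(1)

x1 = rng.exponential(scale=1 / lam1, size=trials)  # remaining service of teller 1
x2 = rng.exponential(scale=1 / lam2, size=trials)  # remaining service of teller 2

print(np.mean(x1 < x2))          # empirical P{X1 < X2}
print(lam1 / (lam1 + lam2))      # closed form from the solution
```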
4. Coin with random bias. You are given a coin but are not told what its bias (probability
of heads) is. You are told instead that the bias is the outcome of a random variable P ∼
Unif[0, 1]. To get more information about the coin bias, you flip it independently 10 times.
Let X be the number of heads you get. Thus X ∼ B(10, P ). Assuming that X = 9, find and
sketch the a posteriori probability of P , i.e., fP |X (p|9).
Solution: In order to find the conditional pdf of P, apply Bayes’ rule for mixed random
variables to get
fP|X(p|9) = p^9 (1 − p) / ∫_0^1 p^9 (1 − p) dp
          = p^9 (1 − p) / (1/110)
          = 110 p^9 (1 − p).
Figure 1 compares the unconditional and the conditional pdfs for P. It may be seen that, given the information that 10 independent tosses resulted in 9 heads, the pdf is shifted towards the value 9/10.
Figure 1: The prior pdf fP(p) and the a posteriori pdf fP|X(p|9).
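A few lines of code confirm that 110 p^9 (1 − p) is a valid pdf with its mode at p = 9/10, consistent with the sketch in Figure 1.

```python
import numpy as np

p = np.linspace(0, 1, 100_001)
post = 110 * p**9 * (1 - p)          # conditional pdf f_{P|X}(p|9) from the solution

print(np.sum(post) * (p[1] - p[0]))  # integrates to about 1
print(p[np.argmax(post)])            # mode at p = 0.9
```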
5. Optical communication channel. Let the signal input to an optical channel be:
X =  λ0   with probability 1/2,
     λ1   with probability 1/2.
The conditional pmf of the output of the channel Y |{X = λ0 } ∼ Poisson(λ0 ), i.e., Poisson
with intensity λ0 and Y |{X = λ1 } ∼ Poisson(λ1 ).
Show that the MAP rule, which minimizes the probability of decoding error, reduces to the threshold rule

D(y) =  λ0,   y < y*,
        λ1,   otherwise,

and find the threshold y* and the probability of error when λ0 = 1, λ1 = 2 and when λ0 = 1, λ1 = 100.

Solution: Since the a priori probabilities for the two X values are equal, the MAP rule is equivalent to the ML rule
D(y) =  λ0   if pY|X(y|λ0) / pY|X(y|λ1) > 1,
        λ1   otherwise.
Now,

pY|X(y|λ0) / pY|X(y|λ1) = (λ0^y e^{−λ0} / y!) / (λ1^y e^{−λ1} / y!) = e^{λ1 − λ0} (λ0/λ1)^y.

This ratio is greater than 1 if and only if y < (λ1 − λ0) / (ln(λ1) − ln(λ0)). Therefore, when λ0 = 1 and λ1 = 2, we have
D(y) =  1,   y < 1/ln(2),
        2,   otherwise,

and

y* = 1/ln(2) ≈ 1.44.
Pe = P{D(Y) ≠ X}
   = P{Y > y* | X = 1} P{X = 1} + P{Y < y* | X = 2} P{X = 2}
   = ∑_{y=2}^{∞} (e^{−1} / y!) × 0.5 + ∑_{y=0}^{1} (e^{−2} 2^y / y!) × 0.5
   = 0.335.
Similarly, when λ0 = 1 and λ1 = 100,

y* = (100 − 1) / (ln(100) − ln(1)) = 99/ln(100) ≈ 21.497.
The probability of error is

Pe = P{D(Y) ≠ X}
   = ∑_{y=22}^{∞} (e^{−1} / y!) × 0.5 + ∑_{y=0}^{21} (e^{−100} (100)^y / y!) × 0.5
   ≤ 0.5 ( P{Y ≥ 22 | X = 1} + 22 × e^{−100} (100)^{21} / 21! )
   ≈ 0.5 ( 1/22 + 1.6 × 10^{−20} )
   ≲ 0.025,

where the second sum is bounded by 22 times its largest term (the Poisson(100) pmf is increasing below its mode, so the largest of the 22 terms is the one at y = 21), and the first term is then bounded by the Markov inequality, P{Y ≥ 22 | X = 1} ≤ E[Y | X = 1]/22 = 1/22. This can be further tightened using Chebyshev's inequality instead:

Pe ≤ 0.5 ( P{|Y − 1| ≥ 21 | X = 1} + 22 × e^{−100} (100)^{21} / 21! ) ≈ 0.5 ( 1/(21)^2 + 1.6 × 10^{−20} ) ≲ 0.0011.
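The thresholds and error probabilities above can be cross-checked by summing the Poisson pmfs directly; the helper error_prob below is only an illustrative sketch of that calculation.

```python
import math

def error_prob(lam0, lam1):
    # Threshold from the likelihood-ratio computation above.
    y_star = (lam1 - lam0) / (math.log(lam1) - math.log(lam0))
    k = math.floor(y_star)  # decide lam0 for y <= k, lam1 for y >= k + 1

    def poisson_pmf(y, lam):
        return math.exp(-lam) * lam**y / math.factorial(y)

    # P{Y > y* | X = lam0}: deciding lam1 although the rate is lam0.
    p_err0 = 1.0 - sum(poisson_pmf(y, lam0) for y in range(k + 1))
    # P{Y < y* | X = lam1}: deciding lam0 although the rate is lam1.
    p_err1 = sum(poisson_pmf(y, lam1) for y in range(k + 1))
    return y_star, 0.5 * p_err0 + 0.5 * p_err1

print(error_prob(1, 2))    # threshold ~1.44, Pe ~0.335
print(error_prob(1, 100))  # threshold ~21.5, Pe far below the 0.0011 bound
```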
6. Iocane or Sennari. Each of the bottles in question contains either "Iocane" or "Sennari," the two contents being equally likely. The measured radioactivity level of a bottle is Unif(0, 1) if it contains Iocane and Exp(1) if it contains Sennari.
(a) Let X be the radioactivity level measured from one of the bottles. What is the optimal
decision rule (based on the measurement X) that maximizes the chance of correctly
identifying the content of the bottle?
(b) What is the associated probability of error?
Solution: Let Θ = 0 denote the case in which the content of the bottle is “Iocane” and let
Θ = 1 denote the case in which the content of the bottle is “Sennari”. Implicit in the problem
statement is that P(Θ = 0) = P(Θ = 1) = 1/2.
(a) Since the Unif(0, 1) pdf fX|Θ(x|0) = 1 is larger than the Exp(1) pdf fX|Θ(x|1) = e^{−x} for 0 < x < 1, we have

D(x) =  0,   0 < x < 1,
        1,   otherwise.

(b) When Θ = 0, the measurement always lies in (0, 1), so the rule is always correct. When Θ = 1, an error occurs iff 0 < X < 1, which has probability 1 − e^{−1}. Hence the probability of error is

Pe = (1/2) · 0 + (1/2)(1 − e^{−1}) ≈ 0.316.
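As an illustrative check of the rule D(x) and the error probability in part (b), a simulation gives an error rate close to (1 − e^{−1})/2 ≈ 0.316.

```python
import numpy as np

rng = np.random.default_rng(2)
trials = 1_000_000

theta = rng.integers(0, 2, trials)                    # 0 = Iocane, 1 = Sennari
x = np.where(theta == 0, rng.uniform(0, 1, trials),   # Unif(0,1) measurements
             rng.exponential(1.0, trials))            # Exp(1) measurements

decision = np.where((x > 0) & (x < 1), 0, 1)          # decision rule D(x) above
print(np.mean(decision != theta))                     # about (1 - exp(-1))/2 ~ 0.316
```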
Solution:
(a) We have
FU(u) = P{U ≤ u}
      = P{max(X, Y) ≤ u}
      = P{X ≤ u, Y ≤ u}
      = P{X ≤ u} P{Y ≤ u}
      = u^2
for 0 ≤ u ≤ 1. Hence,

fU(u) =  2u,   0 ≤ u ≤ 1,
         0,    otherwise.
(b) Similarly,
P{W ≤ w} = P{|X − Y| ≤ w} = P{−w ≤ X − Y ≤ w}.

Since X and Y are uniformly distributed over [0, 1], this probability is equal to the area of the shaded region in the following figure:
(Figure: the unit square in the (x, y) plane with the band {|x − y| ≤ w} shaded.)
The area can be easily calculated as 1 − (1 − w)^2 for 0 ≤ w ≤ 1. Hence FW(w) = 1 − (1 − w)^2 and

fW(w) =  2(1 − w),   0 ≤ w ≤ 1,
         0,          otherwise.
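For illustration, histograms of simulated uniforms can be compared against the pdfs 2u and 2(1 − w) found above.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(0, 1, 1_000_000)
y = rng.uniform(0, 1, 1_000_000)

u = np.maximum(x, y)      # U = max(X, Y), pdf 2u on [0, 1]
w = np.abs(x - y)         # W = |X - Y|, pdf 2(1 - w) on [0, 1]

grid = np.linspace(0.05, 0.95, 10)  # bin centers for 10 equal bins on [0, 1]
hist_u, _ = np.histogram(u, bins=10, range=(0, 1), density=True)
hist_w, _ = np.histogram(w, bins=10, range=(0, 1), density=True)
print(np.max(np.abs(hist_u - 2 * grid)))        # small deviation from 2u
print(np.max(np.abs(hist_w - 2 * (1 - grid))))  # small deviation from 2(1 - w)
```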
(d) From the figure above,
8. Waiting time at the bank. Consider a bank with two tellers. The service times for the
tellers are independent exponentially distributed random variables X1 ∼ Exp(λ1 ) and X2 ∼
Exp(λ2 ), respectively. You arrive at the bank and find that both tellers are busy but that
nobody else is waiting to be served. You are served by the first available teller once he/she
becomes free. Let the random variable Y denote your waiting time. Find the pdf of Y .
Solution: First observe that Y = min(X1, X2). Since X1 and X2 are independent,

P{Y > y} = P{X1 > y, X2 > y} = P{X1 > y} P{X2 > y} = e^{−λ1 y} e^{−λ2 y} = e^{−(λ1 + λ2) y}

for y ≥ 0. Hence Y ∼ Exp(λ1 + λ2), i.e.,

fY(y) =  (λ1 + λ2) e^{−(λ1 + λ2) y},   y ≥ 0,
         0,                            otherwise.
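A small simulation sketch (with arbitrarily chosen rates 0.5 and 1.5) illustrates that min(X1, X2) is exponential with rate λ1 + λ2.

```python
import numpy as np

lam1, lam2, trials = 0.5, 1.5, 1_000_000  # arbitrary example rates
rng = np.random.default_rng(4)

y = np.minimum(rng.exponential(1 / lam1, trials),
               rng.exponential(1 / lam2, trials))

# The mean of Exp(lam1 + lam2) is 1 / (lam1 + lam2).
print(y.mean(), 1 / (lam1 + lam2))

# Compare the empirical survival function with exp(-(lam1 + lam2) t) at one point.
t = 0.7
print(np.mean(y > t), np.exp(-(lam1 + lam2) * t))
```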
Additional Exercises
Do not turn in solutions to these problems.
Solution:
(a) Recall that the probability of any event A ⊆ X is given by P{X ∈ A} = ∑_{x∈A∩X} pX(x).
Because of the independence of X and Y , we have
P{X ∈ A, Y ∈ B} = ∑_{x∈A∩X} ∑_{y∈B∩Y} pX,Y(x, y)
                = ∑_{x∈A∩X} ∑_{y∈B∩Y} pX(x) pY(y)
                = ( ∑_{x∈A∩X} pX(x) ) ( ∑_{y∈B∩Y} pY(y) )
                = P{X ∈ A} P{Y ∈ B}.
(b) Now let U = g(X) and V = h(Y). For any u and v, define Ax = {x : g(x) ≤ u} and By = {y : h(y) ≤ v}. Then, by part (a),

FU,V(u, v) = P{X ∈ Ax, Y ∈ By}
           = P{X ∈ Ax} P{Y ∈ By}
           = P{g(X) ≤ u} P{h(Y) ≤ v}
           = P{U ≤ u} P{V ≤ v}
           = FU(u) FV(v),

so U and V are independent.
2. Family planning. Alice and Bob choose a number X at random from the set {2, 3, 4} (so the
outcomes are equally probable). If the outcome is X = x, they decide to have children until
they have a girl or x children, whichever comes first. Assume that each child is a girl with
probability 1/2 (independent of the number of children and gender of other children). Let Y
be the number of children they will have.
(a) Find the conditional pmf pY |X (y|x) for all possible values of x and y.
(b) Find the pmf of Y .
Solution:

(a) Given X = x, Alice and Bob stop at the first girl or after x children, whichever comes first. Hence

pY|X(y|x) =  (1/2)^y,       y = 1, . . . , x − 1,
             (1/2)^{x−1},   y = x,

for x = 2, 3, 4.

(b) By the law of total probability, pY(y) = ∑_{x=2}^{4} pX(x) pY|X(y|x) = (1/3) ∑_{x=2}^{4} pY|X(y|x), which gives

pY(1) = 1/2,   pY(2) = 1/3,   pY(3) = 1/8,   pY(4) = 1/24.
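The pmf of Y can also be reproduced by direct enumeration with exact fractions, as a check on the calculation above.

```python
from fractions import Fraction
from collections import defaultdict

p_y = defaultdict(Fraction)

for x in (2, 3, 4):                       # X uniform on {2, 3, 4}
    p_x = Fraction(1, 3)
    for y in range(1, x + 1):
        if y < x:
            p_y_given_x = Fraction(1, 2) ** y        # first girl on child y
        else:
            p_y_given_x = Fraction(1, 2) ** (x - 1)  # stopped at x children
        p_y[y] += p_x * p_y_given_x

print(dict(p_y))  # pY(1)=1/2, pY(2)=1/3, pY(3)=1/8, pY(4)=1/24
```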
Can G be a joint cdf for a pair of random variables? Justify your answer.
Solution: If G(x, y) were a joint cdf, its marginal would be FX(x) = lim_{y→∞} G(x, y) = 1 for all x, whereas every cdf must satisfy lim_{x→−∞} FX(x) = 0 ≠ 1. Therefore G(x, y) is not a cdf.

Alternatively, assume that G(x, y) is a joint cdf for X and Y; then the probability of some rectangle {a < X ≤ b, c < Y ≤ d} computed from G turns out to be negative. But this violates the property that the probability of any event must be nonnegative.
Figure 2: The conditional pdfs fY|S(y|−1), fY|S(y|0), and fY|S(y|1).
4. Let the signal S take the values −1, 0, and +1 with equal probability. The signal is sent over a channel with additive Laplacian noise Z, i.e., Z is a Laplacian random variable with pdf
fZ(z) = (λ/2) e^{−λ|z|},   −∞ < z < ∞.
The signal S and the noise Z are assumed to be independent and the channel output is their
sum Y = S + Z.
(a) Find fY |S (y|s) for s = −1, 0, +1 . Sketch the conditional pdfs on the same graph.
(b) Find the optimal decoding rule D(Y ) for deciding whether S is −1, 0 or +1. Give your
answer in terms of ranges of values of Y .
(c) Find the probability of decoding error for D(y) in terms of λ.
Solution:
(a) We use a trick here that is used several times in the lecture notes. Since Y = S + Z and
Z and S are independent, the conditional pdf is
fY|S(y|s) = fZ(y − s) = (λ/2) e^{−λ|y−s|}.
(b) The optimal decoding rule is MAP: D(y) = s where s maximizes
p(s|y) = f(y|s) p(s) / f(y).
Since pS (s) is the same for s = −1, 0, +1, the MAP rule becomes the maximum-likelihood
decoding rule: D(y) = arg max_s f(y|s). The conditional pdfs are plotted in Figure 2. By inspection, the ML rule reduces to
D(y) =  −1,   y < −1/2,
         0,   −1/2 < y < 1/2,
        +1,   y > 1/2.

(c) Given S = 0, an error occurs iff |Z| > 1/2, which has probability e^{−λ/2}. Given S = −1 (respectively S = +1), an error occurs iff Z > 1/2 (respectively Z < −1/2), each with probability (1/2) e^{−λ/2}. Hence

Pe = (1/3)( e^{−λ/2} + (1/2) e^{−λ/2} + (1/2) e^{−λ/2} ) = (2/3) e^{−λ/2}.
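As an illustrative check of the ML rule and the error probability in part (c), here is a Monte Carlo sketch with an arbitrarily chosen noise parameter λ = 2.

```python
import numpy as np

lam, trials = 2.0, 1_000_000       # arbitrary noise parameter for illustration
rng = np.random.default_rng(5)

s = rng.choice([-1, 0, 1], size=trials)                 # equally likely signal values
z = rng.laplace(loc=0.0, scale=1 / lam, size=trials)    # Laplacian noise with rate lam
y = s + z

# ML rule from the solution: thresholds at -1/2 and +1/2.
d = np.where(y < -0.5, -1, np.where(y < 0.5, 0, 1))

print(np.mean(d != s))             # empirical error probability
print((2 / 3) * np.exp(-lam / 2))  # closed form (2/3) e^{-lam/2}
```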
5. Signal or no signal. Consider a communication system that is operated only from time to
time. When the communication system is in the “normal” mode (denoted by M = 1), it
transmits a random signal S = X with
X =  +1,   with probability 1/2,
     −1,   with probability 1/2.
When the system is in the “idle” mode (denoted by M = 0), it does not transmit any signal
(S = 0). Both normal and idle modes occur with equal probability. Thus
S =  X,   with probability 1/2,
     0,   with probability 1/2.
The receiver observes Y = S + Z, where the noise Z is uniformly distributed over [−1, 1] and independent of S.
(a) Find and sketch the conditional pdf fY |M (y|1) of the receiver observation Y given that
the system is in the normal mode.
(b) Find and sketch the conditional pdf fY |M (y|0) of the receiver observation Y given that
the system is in the idle mode.
(c) Find the optimal decoder D(y) for deciding whether the system is normal or idle. Provide
the answer in terms of intervals of y.
(d) Find the associated probability of error.
Solution:
(a) If M = 1,

Y =  1 + Z,    with probability 1/2,
     −1 + Z,   with probability 1/2.
Hence, we have

fY|M(y|1) =  (1/2) fZ(y − 1) + (1/2) fZ(y + 1) = 1/4,   −2 ≤ y ≤ 2,
             0,                                          otherwise.
(b) If M = 0, Y = Z, so

fY|M(y|0) =  fZ(y) = 1/2,   −1 ≤ y ≤ 1,
             0,             otherwise.
(c) Since both modes are equally likely, the optimal MAP decoding rule reduces to the ML
rule, in which
D(y) =  0   if fY|M(y|0) > fY|M(y|1),
        1   otherwise

     =  0   if −1 < y < 1,
        1   otherwise.
(d) Given M = 0, Y = Z always lies in (−1, 1), so the decoder is correct with probability 1. Given M = 1, an error occurs iff −1 < Y < 1, which has probability ∫_{−1}^{1} (1/4) dy = 1/2. Hence

Pe = (1/2) · 0 + (1/2) · (1/2) = 1/4.
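Finally, a Monte Carlo sketch of the decision rule in part (c) gives an error rate close to the 1/4 found in part (d); the setup below mirrors the problem statement.

```python
import numpy as np

rng = np.random.default_rng(6)
trials = 1_000_000

m = rng.integers(0, 2, trials)                         # 0 = idle, 1 = normal mode
x = rng.choice([-1, 1], size=trials)                   # signal value in normal mode
s = np.where(m == 1, x, 0)
z = rng.uniform(-1, 1, trials)                         # Unif[-1, 1] channel noise
y = s + z

d = np.where(np.abs(y) < 1, 0, 1)                      # decision rule from part (c)
print(np.mean(d != m))                                 # about 1/4
```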