
Information Theory COMM 1003

Spring 2017

Tutorial 3

Problem 1

Processing the output. One is given a communication channel with transition probabilities p(y|x) and
channel capacity C = max_{p(x)} I(X; Y). Someone processes the output by forming Y* = g(Y). He claims that
this will strictly improve the capacity.
(a) Show that he is wrong.
(b) Under what conditions does he not strictly decrease the capacity?

Answer:

a. Since X → Y → Y* forms a Markov chain, the data-processing inequality (from the lecture on
Fano's lemma) tells us that the processing Y* = g(Y) can only reduce the mutual information:

I(X; Y*) ≤ I(X; Y)

Maximizing both sides over p(x) shows that the capacity cannot strictly improve.

b. Equality holds whenever the processing loses no information about X, i.e., whenever
X → Y* → Y is also a Markov chain. In particular, any one-to-one (invertible) g, such as the
identity g(Y) = Y, does not strictly decrease the capacity.
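
A minimal numerical sanity check (a sketch, not part of the tutorial; it assumes NumPy, a BSC with
crossover probability 0.1, a uniform input, and illustrative helper names):

import numpy as np

def mutual_information(px, T):
    """I(X;Y) in bits for input distribution px and channel matrix T[x, y] = p(y|x)."""
    pxy = px[:, None] * T                       # joint distribution p(x, y)
    py = pxy.sum(axis=0)                        # output marginal p(y)
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log2(pxy[mask] / (px[:, None] * py)[mask]))

px = np.array([0.5, 0.5])                       # uniform input
T = np.array([[0.9, 0.1],                       # BSC with crossover p = 0.1
              [0.1, 0.9]])

# Deterministic post-processing Y* = g(Y), written as a 0/1 matrix G[y, y*];
# the channel from X to Y* is then T @ G.
G_merge = np.array([[1.0, 0.0],                 # g maps both outputs to y* = 0
                    [1.0, 0.0]])
G_flip = np.array([[0.0, 1.0],                  # g is one-to-one (a bit flip)
                   [1.0, 0.0]])

print(mutual_information(px, T))                # I(X;Y)  ~ 0.531
print(mutual_information(px, T @ G_merge))      # I(X;Y*) = 0.0 <= I(X;Y)
print(mutual_information(px, T @ G_flip))       # I(X;Y*) = I(X;Y): invertible g loses nothing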

Problem 2

Additive noise channel. Find the channel capacity of the following discrete memoryless channel:

Y = X + Z

IMPORTANT: where Pr{Z = 0} = Pr{Z = a} = 1/2.

The alphabet for X is X = {0, 1}.

Assume that Z is independent of X.

Observe that the channel capacity depends on the value of a.

Answer:

This is a sum channel:

Y = X + Z;  X ∈ {0, 1};  Z ∈ {0, a}

We have to distinguish between various cases depending on the values of a:

If a = 0, then Z = 0 with probability 1. In this case Y = X, and C = max {I(X;Y)} = max {H(X)} = 1 bit
per transmission.
If a ≠ 0, +1, −1, then Z = 0 (50%) or Z = a (50%), and the output alphabet is Y ∈ {0, 1, a, 1+a},
four distinct values. This means we can always tell whether 0 or 1 was sent: if Y = 0 or Y = a, then
X was (strictly) 0; if Y = 1 or Y = 1+a, then X was (strictly) 1. The channel is lossless and C = 1 (as before).

If a = +1, then Z = 0 or 1, and Y ∈ {0, 1, 2}, where the output 1 is ambiguous: it can come from
X = 0 with Z = 1 or from X = 1 with Z = 0. Marking Y = 1 as an erasure E, we can only be certain that
0 was transmitted when Y = 0, or that 1 was transmitted when Y = 2. This is a binary erasure channel
with erasure probability 1/2, so C = 1 − Pr{erasure} = 1 − 1/2 = 1/2.
If a = −1, this is symmetric to the previous case (now Y ∈ {−1, 0, 1} with Y = 0 ambiguous), and again C = 1/2.
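
All four cases can be checked numerically (a sketch, not part of the tutorial; it assumes NumPy,
and the helper name capacity_uniform is illustrative — a uniform input is capacity-achieving in
every case here by symmetry):

import numpy as np

def capacity_uniform(a):
    """I(X;Y) in bits for Y = X + Z, uniform X in {0,1}, Z uniform on {0,a}."""
    outputs = sorted({0, 1, a, 1 + a})          # coinciding output values merge
    col = {y: i for i, y in enumerate(outputs)}
    T = np.zeros((2, len(outputs)))             # T[x, y] = p(y | x)
    for x in (0, 1):
        for z in (0, a):
            T[x, col[x + z]] += 0.5
    px = np.array([0.5, 0.5])
    pxy = px[:, None] * T                       # joint p(x, y)
    py = pxy.sum(axis=0)                        # marginal p(y)
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log2(pxy[mask] / (px[:, None] * py)[mask]))

for a in (0, 0.3, 1, -1):
    print(a, capacity_uniform(a))               # -> 1.0, 1.0, 0.5, 0.5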

Problem 3

Cascade of binary symmetric channels (Optical Fiber Channel): Show that a cascade of n
identical independent binary symmetric channels (see figure)

each with raw error probability p, is equivalent to a single BSC with error probability
Pe = (1/2)(1 − (1 − 2p)^n), and hence
show that lim_{n→∞} I(X₀; Xₙ) = 0 if p lies in the interval ]0, 1[.
HINT: No encoding or decoding takes place at the intermediate nodes between the cascaded blocks.

Answer:

The conditional probability distribution p(y|x) of each BSC may be expressed by the
transition probability matrix T, given by

T = \begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix}

Since the output of channel one undergoes the error (p) of channel two, and so on, the
transition matrix for n cascaded BSCs is

T_n = T^n = \begin{bmatrix} 1-p & p \\ p & 1-p \end{bmatrix}^n
To solve this, we need to compute the SVD of the transmission matrix (will be given in the formula
sheets):

[U, D, V] = SVD(T)

Then, noting that here U^T = U = U^{-1} (U is symmetric and orthogonal),

T = U D U^T = \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & 1-2p \end{bmatrix} \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} = U \begin{bmatrix} 1 & 0 \\ 0 & 1-2p \end{bmatrix} U

and

T^n = U D^n U = \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & 1-2p \end{bmatrix}^n \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}

= \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} \begin{bmatrix} 1 & 0 \\ 0 & (1-2p)^n \end{bmatrix} \frac{1}{\sqrt{2}}\begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}

Then:

T^n = U D^n U = \begin{bmatrix} \frac{1}{2}\left(1+(1-2p)^n\right) & \frac{1}{2}\left(1-(1-2p)^n\right) \\ \frac{1}{2}\left(1-(1-2p)^n\right) & \frac{1}{2}\left(1+(1-2p)^n\right) \end{bmatrix} = \begin{bmatrix} p_c & p_e \\ p_e & p_c \end{bmatrix}

where p_e + p_c = 1.

Then the equivalent channel is a BSC as well, with error probability p_e, and the capacity is

C = \max_{p(x)} I(X;Y) = 1 - h(p_e) = 1 + p_e \log_2 p_e + p_c \log_2 p_c

Since 0 < p < 1 we have |1-2p| < 1, so (1-2p)^n → 0 and

\lim_{n\to\infty} p_e = \lim_{n\to\infty} p_c = 1/2

Then

\lim_{n\to\infty} C = 1 - h(1/2) = 0
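
The closed form and the limit can be verified numerically (a sketch, not part of the tutorial; it
assumes NumPy, and the helper names bsc and h2 are illustrative):

import numpy as np

def bsc(p):
    """Transition matrix of a binary symmetric channel with crossover probability p."""
    return np.array([[1 - p, p], [p, 1 - p]])

def h2(q):
    """Binary entropy in bits."""
    q = np.clip(q, 1e-15, 1 - 1e-15)            # avoid log2(0)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

p = 0.1
Tn = np.linalg.matrix_power(bsc(p), 5)          # brute-force 5-fold cascade
pe = 0.5 * (1 - (1 - 2 * p) ** 5)               # closed form from the diagonalization
print(np.allclose(Tn, bsc(pe)))                 # True: the cascade is a single BSC(pe)

for n in (1, 5, 20, 100):
    pe = 0.5 * (1 - (1 - 2 * p) ** n)
    print(n, 1 - h2(pe))                        # capacity 1 - h(pe) -> 0 as n grows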

Problem 4

Total distortion channel (Erasure and Binary error). Consider a channel with binary inputs that
has both erasures and errors. Let the probability of error be ε and the probability of erasure be
α, so the channel is as follows:

(a) Find the capacity of this channel.


(b) Specialize to the case of the binary symmetric channel (α = 0).
(c) Specialize to the case of the binary erasure channel (𝜖 = 0).

Answer:

(a) Find the capacity of this channel:

We have the following channel transition matrix P(Y|X):

T = \begin{bmatrix} 1-\alpha-\epsilon & \alpha & \epsilon \\ \epsilon & \alpha & 1-\alpha-\epsilon \end{bmatrix}

C = H(Y) − H(Y|X)

We can rewrite the transmission matrix as a table, denoting the erasure output by E (to avoid
confusion with the error probability ε):

X \ Y      0          E      1
0          1−α−ε      α      ε
1          ε          α      1−α−ε

The capacity is maximized when the inputs are equally likely (P(x) = 1/2), by the symmetry of the channel:

P(Y = 0) = (1/2)(1 − α − ε) + (1/2)ε = (1/2)(1 − α)  (sum of the first column × P(x) [total probability theorem])
P(Y = E) = (1/2)α + (1/2)α = α  (sum of the second column × P(x))
P(Y = 1) = (1/2)ε + (1/2)(1 − α − ε) = (1/2)(1 − α)  (sum of the third column × P(x))

H(Y) = \sum_i P(Y = y_i) \log_2 \frac{1}{P(Y = y_i)} = -(1-\alpha) \log_2 \frac{1-\alpha}{2} - \alpha \log_2 \alpha
and
H(Y|X) = \sum_j P(X = x_j) H(Y|X = x_j) = P(X = 0) H(Y|X = 0) + P(X = 1) H(Y|X = 1)

= \frac{1}{2}\left[ H(Y|X = 1) + H(Y|X = 0) \right]

= -\left\{ (1-\alpha-\epsilon) \log_2 (1-\alpha-\epsilon) + \epsilon \log_2 \epsilon + \alpha \log_2 \alpha \right\}

Carrying the math through (the α log₂ α terms cancel):

C = H(Y) - H(Y|X) = (1-\alpha) - (1-\alpha)\log_2(1-\alpha) + (1-\alpha-\epsilon)\log_2(1-\alpha-\epsilon) + \epsilon \log_2 \epsilon = (1-\alpha)\left[1 - h\left(\frac{\epsilon}{1-\alpha}\right)\right]

where h(·) is the binary entropy function.

(b) Specialize to the case of the binary symmetric channel (α = 0):

With no erasures this is exactly the BSC, and the formula gives C = 1 − h(ε).

(c) Specialize to the case of the binary erasure channel (ε = 0):

With no errors this is exactly the BEC, and the formula gives C = 1 − α.
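
The general formula and both special cases can be checked numerically (a sketch, not part of the
tutorial; it assumes NumPy, and the helper names h2 and capacity are illustrative):

import numpy as np

def h2(q):
    """Binary entropy in bits."""
    q = np.clip(q, 1e-15, 1 - 1e-15)
    return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

def capacity(alpha, eps):
    """C = H(Y) - H(Y|X) with uniform input (optimal by symmetry)."""
    py = np.array([(1 - alpha) / 2, alpha, (1 - alpha) / 2])  # P(Y=0), P(Y=E), P(Y=1)
    row = np.array([1 - alpha - eps, alpha, eps])             # p(y|x), either row by symmetry
    HY = -np.sum(py[py > 0] * np.log2(py[py > 0]))
    HYX = -np.sum(row[row > 0] * np.log2(row[row > 0]))
    return HY - HYX

alpha, eps = 0.2, 0.05
print(capacity(alpha, eps))                                   # general case
print((1 - alpha) * (1 - h2(eps / (1 - alpha))))              # closed form, same value
print(capacity(0.0, eps), 1 - h2(eps))                        # (b) BSC: C = 1 - h(eps)
print(capacity(alpha, 0.0), 1 - alpha)                        # (c) BEC: C = 1 - alpha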
