
Decoding of Convolutional Codes

Decoding Convolutional Codes


• Given the received code word r, determine the most likely path through the trellis.
  - Compare r with the code bits associated with each path.
  - Pick the path whose code bits are "closest" to r.
  - Distance can be measured using either Hamming distance or squared Euclidean distance.
  - Once the most likely path has been selected, the estimated data bits can be read from the trellis diagram.
  - This is called Maximum Likelihood (ML) decoding: it chooses the most likely path through the trellis.

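To make the notion of "closest" concrete, here is a minimal sketch (mine, not from the slides) of the two distance measures just mentioned; the function names are illustrative only.

    def hamming_distance(a, b):
        """Count the positions where two equal-length bit sequences differ."""
        return sum(ai != bi for ai, bi in zip(a, b))

    def squared_euclidean_distance(a, b):
        """Squared Euclidean distance between two real-valued sequences
        (the natural metric for soft-decision matched-filter outputs)."""
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    # Example: the received word and one candidate code sequence from the
    # example worked later in these slides differ in 3 of 12 positions.
    r = [1, 1, 1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
    c = [1, 1, 1, 0, 1, 0, 0, 0, 0, 1, 1, 1]
    print(hamming_distance(r, c))   # 3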


Maximum likelihood decoding

• Find a transmitted sequence c such that P(r | c) is maximized.
• r : input of the decoder
  - {0, 1} if hard-decision decoding
  - Otherwise, the matched-filter output if soft-decision decoding
• Note: MAP decoding for a binary symbol:
  If P(S1 | r) > P(S2 | r), pick S1 as the transmitted symbol.
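
For hard-decision decoding over a binary symmetric channel, maximizing P(r | c) is the same as minimizing the Hamming distance whenever the crossover probability p is below 1/2. A quick numerical check (a sketch of mine, using the p = 0.25 value from the later example):

    # For a BSC, P(r | c) = p**d * (1 - p)**(N - d), where d is the Hamming
    # distance between r and c and N is the codeword length.  For p < 0.5 the
    # likelihood shrinks as d grows, so the ML codeword is the closest one.
    p, N = 0.25, 12
    for d in range(4):
        print(f"d = {d}:  P(r|c) = {p**d * (1 - p)**(N - d):.6f}")
    # Each additional bit error multiplies the likelihood by p/(1-p) = 1/3.
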
Decoding Algorithms:
(1) Brute-Force Method

• Brute-force approach:
  - There are 2^L distinct paths through the trellis, where L is the number of data bits.
  - Each path corresponds to a sequence of code bits.
  - Simply compare the received code word r against the 2^L distinct code sequences.
  - Very complex for L > 15.
  - Easy concept to understand.
  - Guaranteed to give the ML solution.
  - Not used in practice.
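
A brute-force decoder is short to write even though it is exponentially expensive. The sketch below assumes the rate-1/2, K = 3 encoder used in the later example (the generator polynomials, 5 and 7 in octal, are my guess, chosen so the output matches the example codeword); the function names are mine.

    from itertools import product

    def conv_encode(bits):
        """Rate-1/2, K=3 encoder (assumed generators 5 and 7 octal).
        `bits` should already include the m = 2 flush zeros."""
        s1 = s2 = 0                       # shift-register contents
        out = []
        for b in bits:
            out += [b ^ s2, b ^ s1 ^ s2]  # two code bits per input bit
            s1, s2 = b, s1
        return out

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    def brute_force_decode(r, L, m=2):
        """Compare r against all 2**L terminated code sequences; keep the closest."""
        best = None
        for data in product([0, 1], repeat=L):
            d = hamming(r, conv_encode(list(data) + [0] * m))
            if best is None or d < best[0]:
                best = (d, list(data))
        return best[1]

    r = [1, 1, 1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
    print(brute_force_decode(r, L=4))     # [1, 1, 0, 1]
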
Decoding Algorithms:
(2) Sequential Decoding

• Sequential decoding:
  - Similar to the brute-force approach in that we compare r against many distinct paths.
  - However, we only consider the most likely paths: if several consecutive bits in a path are very far from r, we don't consider it further.
  - The number of remaining paths is far fewer than 2^L.
  - Much less complex than the brute-force method.
  - However, not guaranteed to find the ML solution.
  - No longer used extensively in practice.
  - Occasionally used for codes with very large constraint length K (K ≈ 32).
  - Not effective for smaller values of K.

Decoding Algorithms:
(3) The Viterbi Algorithm

• Viterbi algorithm:
  - A breakthrough in communications in the late 1960s.
  - Like brute force, it is guaranteed to find the ML solution.
  - However, the complexity is only O(2^K) per trellis stage, so it grows linearly with L rather than exponentially.
  - Takes advantage of the structure of the trellis: it goes through the trellis one stage at a time, finds the most likely path leading into each state, and discards all other paths leading into that state.
  - Easily implemented in hardware.
  - Used in satellites, cell phones, modems, etc. (example: the Qualcomm Q1900).
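
The structure being exploited is just the encoder's state-transition (trellis) diagram. Here is a minimal sketch of that table for a 4-state, rate-1/2, K = 3 code; the generator polynomials (5 and 7 in octal) are my assumption, chosen to match the example later in these slides.

    # State number (s1 s2 in binary) holds the two previous input bits.
    def branch(state, bit):
        """Return (next_state, output_bits) for one trellis branch."""
        s1, s2 = state >> 1, state & 1
        output = (bit ^ s2, bit ^ s1 ^ s2)   # the n = 2 code bits on this branch
        return (bit << 1) | s1, output

    # The decoder builds this table once and reuses it at every trellis stage.
    for state in range(4):
        for bit in (0, 1):
            nxt, out = branch(state, bit)
            print(f"state {state:02b} --input {bit} / output {out[0]}{out[1]}--> state {nxt:02b}")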


Efficiency of Viterbi decoding

• Identifies the path through the trellis that is closest in Hamming distance to the received sequence.
• The total number of paths grows exponentially with the number of symbols.
• Viterbi decoding maintains only one "survivor" path for each state, regardless of the number of symbols.

System Model:
Hard Decision Decoding

[Block diagram: L data bits x enter a rate r = k/n convolutional encoder, producing (L+m)/r code bits c. A BPSK modulator maps c to s(t), which passes through an AWGN channel that adds noise n(t). A BPSK detector makes hard decisions on r(t), giving r ∈ {0, 1}, and the Viterbi decoder outputs the estimated data bits x̂.]

The modulator, AWGN channel, and hard-decision detector together form a Binary Symmetric Channel: r = c ⊕ e, where the error indicator e = 0 (no error) with probability 1 - p and e = 1 (error) with probability p, and the crossover probability is p = Q(√(2·r·Eb/N0)).
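
A quick sketch of the crossover-probability formula, using Q(x) = 0.5·erfc(x/√2); the Eb/N0 value below is made up purely for illustration.

    import math

    def qfunc(x):
        """Gaussian Q-function: Q(x) = 0.5 * erfc(x / sqrt(2))."""
        return 0.5 * math.erfc(x / math.sqrt(2))

    def bsc_crossover(rate, ebno_db):
        """p = Q(sqrt(2 * r * Eb/No)) for hard-decision BPSK over AWGN."""
        ebno = 10 ** (ebno_db / 10)          # dB -> linear
        return qfunc(math.sqrt(2 * rate * ebno))

    print(bsc_crossover(rate=0.5, ebno_db=1.0))   # illustrative operating point only
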
Example Sequence

• Let the input to the convolutional encoder be:
  x = [ 1 1 0 1 0 0 ]
• Then the output of the encoder is:
  c = [ 1 1 1 0 1 0 0 0 0 1 1 1 ]
• Suppose p = 0.25 and the error indicator is:
  e = [ 0 0 0 1 0 0 0 1 0 0 0 1 ]
• Thus the output of the BSC is:
  r = [ 1 1 1 1 1 0 0 1 0 1 1 0 ]
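
These numbers can be reproduced in a couple of lines, since the BSC output is simply r = c ⊕ e:

    c = [1, 1, 1, 0, 1, 0, 0, 0, 0, 1, 1, 1]
    e = [0, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1]
    r = [ci ^ ei for ci, ei in zip(c, e)]
    print(r)   # [1, 1, 1, 1, 1, 0, 0, 1, 0, 1, 1, 0] -- three code bits are flipped
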
Step 0:
Initialization

• We will store a path metric at each node in the trellis.
  - Zj(i) is the metric stored in state j at time i.
  - Small metrics correspond to likely paths; large metrics correspond to unlikely paths.
• Initialize the trellis:
  - Z0(0) = 0
  - Zj(0) = ∞ for all j > 0
  - This is because we always start in state 0.
• Place the received code word at the bottom of the trellis.
  - n bits per time instant.
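
A minimal sketch of this initialization for the 4-state example code (the variable names are mine):

    import math

    NUM_STATES = 4            # 2**(K-1) states for the K = 3 example code
    STAGES = 6                # L + m trellis stages (data bits plus flush bits)

    # Z[i][j] is the path metric of state j at time i; we always start in state 0.
    Z = [[math.inf] * NUM_STATES for _ in range(STAGES + 1)]
    Z[0][0] = 0
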
Initialized Trellis

[Trellis diagram: states Z3, Z2, Z1, Z0 over time instants 0 through 6, with Z0(0) = 0 and Z1(0) = Z2(0) = Z3(0) = ∞. The received code word r = [ 1 1 1 1 1 0 0 1 0 1 1 0 ] is written beneath the trellis.]

Step 1(a):
Branch Metric Calculation

• Compute a metric for each branch connecting states at time 0 to states at time 1.
  - There are 2^k branches leaving each state.
• The branch metric is the Hamming distance between the received bits and the code bits assigned to that branch.
  - i.e. if the bits associated with that branch had been transmitted, how many errors would have occurred in the BSC?
• Label every branch with its metric.
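
For hard-decision decoding the branch metric is just a Hamming distance over the n code bits of one time instant. A small sketch (my function names), checked against the labels on the next slide's trellis:

    def branch_metric(received_bits, branch_bits):
        """Hamming distance between the n received bits at this time instant
        and the n code bits labelling the branch."""
        return sum(r != b for r, b in zip(received_bits, branch_bits))

    # At time 1 the received pair is (1, 1):
    print(branch_metric((1, 1), (1, 1)))   # 0  (branches labelled 11)
    print(branch_metric((1, 1), (0, 1)))   # 1  (branches labelled 01 or 10)
    print(branch_metric((1, 1), (0, 0)))   # 2  (branches labelled 00)
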
Branch Metric Calculation at time 1

[Trellis diagram at time 1: each of the eight branches is labelled with its code bits and its Hamming distance from the first received pair r = (1, 1); branches labelled 11 get metric 0, branches labelled 01 or 10 get metric 1, and branches labelled 00 get metric 2. The received code word r = [ 1 1 1 1 1 0 0 1 0 1 1 0 ] is shown beneath the trellis.]

Step 1(b):
Path Metric Calculation

• For each branch connecting states at time 0 with states at time 1:
  - Add the branch metric to the partial path metric of the state (at time 0) that the branch is connected to.
• Label each branch with the corresponding path metric.

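The "add" step is a single addition per branch. A tiny sketch of mine, using the time-1 values from the example (only state 0 has a finite metric at time 0):

    import math

    def candidate_path_metric(prev_state_metric, branch_metric):
        """Add step: metric of the state the branch leaves plus its branch metric."""
        return prev_state_metric + branch_metric

    print(candidate_path_metric(0, 0))         # branch 11 out of state 0 -> candidate 0
    print(candidate_path_metric(0, 2))         # branch 00 out of state 0 -> candidate 2
    print(candidate_path_metric(math.inf, 1))  # branches leaving unreachable states stay infinite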


Path Metric Calculation at time 1

[Trellis diagram at time 1: each branch is labelled with its candidate path metric, i.e. the metric of the state it leaves plus its branch metric. Branches leaving Z1(0), Z2(0), and Z3(0) inherit metric ∞; the two branches leaving Z0(0) carry metrics 2 (branch 00) and 0 (branch 11). The received code word r = [ 1 1 1 1 1 0 0 1 0 1 1 0 ] is shown beneath the trellis.]

Step 1(c):
Trellis Update

• Each node has two branches entering it.
• Pick the most probable branch.
  - The most probable path leading into a node is the one with the lowest metric.
• Maintain the path with the smaller metric.
  - Delete the path with the larger metric.
  - If there is a tie, randomly pick between the two paths.
• Update the trellis at time 1 with the lower of the two path metrics.

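The compare-select half of the update is a minimum over the branches entering a state. A sketch of mine, with made-up candidate values:

    def select_survivor(candidates):
        """Keep the entering branch with the smallest candidate path metric
        (a tie may be broken either way).  `candidates` maps each predecessor
        state to the candidate metric of the branch coming from it."""
        best_prev = min(candidates, key=candidates.get)
        return best_prev, candidates[best_prev]

    # Two branches enter a state with candidate metrics 3 and 1:
    print(select_survivor({0: 3, 2: 1}))   # (2, 1): the path arriving from state 2 survives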


Trellis Update at time 1

[Trellis diagram after the update at time 1: the surviving path metrics are Z0(1) = 2, Z1(1) = 0, Z2(1) = ∞, and Z3(1) = ∞; the losing branch into each state has been deleted. The received code word r = [ 1 1 1 1 1 0 0 1 0 1 1 0 ] is shown beneath the trellis.]

Forward Sweep Through Trellis

• Now increment time: i = i + 1.
• Repeat steps 1(a) through 1(c):
  - 1(a) Calculate the branch metrics
  - 1(b) Calculate the path metrics
  - 1(c) Update the trellis
• Continue until the end of the trellis is reached.
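
Putting steps 1(a) through 1(c) into a loop gives the whole forward sweep. The sketch below is mine and self-contained; it uses the assumed 4-state example encoder (generators 5 and 7 octal) and the received word from these slides, and it records each state's surviving predecessor so that the traceback described later can recover the path.

    import math

    R = [1, 1, 1, 1, 1, 0, 0, 1, 0, 1, 1, 0]    # received word from the example
    N, NUM_STATES = 2, 4                        # n code bits per stage, 2**(K-1) states

    def branch(state, bit):
        """Next state and output bits of the assumed rate-1/2, K=3 encoder."""
        s1, s2 = state >> 1, state & 1
        return (bit << 1) | s1, (bit ^ s2, bit ^ s1 ^ s2)

    stages = len(R) // N
    Z = [[math.inf] * NUM_STATES for _ in range(stages + 1)]
    Z[0][0] = 0                                 # step 0: always start in state 0
    survivor = [[None] * NUM_STATES for _ in range(stages + 1)]

    for i in range(stages):                     # forward sweep, one stage at a time
        r_i = R[N * i: N * (i + 1)]             # received bits for this stage
        for state in range(NUM_STATES):
            if Z[i][state] == math.inf:
                continue                        # state not reachable yet
            for bit in (0, 1):
                nxt, out = branch(state, bit)
                bm = sum(x != y for x, y in zip(r_i, out))    # 1(a) branch metric
                metric = Z[i][state] + bm                     # 1(b) path metric
                if metric < Z[i + 1][nxt]:                    # 1(c) compare-select
                    Z[i + 1][nxt] = metric
                    survivor[i + 1][nxt] = (state, bit)

    print(Z[stages][0])   # metric of the final all-zeros state: 3 for this example
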
Forward Sweep Through Trellis

[Trellis diagram: the forward sweep repeated for all six stages, with each branch labelled by its code bits and metric and each state labelled with its surviving path metric. The received code word r = [ 1 1 1 1 1 0 0 1 0 1 1 0 ] is shown beneath the trellis.]

Traceback

• We assume that the encoder ended in the all-zeros state.
• The most probable path therefore leads into the last all-zeros state in the trellis, Z0(L+m).
  - We can trace this path from right to left.
  - Once the ML path has been identified, we can read the data bits from the trellis: the trellis tells us the estimated data.

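Combining the forward sweep with the traceback gives a complete hard-decision Viterbi decoder. This is a minimal self-contained sketch of mine for the assumed 4-state example code (generators 5 and 7 octal, names my own); run on the received word from the example, it recovers the original data bits despite the three channel errors.

    import math

    def viterbi_decode(r, n=2, num_states=4, m=2):
        """Hard-decision Viterbi decoder for the assumed rate-1/2, K=3 example code;
        returns the estimated data bits with the m flush bits removed."""
        def branch(state, bit):
            s1, s2 = state >> 1, state & 1
            return (bit << 1) | s1, (bit ^ s2, bit ^ s1 ^ s2)

        stages = len(r) // n
        Z = [[math.inf] * num_states for _ in range(stages + 1)]
        Z[0][0] = 0
        survivor = [[None] * num_states for _ in range(stages + 1)]

        # Forward sweep: steps 1(a)-1(c) at every stage.
        for i in range(stages):
            r_i = r[n * i: n * (i + 1)]
            for state in range(num_states):
                if Z[i][state] == math.inf:
                    continue
                for bit in (0, 1):
                    nxt, out = branch(state, bit)
                    metric = Z[i][state] + sum(x != y for x, y in zip(r_i, out))
                    if metric < Z[i + 1][nxt]:
                        Z[i + 1][nxt] = metric
                        survivor[i + 1][nxt] = (state, bit)

        # Traceback: start from the final all-zeros state Z0(L+m) and walk right
        # to left, reading off the input bit stored on each surviving branch.
        state, bits = 0, []
        for i in range(stages, 0, -1):
            prev_state, bit = survivor[i][state]
            bits.append(bit)
            state = prev_state
        bits.reverse()
        return bits[:stages - m]          # drop the m flush bits

    r = [1, 1, 1, 1, 1, 0, 0, 1, 0, 1, 1, 0]
    print(viterbi_decode(r))              # [1, 1, 0, 1] -- the original data bits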
