CHAPTER 5: Signal Space Analysis
Outline
5.1 Introduction
5.2 Geometric Representation of Signals
Gram-Schmidt Orthogonalization Procedure
5.3 Conversion of the AWGN into a Vector Channel
5.4 Maximum Likelihood Decoding
5.5 Correlation Receiver
5.6 Probability of Error
Transmitter Side
Symbol (message) generation is probabilistic, with a priori probabilities p1, p2, ..., pM. If the symbols are equally likely, the probability that symbol mi will be emitted is

$p_i = P(m_i) = \frac{1}{M}, \qquad i = 1, 2, \ldots, M$  (5.1)

Each signal si(t) has duration T and finite energy

$E_i = \int_0^T s_i^2(t)\,dt, \qquad i = 1, 2, \ldots, M$  (5.2)
Channel Assumptions:
The channel is linear, with a bandwidth wide enough to accommodate the signal si(t) with no (or negligible) distortion.
The channel noise w(t) is a zero-mean white Gaussian noise process, added to the transmitted signal (AWGN channel).
The received signal may therefore be expressed as

$x(t) = s_i(t) + w(t), \qquad 0 \le t \le T,\; i = 1, 2, \ldots, M$  (5.3)
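As an illustration of the channel model (5.3), the following minimal sketch simulates one symbol interval on a sampled time grid; the particular waveform, symbol duration, sampling rate and noise density are made-up values chosen only for the demonstration. In discrete time, white Gaussian noise of power spectral density N0/2 is approximated by independent samples of variance N0/(2Δt).

```python
import numpy as np

# Made-up example values (not from the text): symbol duration, sampling rate, noise density
T, fs, N0 = 1e-3, 1e6, 1e-6          # seconds, samples per second, watts/Hz
t = np.arange(0, T, 1/fs)
dt = 1/fs

s_i = np.sqrt(2/T) * np.cos(2*np.pi*5e3*t)   # an arbitrary unit-energy signal s_i(t)

# Discrete-time approximation of zero-mean white Gaussian noise with PSD N0/2:
# each sample has variance (N0/2)/dt so that correlation integrals behave correctly.
w = np.random.normal(0.0, np.sqrt(N0/(2*dt)), size=t.shape)

x = s_i + w                                   # received waveform, Eq. (5.3)
print("signal energy:", np.sum(s_i**2)*dt)    # ~1 for this unit-energy choice
```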
Receiver Side
Observes the received signal x(t) for a duration of T seconds.
Makes an estimate of the transmitted signal si(t) (equivalently, of the symbol mi).
The process is statistical: because of the presence of noise, decision errors occur, so the receiver has to be designed to minimize the average probability of symbol error Pe, defined as
$P_e = \sum_{i=1}^{M} p_i\, P(\hat{m} \neq m_i \mid m_i \text{ sent})$  (5.4)

where pi is the a priori probability of the i-th symbol and the second factor is the conditional probability of error given that the i-th symbol was sent.
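As a tiny numerical illustration of (5.4), assuming made-up a priori probabilities and conditional error probabilities for M = 4 symbols:

```python
import numpy as np

# Made-up values for M = 4 symbols (illustration only)
p = np.array([0.25, 0.25, 0.25, 0.25])        # a priori probabilities p_i
p_err = np.array([0.01, 0.02, 0.02, 0.01])    # P(error | m_i sent)

Pe = np.sum(p * p_err)                        # Eq. (5.4)
print(Pe)                                     # 0.015
```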
5.2 Geometric Representation of Signals
Every energy signal in the set {si(t)} can be expanded over N real-valued orthonormal basis functions φj(t):

$s_i(t) = \sum_{j=1}^{N} s_{ij}\,\phi_j(t), \qquad 0 \le t \le T,\; i = 1, 2, \ldots, M$  (5.5)

where the sij are the expansion coefficients.
Coefficients:

$s_{ij} = \int_0^T s_i(t)\,\phi_j(t)\,dt, \qquad i = 1, 2, \ldots, M,\; j = 1, 2, \ldots, N$  (5.6)

Real-valued basis functions, which are orthonormal:

$\int_0^T \phi_i(t)\,\phi_j(t)\,dt = \delta_{ij} = \begin{cases} 1 & \text{if } i = j \\ 0 & \text{if } i \neq j \end{cases}$  (5.7)
Figure 5.3: (a) Synthesizer for generating the signal si(t). (b) Analyzer for generating the set of signal vectors {si}.
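Below is a minimal numerical sketch of the analyzer and synthesizer of Fig. 5.3, using two hypothetical rectangular orthonormal basis functions on [0, T] (my own choice, not from the text): the analyzer evaluates the correlation integral (5.6) to obtain the coefficients, and the synthesizer rebuilds si(t) from them as in (5.5).

```python
import numpy as np

T, fs = 1.0, 1000
t = np.arange(0, T, 1/fs)
dt = 1/fs

# Two assumed orthonormal basis functions: half-interval rectangular pulses
phi1 = np.where(t < T/2, np.sqrt(2/T), 0.0)
phi2 = np.where(t >= T/2, np.sqrt(2/T), 0.0)
basis = [phi1, phi2]

s = 3.0*phi1 - 1.0*phi2          # an example signal lying in the span of the basis

# Analyzer (Fig. 5.3b): s_j = integral of s(t)*phi_j(t) dt, Eq. (5.6)
coeffs = np.array([np.sum(s*phi)*dt for phi in basis])
print(coeffs)                    # ~[ 3., -1.]

# Synthesizer (Fig. 5.3a): rebuild s(t) from its coefficient vector, Eq. (5.5)
s_hat = sum(c*phi for c, phi in zip(coeffs, basis))
print(np.allclose(s, s_hat))     # True: the vector fully determines the signal
```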
So each signal in the set {si(t)} is completely determined by the vector of its coefficients:

$\mathbf{s}_i = \begin{bmatrix} s_{i1} \\ s_{i2} \\ \vdots \\ s_{iN} \end{bmatrix}, \qquad i = 1, 2, \ldots, M$  (5.8)
Finally, the signal vector si can be viewed as a point in an N-dimensional Euclidean space, extending the familiar 2-D and 3-D picture. This provides the mathematical basis for the geometric representation of energy signals used in the noise analysis, and allows us to define:
the length of a vector (its norm),
the angle between two vectors,
the squared length, i.e. the inner product of si with itself:

$\|\mathbf{s}_i\|^2 = \mathbf{s}_i^T \mathbf{s}_i = \sum_{j=1}^{N} s_{ij}^2, \qquad i = 1, 2, \ldots, M$  (5.9)

where the superscript T denotes matrix transposition.
Also, what is the relation between the vector representation of a signal and its energy? Substituting the expansion (5.5) into the energy definition:

$E_i = \int_0^T \left[\sum_{j=1}^{N} s_{ij}\,\phi_j(t)\right]\left[\sum_{k=1}^{N} s_{ik}\,\phi_k(t)\right] dt = \sum_{j=1}^{N}\sum_{k=1}^{N} s_{ij}\,s_{ik}\int_0^T \phi_j(t)\,\phi_k(t)\,dt$

Since the φj(t) are orthonormal, only the j = k terms survive, and finally we have

$E_i = \sum_{j=1}^{N} s_{ij}^2 = \|\mathbf{s}_i\|^2$  (5.12)

i.e., the energy of a signal is the squared length of its signal vector.
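A quick numerical check of (5.12) under the same kind of assumed orthonormal basis: the waveform energy ∫ si²(t) dt equals the squared length of the coefficient vector.

```python
import numpy as np

T, fs = 1.0, 1000
t = np.arange(0, T, 1/fs); dt = 1/fs
phi1 = np.where(t < T/2, np.sqrt(2/T), 0.0)   # assumed orthonormal basis
phi2 = np.where(t >= T/2, np.sqrt(2/T), 0.0)

s_vec = np.array([3.0, -1.0])                 # signal vector s_i
s_t = s_vec[0]*phi1 + s_vec[1]*phi2           # corresponding waveform s_i(t)

E_waveform = np.sum(s_t**2)*dt                # integral of s_i^2(t) dt
E_vector = np.sum(s_vec**2)                   # sum of s_ij^2 = ||s_i||^2, Eq. (5.12)
print(E_waveform, E_vector)                   # both ~10.0
```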
Euclidean Distance
The Euclidean distance between two points represented by signal vectors si and sk is ||si − sk||, and its squared value is given by

$\|\mathbf{s}_i - \mathbf{s}_k\|^2 = \sum_{j=1}^{N}(s_{ij} - s_{kj})^2 = \int_0^T \big(s_i(t) - s_k(t)\big)^2\,dt$  (5.14)

The angle θik between the two signal vectors follows from their inner product:

$\cos\theta_{ik} = \frac{\mathbf{s}_i^T \mathbf{s}_k}{\|\mathbf{s}_i\|\,\|\mathbf{s}_k\|}$  (5.15)
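A short sketch of (5.14) and (5.15) for two hypothetical signal vectors: the squared distance is computed directly from the coefficients, and the angle follows from the inner product.

```python
import numpy as np

s_i = np.array([1.0, 2.0, 0.5])   # hypothetical signal vectors in an N = 3 space
s_k = np.array([0.0, 1.0, 2.0])

dist_sq = np.sum((s_i - s_k)**2)                                        # Eq. (5.14)
cos_ik = np.dot(s_i, s_k) / (np.linalg.norm(s_i)*np.linalg.norm(s_k))  # Eq. (5.15)
print(dist_sq, np.degrees(np.arccos(cos_ik)))
```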
Schwarz Inequality
Defined as:

$\left[\int_{-\infty}^{\infty} s_1(t)\,s_2(t)\,dt\right]^2 \le \int_{-\infty}^{\infty} s_1^2(t)\,dt \int_{-\infty}^{\infty} s_2^2(t)\,dt$  (5.16)
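And a quick numerical check of (5.16) on two arbitrary sampled waveforms, with the integrals approximated by sums:

```python
import numpy as np

t = np.linspace(0, 1, 1000); dt = t[1] - t[0]
s1 = np.sin(2*np.pi*3*t)                     # arbitrary example waveforms
s2 = np.exp(-t) * np.cos(2*np.pi*5*t)

lhs = (np.sum(s1*s2)*dt)**2                  # (integral of s1(t) s2(t) dt)^2
rhs = np.sum(s1**2)*dt * np.sum(s2**2)*dt    # product of the two energies
print(lhs <= rhs)                            # True, Eq. (5.16)
```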
Gram-Schmidt Orthogonalization Procedure
Assume a set of M energy signals denoted by s1(t), s2(t), ..., sM(t).
We can define the second basis function φ2(t) as

$\phi_2(t) = \frac{g_2(t)}{\sqrt{\int_0^T g_2^2(t)\,dt}}$  (5.23)

Special case: for i = 1, gi(t) reduces to si(t).
General case: gi(t) is the part of si(t) not already represented by the previously found basis functions,

$g_i(t) = s_i(t) - \sum_{j=1}^{i-1} s_{ij}\,\phi_j(t)$

and φi(t) is obtained by normalizing gi(t) to unit energy, as in (5.23).
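A minimal sketch of the Gram-Schmidt procedure applied to sampled waveforms; the three example signals are invented for the demonstration. Each gi(t) is the part of si(t) not representable by the previously found basis functions, normalized to unit energy as in (5.23).

```python
import numpy as np

T, fs = 1.0, 1000
t = np.arange(0, T, 1/fs); dt = 1/fs

# Made-up set of energy signals s_1(t), s_2(t), s_3(t)
signals = [np.ones_like(t),
           np.where(t < 0.5, 1.0, -1.0),
           np.sin(2*np.pi*t)]

basis = []
for s in signals:
    g = s.copy()
    for phi in basis:                        # subtract projections onto earlier basis functions
        g = g - (np.sum(s*phi)*dt) * phi
    energy = np.sum(g**2)*dt
    if energy > 1e-12:                       # keep only linearly independent directions
        basis.append(g / np.sqrt(energy))    # normalize to unit energy, cf. Eq. (5.23)

print(len(basis))                            # N <= M orthonormal basis functions
gram = [[np.sum(a*b)*dt for b in basis] for a in basis]
print(np.round(gram, 3))                     # ~ identity matrix
```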
5.3 Conversion of the AWGN into a Vector Channel
Now, consider a random process X'(t) whose sample function x'(t) is related to the received signal x(t) as follows:

$x'(t) = x(t) - \sum_{j=1}^{N} x_j\,\phi_j(t)$  (5.32)

Using (5.28), (5.29) and (5.30) we get

$x'(t) = x(t) - \sum_{j=1}^{N} (s_{ij} + w_j)\,\phi_j(t) = w(t) - \sum_{j=1}^{N} w_j\,\phi_j(t) = w'(t)$  (5.33)

which means that the sample function x'(t) depends only on the channel noise!
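A small sketch of (5.32) and (5.33), again with an assumed rectangular orthonormal basis: for a fixed noise realization, the remainder x'(t) is the same no matter which signal was transmitted, confirming that it depends only on the noise.

```python
import numpy as np

T, fs, N0 = 1.0, 1000, 0.1
t = np.arange(0, T, 1/fs); dt = 1/fs
phi = [np.where(t < T/2, np.sqrt(2/T), 0.0),    # assumed orthonormal basis
       np.where(t >= T/2, np.sqrt(2/T), 0.0)]

w = np.random.normal(0.0, np.sqrt(N0/(2*dt)), size=t.shape)   # one fixed noise realization

def remainder(s_t):
    x = s_t + w                                               # received waveform
    xj = [np.sum(x*p)*dt for p in phi]                        # correlator outputs x_j
    return x - sum(c*p for c, p in zip(xj, phi))              # x'(t), Eq. (5.32)

r1 = remainder(2.0*phi[0] - 1.0*phi[1])    # two different transmitted signals,
r2 = remainder(-3.0*phi[0] + 0.5*phi[1])   # same noise realization
print(np.allclose(r1, r2))                 # True: x'(t) depends only on w(t), Eq. (5.33)
```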
Statistical Characterization
The received signal, and hence each correlator output of Fig. 5.3b, is random; to describe them we need statistical methods, namely the mean and the variance.
The assumptions are:
X(t) denotes the random process a sample function of which is the received signal x(t).
Xj denotes the random variable whose sample value is the j-th correlator output xj, j = 1, 2, ..., N.
We have assumed an AWGN channel, so the noise is Gaussian; hence X(t) is a Gaussian process and each Xj, being a Gaussian random variable, is fully described by its mean value and variance.
Mean Value
Let Wj denote the random variable, with sample value wj, produced by the j-th correlator in response to the Gaussian noise component w(t). By definition of the AWGN model it has zero mean, so the mean of Xj depends only on sij:

$\mu_{X_j} = E[X_j] = E[s_{ij} + W_j] = s_{ij} + E[W_j] = s_{ij}$  (5.35)
Variance
Starting from the definition, and substituting (5.29) and (5.31),

$\sigma_{X_j}^2 = \mathrm{var}[X_j] = E\big[(X_j - s_{ij})^2\big] = E\big[W_j^2\big]$  (5.36)

where, from (5.31), the noise contribution at the j-th correlator output is

$w_j = \int_0^T w(t)\,\phi_j(t)\,dt$  (5.31)

Therefore

$\sigma_{X_j}^2 = E\left[\int_0^T W(t)\,\phi_j(t)\,dt \int_0^T W(u)\,\phi_j(u)\,du\right] = E\left[\int_0^T\!\!\int_0^T \phi_j(t)\,\phi_j(u)\,W(t)\,W(u)\,dt\,du\right]$  (5.37)

Interchanging expectation and integration,

$\sigma_{X_j}^2 = \int_0^T\!\!\int_0^T \phi_j(t)\,\phi_j(u)\,E[W(t)W(u)]\,dt\,du = \int_0^T\!\!\int_0^T \phi_j(t)\,\phi_j(u)\,R_W(t,u)\,dt\,du$  (5.38)

where R_W(t, u) is the autocorrelation function of the noise process. Because the noise is stationary with constant power spectral density, it can be expressed as

$R_W(t,u) = \frac{N_0}{2}\,\delta(t - u)$  (5.39)

After substitution, and using the sifting property of the delta function together with the unit energy of φj(t),

$\sigma_{X_j}^2 = \frac{N_0}{2}\int_0^T\!\!\int_0^T \phi_j(t)\,\phi_j(u)\,\delta(t - u)\,dt\,du = \frac{N_0}{2}\int_0^T \phi_j^2(t)\,dt = \frac{N_0}{2} \quad \text{for all } j
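A Monte Carlo check of (5.35) and the variance result above, under the same kind of assumed discrete-time setup: averaged over many noise realizations, the correlator output has mean sij and variance N0/2.

```python
import numpy as np

T, fs, N0, trials = 1.0, 1000, 0.2, 20000
t = np.arange(0, T, 1/fs); dt = 1/fs
phi1 = np.where(t < T/2, np.sqrt(2/T), 0.0)     # assumed basis function phi_1(t)
s_i1 = 1.5                                      # assumed coefficient s_i1
s_t = s_i1 * phi1                               # transmitted component along phi_1

rng = np.random.default_rng(0)
X1 = np.empty(trials)
for n in range(trials):
    w = rng.normal(0.0, np.sqrt(N0/(2*dt)), size=t.shape)  # white noise, PSD N0/2
    X1[n] = np.sum((s_t + w)*phi1)*dt                       # correlator output x_1

print(X1.mean(), s_i1)     # sample mean ~ s_i1, Eq. (5.35)
print(X1.var(), N0/2)      # sample variance ~ N0/2
```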
Since the correlator outputs Xj are Gaussian random variables and can be shown to be mutually uncorrelated, they are statistically independent. The conditional probability density function of the observation vector X, given that symbol mi (i.e., signal si(t)) was sent, therefore factors into the product

$f_{\mathbf{X}}(\mathbf{x} \mid m_i) = \prod_{j=1}^{N} f_{X_j}(x_j \mid m_i), \qquad i = 1, 2, \ldots, M$  (5.44)
The vector x is called the observation vector and the scalar xj is called an observable element; x and xj are the sample values of the random vector X and of the random variable Xj, respectively.
Each Xj is Gaussian with mean sij and variance N0/2, so substituting into (5.44) gives

$f_{\mathbf{X}}(\mathbf{x} \mid m_i) = (\pi N_0)^{-N/2} \exp\!\left[-\frac{1}{N_0}\sum_{j=1}^{N}(x_j - s_{ij})^2\right], \qquad i = 1, 2, \ldots, M$  (5.46)
Finally, the AWGN channel is equivalent to an N-dimensional vector channel, described by the observation vector

$\mathbf{x} = \mathbf{s}_i + \mathbf{w}, \qquad i = 1, 2, \ldots, M$  (5.48)
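As a closing sketch, the vector channel (5.48) can be simulated directly and the likelihood (5.46) evaluated for each candidate symbol; the constellation and N0 below are made-up. Because (5.46) decreases with the squared Euclidean distance ||x − si||², picking the most likely symbol amounts to picking the closest signal vector, which is the basis of the maximum likelihood decoding of the next section.

```python
import numpy as np

# Made-up constellation of M = 4 signal vectors in an N = 2 dimensional space
S = np.array([[1.0, 1.0], [1.0, -1.0], [-1.0, 1.0], [-1.0, -1.0]])
N0 = 0.5
rng = np.random.default_rng(1)

i = 2                                           # index of the transmitted symbol m_i
w = rng.normal(0.0, np.sqrt(N0/2), size=2)      # noise vector, each component of variance N0/2
x = S[i] + w                                    # observation vector, Eq. (5.48)

# Likelihood of each candidate symbol, Eq. (5.46) with N = 2
like = (np.pi*N0)**(-1.0) * np.exp(-np.sum((x - S)**2, axis=1)/N0)
i_hat = np.argmax(like)                         # most likely symbol = closest signal vector
print(i, i_hat, x)                              # with moderate noise, i_hat usually equals i
```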