ADC Chapter 1 Notes


Advanced Digital Communication Systems
Lecture #2
Course Instructor: Dr Nauman Anwar Baig
1.3 Spectral Density

 The spectral density of a signal characterizes the distribution of the signal's energy or power in the frequency domain.
 This concept is particularly important when considering filtering in communication systems, where the signal and noise at the filter output must be evaluated.
 The energy spectral density (ESD) or the power spectral density (PSD) is used in the evaluation.
1. Energy Spectral Density (ESD)

 Energy spectral density describes the signal energy per unit bandwidth, measured in joules/hertz.
 It is represented as ψ_x(f), the squared magnitude spectrum:

  ψ_x(f) = |X(f)|²    (1.14)

 According to Parseval's theorem, the energy of x(t) is:

  E_x = ∫_{-∞}^{∞} x²(t) dt = ∫_{-∞}^{∞} |X(f)|² df    (1.13)

 Therefore:

  E_x = ∫_{-∞}^{∞} ψ_x(f) df    (1.15)

 The energy spectral density is symmetrical in frequency about the origin, so the total energy of the signal x(t) can be expressed as:

  E_x = 2 ∫_{0}^{∞} ψ_x(f) df    (1.16)
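Parseval's theorem (Eq. 1.13) can be checked numerically. The sketch below, using NumPy, compares the time-domain and frequency-domain energy of a rectangular pulse; the pulse width, amplitude, and sample rate are illustrative choices, not values from the text.

```python
import numpy as np

# Numerical check of Parseval's theorem (Eq. 1.13) for a rectangular pulse.
fs = 1000.0                                # sample rate in Hz (assumed)
t = np.arange(-1.0, 1.0, 1/fs)             # time grid
x = np.where(np.abs(t) <= 0.25, 1.0, 0.0)  # unit rectangular pulse, width 0.5 s

# Energy in the time domain: integral of x^2(t) dt
E_time = np.sum(x**2) / fs

# Energy in the frequency domain: integral of |X(f)|^2 df
X = np.fft.fft(x) / fs                     # approximates the continuous-time Fourier transform
df = fs / len(x)                           # frequency resolution
E_freq = np.sum(np.abs(X)**2) * df

print(E_time, E_freq)                      # both ≈ 0.5 J (amplitude² × width)
```

Both sums agree to machine precision, since the discrete Fourier transform satisfies its own exact form of Parseval's relation.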
2. Power Spectral Density (PSD)

 The power spectral density (PSD) G_x(f) of a periodic signal x(t) is a real, even, and nonnegative function of frequency that gives the distribution of the power of x(t) in the frequency domain.
 PSD is represented as:

  G_x(f) = Σ_{n=-∞}^{∞} |C_n|² δ(f − nf₀)    (1.18)

 The average power of a periodic signal x(t) is represented as:

  P_x = (1/T₀) ∫_{-T₀/2}^{T₀/2} x²(t) dt = Σ_{n=-∞}^{∞} |C_n|²    (1.17)

 Using the PSD, the average normalized power of a real-valued signal is represented as:

  P_x = ∫_{-∞}^{∞} G_x(f) df = 2 ∫_{0}^{∞} G_x(f) df    (1.19)
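Equation (1.17) says the time-average power over one period equals the sum of the squared Fourier series coefficients. A minimal sketch, using an assumed 5 Hz cosine of amplitude 2:

```python
import numpy as np

# Power of a periodic signal two ways (Eq. 1.17): time average over one
# period vs. the sum of |Cn|^2 from the Fourier series coefficients.
A, f0 = 2.0, 5.0          # amplitude and fundamental frequency (illustrative)
T0 = 1 / f0
N = 1000
t = np.arange(N) * T0 / N                # exactly one period, uniformly sampled
x = A * np.cos(2 * np.pi * f0 * t)

P_time = np.mean(x**2)                   # (1/T0) ∫ x²(t) dt over one period

Cn = np.fft.fft(x) / N                   # Fourier series coefficients Cn
P_coeff = np.sum(np.abs(Cn)**2)          # Σ |Cn|²

print(P_time, P_coeff)                   # both ≈ A²/2 = 2.0
```

For a single cosine only C₁ and C₋₁ are nonzero (each |C_n| = A/2), so the sum reduces to A²/2, matching the familiar average power of a sinusoid.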
1.4 Autocorrelation
1. Autocorrelation of an Energy Signal

 Correlation is a matching process; autocorrelation refers to the matching of a signal with a delayed version of itself.
 The autocorrelation function of a real-valued energy signal x(t) is defined as:

  R_x(τ) = ∫_{-∞}^{∞} x(t) x(t + τ) dt    for −∞ < τ < ∞    (1.21)

 The autocorrelation function R_x(τ) provides a measure of how closely the signal matches a copy of itself as the copy is shifted τ units in time.
 R_x(τ) is not a function of time; it is only a function of the time difference τ between the waveform and its shifted copy.
 The autocorrelation function of a real-valued energy signal has the following properties:

  R_x(τ) = R_x(−τ)        symmetrical in τ about zero
  R_x(τ) ≤ R_x(0) for all τ    maximum value occurs at the origin
  R_x(τ) ↔ ψ_x(f)        autocorrelation and ESD form a Fourier transform pair, as designated by the double-headed arrow
  R_x(0) = ∫_{-∞}^{∞} x²(t) dt    value at the origin is equal to the energy of the signal
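These properties can be verified numerically. The sketch below computes R_x(τ) for an assumed one-sided decaying exponential pulse and checks symmetry, the maximum at the origin, and R_x(0) = E_x:

```python
import numpy as np

# Autocorrelation of an energy signal (Eq. 1.21) via np.correlate, checking
# the properties above. The decaying exponential pulse is an assumed example.
fs = 1000.0
t = np.arange(0, 5, 1/fs)
x = np.exp(-t)                              # one-sided decaying exponential

Rx = np.correlate(x, x, mode='full') / fs   # Rx(τ) on a grid of lags
lag0 = len(x) - 1                           # index of τ = 0

E = np.sum(x**2) / fs                       # energy ∫ x²(t) dt ≈ 1/2
assert np.isclose(Rx[lag0], E)              # Rx(0) equals the signal energy
assert np.allclose(Rx, Rx[::-1])            # Rx(τ) = Rx(-τ), symmetry
assert np.all(Rx <= Rx[lag0] + 1e-12)       # maximum occurs at the origin
print(Rx[lag0])                             # ≈ 0.5
```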

2. Autocorrelation of a Power Signal

 The autocorrelation function of a real-valued power signal x(t) is defined as:

  R_x(τ) = lim_{T→∞} (1/T) ∫_{-T/2}^{T/2} x(t) x(t + τ) dt    for −∞ < τ < ∞    (1.22)

 When the power signal x(t) is periodic with period T₀, the autocorrelation function can be expressed as:

  R_x(τ) = (1/T₀) ∫_{-T₀/2}^{T₀/2} x(t) x(t + τ) dt    for −∞ < τ < ∞    (1.23)
 The autocorrelation function of a real-valued periodic signal has the following properties, similar to those of an energy signal:

  R_x(τ) = R_x(−τ)        symmetrical in τ about zero
  R_x(τ) ≤ R_x(0) for all τ    maximum value occurs at the origin
  R_x(τ) ↔ G_x(f)        autocorrelation and PSD form a Fourier transform pair
  R_x(0) = (1/T₀) ∫_{-T₀/2}^{T₀/2} x²(t) dt    value at the origin is equal to the average power of the signal
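For a sinusoid x(t) = A cos(2πf₀t), Eq. (1.23) evaluates in closed form to R_x(τ) = (A²/2) cos(2πf₀τ). A minimal numerical sketch (parameters are illustrative), exploiting periodicity via a circular shift:

```python
import numpy as np

# Time-average autocorrelation of a periodic power signal (Eq. 1.23).
A, f0 = 1.0, 2.0
T0 = 1 / f0
N = 2000
t = np.arange(N) * T0 / N                # one full period, uniformly sampled
x = A * np.cos(2 * np.pi * f0 * t)

def Rx(k):
    """Autocorrelation at lag τ = k·(T0/N), using the signal's periodicity."""
    return np.mean(x * np.roll(x, -k))   # circular shift = x(t + τ)

print(Rx(0))          # ≈ A²/2 = 0.5, the average power (property 4)
print(Rx(N // 2))     # τ = T0/2 → x(t + τ) = -x(t), so ≈ -A²/2
```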
1.5 Random Signals
1. Random Variables

 All useful message signals appear random; that is, the receiver does not know, a priori, which of the possible waveforms has been sent.
 Let a random variable X(A) represent the functional relationship between a random event A and a real number.
 The (cumulative) distribution function F_X(x) of the random variable X is given by:

  F_X(x) = P(X ≤ x)    (1.24)

 Another useful function relating to the random variable X is the probability density function (pdf):

  p_X(x) = dF_X(x)/dx    (1.25)
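Equation (1.25) says the pdf is the derivative of the distribution function. As a sketch, numerically differentiating the standard Gaussian CDF (an assumed example distribution) recovers the Gaussian pdf:

```python
from math import erf, exp, pi, sqrt

# Eq. (1.25): pX(x) = dFX(x)/dx, illustrated for a standard Gaussian.
def F(x):                     # standard normal CDF, via the error function
    return 0.5 * (1 + erf(x / sqrt(2)))

def p(x):                     # standard normal pdf, for comparison
    return exp(-x**2 / 2) / sqrt(2 * pi)

h = 1e-5                      # step for the central-difference derivative
for x in (-1.0, 0.0, 2.0):
    deriv = (F(x + h) - F(x - h)) / (2 * h)   # numerical dFX/dx
    print(x, deriv, p(x))                     # derivative matches the pdf
```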
1.1 Ensemble Averages

 The first moment of the probability distribution of a random variable X is called the mean value m_X, or expected value, of X:

  m_X = E{X} = ∫_{-∞}^{∞} x p_X(x) dx

 The second moment of the probability distribution is the mean-square value of X:

  E{X²} = ∫_{-∞}^{∞} x² p_X(x) dx

 Central moments are the moments of the difference between X and m_X; the second central moment is the variance of X:

  var(X) = E{(X − m_X)²} = ∫_{-∞}^{∞} (x − m_X)² p_X(x) dx

 The variance is equal to the difference between the mean-square value and the square of the mean:

  var(X) = E{X²} − (E{X})²
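These ensemble averages can be estimated from samples. The sketch below uses an assumed uniform distribution on (0, 1), whose exact moments are E{X} = 1/2, E{X²} = 1/3, and var(X) = 1/12:

```python
import numpy as np

# Sample estimates of the first moment, second moment, and the identity
# var(X) = E{X²} - (E{X})². Uniform(0, 1) is an illustrative distribution.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 1_000_000)     # samples of X ~ U(0, 1)

m = X.mean()                         # E{X}   ≈ 1/2
ms = np.mean(X**2)                   # E{X²}  ≈ 1/3
var = ms - m**2                      # E{X²} - (E{X})² ≈ 1/12

print(m, ms, var)
assert np.isclose(var, X.var(), rtol=1e-3)   # matches the direct variance
```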
2. Random Processes

 A random process X(A, t) can be viewed as a function of two variables: an event A and time.
1.5.2.1 Statistical Averages of a Random Process

 A random process whose distribution functions are continuous can be described statistically with a probability density function (pdf).
 A partial description consisting of the mean and autocorrelation function is often adequate for the needs of communication systems.
 Mean of the random process X(t):

  E{X(t_k)} = ∫_{-∞}^{∞} x p_{X_k}(x) dx = m_X(t_k)    (1.30)

 Autocorrelation function of the random process X(t):

  R_X(t₁, t₂) = E{X(t₁) X(t₂)}    (1.31)
1.5.2.2 Stationarity

 A random process X(t) is said to be stationary in the strict sense if none of its statistics are affected by a shift in the time origin.
 A random process is said to be wide-sense stationary (WSS) if two of its statistics, its mean and autocorrelation function, do not vary with a shift in the time origin:

  E{X(t)} = m_X = a constant    (1.32)

  R_X(t₁, t₂) = R_X(t₁ − t₂)    (1.33)
1.5.2.3 Autocorrelation of a Wide-Sense Stationary Random Process

 For a wide-sense stationary process, the autocorrelation function is only a function of the time difference τ = t₁ − t₂:

  R_X(τ) = E{X(t) X(t + τ)}    for −∞ < τ < ∞    (1.34)

 Properties of the autocorrelation function of a real-valued wide-sense stationary process are:

  1. R_X(τ) = R_X(−τ)        symmetrical in τ about zero
  2. R_X(τ) ≤ R_X(0) for all τ    maximum value occurs at the origin
  3. R_X(τ) ↔ G_X(f)        autocorrelation and power spectral density form a Fourier transform pair
  4. R_X(0) = E{X²(t)}        value at the origin is equal to the average power of the signal
1.5.3. Time Averaging and Ergodicity

 When a random process belongs to a special class, known as an ergodic process, its time averages equal its ensemble averages.
 The statistical properties of such processes can be determined by time averaging over a single sample function of the process.
 A random process is ergodic in the mean if:

  m_X = lim_{T→∞} (1/T) ∫_{-T/2}^{T/2} X(t) dt    (1.35)

 It is ergodic in the autocorrelation function if:

  R_X(τ) = lim_{T→∞} (1/T) ∫_{-T/2}^{T/2} X(t) X(t + τ) dt    (1.36)
 A time average is like an ordinary average: it is the average value of a single outcome of a stochastic process.
 The ensemble is the set of all possible outcomes of a stochastic process, and the ensemble average is the expected value of the process (analogous to the expected value of a random variable), taken across all outcomes at a fixed time.
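The distinction can be made concrete with a classic ergodic process: the random-phase sinusoid X(t) = cos(2πf₀t + Θ) with Θ uniform on (0, 2π). The sketch below (all parameters assumed for illustration) shows that the time average of one realization and the ensemble average over many phase outcomes both come out near zero:

```python
import numpy as np

# Ergodicity in the mean (Eq. 1.35) for a random-phase sinusoid.
rng = np.random.default_rng(1)
f0, fs, T = 5.0, 1000.0, 100.0
t = np.arange(0, T, 1/fs)

theta = rng.uniform(0, 2*np.pi)              # one outcome of the random phase
x = np.cos(2*np.pi*f0*t + theta)             # one sample function of X(t)
time_avg = x.mean()                          # (1/T) ∫ X(t) dt over 0..T

thetas = rng.uniform(0, 2*np.pi, 100_000)    # many outcomes, fixed time t0
ensemble_avg = np.cos(2*np.pi*f0*0.1 + thetas).mean()

print(time_avg, ensemble_avg)                # both ≈ 0, the mean of X(t)
```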
1.5.4. Power Spectral Density and Autocorrelation

 A random process X(t) can generally be classified as a power signal having a power spectral density (PSD) G_X(f).
 Principal features of PSD functions:

  1. G_X(f) ≥ 0          and is always real-valued
  2. G_X(f) = G_X(−f)      for X(t) real-valued
  3. G_X(f) ↔ R_X(τ)      PSD and autocorrelation form a Fourier transform pair
  4. P_X = ∫_{-∞}^{∞} G_X(f) df  relationship between average normalized power and PSD
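Property 4 can be checked numerically: integrating a PSD estimate over all frequencies recovers the average power E{X²(t)}. The sketch below uses a periodogram of Gaussian noise as an assumed PSD estimate; the noise variance and sample rate are illustrative.

```python
import numpy as np

# Property 4: ∫ GX(f) df equals the average power of the process.
rng = np.random.default_rng(2)
fs, N = 1000.0, 2**16
x = rng.normal(0, 2.0, N)                 # one realization, power σ² = 4 W

G = np.abs(np.fft.fft(x))**2 / (N * fs)   # periodogram estimate of GX(f)
df = fs / N                               # frequency-bin width
P_from_psd = np.sum(G) * df               # ∫ GX(f) df over all bins
P_time = np.mean(x**2)                    # E{X²(t)} estimated by time average

print(P_from_psd, P_time)                 # both ≈ 4
```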
1.5.5. Noise in Communication Systems

 The term noise refers to unwanted electrical signals that are always present in electrical systems, e.g. spark-plug ignition noise, switching transients, and other radiating electromagnetic signals.
 Thermal noise can be described as a zero-mean Gaussian random process.
 A Gaussian process n(t) is a random function whose amplitude at any arbitrary time t is statistically characterized by the Gaussian probability density function:

  p(n) = (1/(σ√(2π))) exp[ −(1/2) (n/σ)² ]    (1.40)
 The normalized or standardized Gaussian density function of a
zero-mean process is obtained by assuming unit variance.
1.5.5.1 White Noise

 The primary spectral characteristic of thermal noise is that its power spectral density is the same for all frequencies of interest in most communication systems.
 The power spectral density G_n(f) is (where N₀ is a constant, the single-sided noise power spectral density):

  G_n(f) = N₀/2  watts/hertz    (1.42)

 The autocorrelation function of white noise is:

  R_n(τ) = ℱ⁻¹{G_n(f)} = (N₀/2) δ(τ)    (1.43)

 The average power P_n of white noise is infinite:

  P_n = ∫_{-∞}^{∞} (N₀/2) df = ∞    (1.44)
 The effect on the detection process of a channel with additive white Gaussian noise (AWGN) is that the noise affects each transmitted symbol independently.
 Such a channel is called a memoryless channel.
 The term "additive" means that the noise is simply superimposed or added to the signal.
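The AWGN channel model described above can be sketched in a few lines: independent zero-mean Gaussian noise of variance N₀/2 is added to each symbol, and the symbols are detected one at a time (memoryless). BPSK signaling and the noise level are assumed for illustration.

```python
import numpy as np

# Sketch of an AWGN channel: Gaussian noise of variance N0/2 per sample is
# superimposed on each transmitted symbol independently (memoryless channel).
rng = np.random.default_rng(3)
N0 = 0.5                                        # noise spectral density (assumed)
bits = rng.integers(0, 2, 10_000)
symbols = 2.0 * bits - 1.0                      # BPSK mapping: 0 -> -1, 1 -> +1

noise = rng.normal(0, np.sqrt(N0 / 2), symbols.shape)
received = symbols + noise                      # "additive": noise added to signal

detected = (received > 0).astype(int)           # symbol-by-symbol threshold detector
ber = np.mean(detected != bits)                 # bit error rate
print(ber)
```

Because the noise samples are independent, each symbol decision depends only on that symbol's received value, which is exactly the memoryless property stated above.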
