
Convolution

The convolution of f(t) and g(t) is written f(t) ∗ g(t), using an asterisk or star. It is
defined as the integral of the product of the two functions after one is reversed
and shifted.
It is a particular kind of integral transform:

$$f(t) * g(t) = \int_{-\infty}^{\infty} f(\tau)\, g(t-\tau)\, d\tau = \int_{-\infty}^{\infty} f(t-\tau)\, g(\tau)\, d\tau$$

For functions f, g supported only on [0, ∞) (i.e., zero for negative arguments), the
integration limits can be truncated, resulting in

$$f(t) * g(t) = \int_{0}^{t} f(\tau)\, g(t-\tau)\, d\tau$$

Convolution is a mathematical operation on two functions (f and g); it
produces a third function, which is typically viewed as a modified version of
one of the original functions. It gives the integral of the pointwise
multiplication of the two functions as a function of the amount by which one of
the original functions is translated.
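As a quick numerical illustration of the definition above, here is a minimal sketch of discrete convolution using NumPy (the sequences `f` and `g` are arbitrary example values, not taken from the text):

```python
import numpy as np

# Two short example sequences (arbitrary illustrative values)
f = np.array([1.0, 2.0, 3.0])
g = np.array([0.5, 1.0, 0.5])

# Discrete convolution: (f * g)[n] = sum_k f[k] g[n - k]
conv = np.convolve(f, g)
print(conv)  # [0.5 2.  4.  4.  1.5]
```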
Properties of Convolution
Commutative Property: $x_1(t) * x_2(t) = x_2(t) * x_1(t)$

Distributive Property: $x_1(t) * [x_2(t) + x_3(t)] = [x_1(t) * x_2(t)] + [x_1(t) * x_3(t)]$

Associative Property: $x_1(t) * [x_2(t) * x_3(t)] = [x_1(t) * x_2(t)] * x_3(t)$

Shift Property: if $x_1(t) * x_2(t) = z(t)$, then $x_1(t) * x_2(t - T) = z(t - T)$

Time Convolution Theorem: $x_1(t) * x_2(t) \leftrightarrow X_1(\omega)\, X_2(\omega)$

Frequency Convolution Theorem: $x_1(t)\, x_2(t) \leftrightarrow \dfrac{1}{2\pi}\, [X_1(\omega) * X_2(\omega)]$
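The time convolution theorem can be checked numerically: the FFT of the convolution of two sequences equals (up to numerical error) the product of their individual FFTs. The sketch below assumes arbitrary example sequences and zero-pads to the full convolution length:

```python
import numpy as np

# Hypothetical example sequences used only for this check
x1 = np.array([1.0, 2.0, 0.5, -1.0])
x2 = np.array([0.3, -0.2, 1.0, 0.7])

N = len(x1) + len(x2) - 1                        # full linear-convolution length
lhs = np.fft.fft(np.convolve(x1, x2), N)         # FFT of the convolution
rhs = np.fft.fft(x1, N) * np.fft.fft(x2, N)      # product of the FFTs

print(np.allclose(lhs, rhs))  # True: convolution in time <-> multiplication in frequency
```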
Cross-correlation
Cross-correlation is a measure of similarity of two series as a function of the
displacement of one relative to the other.

$$R_{12}(\tau) = \int_{-\infty}^{\infty} x_1(t)\, x_2^{*}(t-\tau)\, dt = \int_{-\infty}^{\infty} x_1(t+\tau)\, x_2^{*}(t)\, dt$$

If the two signals are real,

$$R_{12}(\tau) = \int_{-\infty}^{\infty} x_1(t)\, x_2(t-\tau)\, dt = \int_{-\infty}^{\infty} x_1(t+\tau)\, x_2(t)\, dt$$

The cross-correlation is similar in nature to the convolution of two
functions. Whereas convolution involves reversing a signal, then shifting it
and multiplying it by another signal, correlation only involves shifting and
multiplying (no reversal).
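This relationship is easy to verify numerically: correlating x1 with x2 gives the same result as convolving x1 with a time-reversed copy of x2. A minimal sketch with arbitrary example sequences:

```python
import numpy as np

x1 = np.array([1.0, 2.0, 3.0, 4.0])   # example sequences (arbitrary values)
x2 = np.array([0.0, 1.0, 0.5])

# Cross-correlation: shift and multiply (no reversal)
xcorr = np.correlate(x1, x2, mode='full')

# Convolving x1 with the time-reversed x2 gives the same result
conv_reversed = np.convolve(x1, x2[::-1])

print(np.allclose(xcorr, conv_reversed))  # True
```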

Autocorrelation
In an autocorrelation, which is the cross-correlation of a signal with itself,
there will always be a peak at a lag of zero, and its size will be the signal energy.
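A small sketch of this property, using an arbitrary example sequence: the zero-lag sample of the autocorrelation equals the sum of squared samples (the discrete signal energy), and it is also the largest sample.

```python
import numpy as np

x = np.array([2.0, -1.0, 0.5, 3.0])          # arbitrary example signal

autocorr = np.correlate(x, x, mode='full')   # autocorrelation for all lags
zero_lag = autocorr[len(x) - 1]              # middle sample corresponds to zero lag
energy = np.sum(x**2)                        # discrete signal energy

print(zero_lag, energy)                      # both equal 14.25
print(np.argmax(autocorr) == len(x) - 1)     # True: the peak sits at zero lag
```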
Energy and Power Signals

Energy Signals: Consider x(t) as an energy signal if, and only if, it has
nonzero but finite energy (0 < Ex < ∞) for all time, where

$$E_x = \int_{-\infty}^{\infty} |x(t)|^2\, dt$$

• In the real world, we transmit signals having finite energy (0 < Ex < ∞).
• However, periodic signals by definition exist for all time and thus have infinite energy.
• Furthermore, random signals also have infinite energy.
• Hence, it is convenient to define a class of signals called power signals.
Power Signals
A signal is defined as a power signal if, and only if, it has finite but
nonzero power (0 < Px < ∞ ) for all time, where

$$P_x = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} |x(t)|^2\, dt$$

• An energy signal has finite energy but zero average power, whereas a power signal has finite average power but infinite energy.
• As a general rule, periodic signals and random signals are classified as power signals, while signals that are both deterministic and nonperiodic are classified as energy signals.
• The energy and power classes are mutually exclusive, i.e., no signal can be both an energy signal and a power signal. A numerical comparison of the two classes is sketched below.
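The sketch below approximates the two definitions over a finite observation window, assuming a decaying exponential pulse as an example energy signal and a unit-amplitude cosine as an example power signal (both choices and the window length are arbitrary):

```python
import numpy as np

t = np.linspace(-50, 50, 200001)             # long observation window
dt = t[1] - t[0]

pulse = np.exp(-np.abs(t))                   # decaying pulse: an energy signal
sine = np.cos(2 * np.pi * t)                 # periodic signal: a power signal

for name, x in [("pulse", pulse), ("sine", sine)]:
    energy = np.sum(np.abs(x)**2) * dt       # approx. integral of |x(t)|^2 dt
    power = energy / (t[-1] - t[0])          # (1/2T) * integral over [-T, T]
    print(f"{name}: energy = {energy:.2f}, average power = {power:.4f}")

# pulse: finite energy (about 1), average power -> 0 as the window grows
# sine:  energy grows with the window, average power stays near 0.5
```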
Periodic and Aperiodic Signals

Periodic Signal
• A signal with a defined pattern is considered periodic when it repeats itself at regular intervals of time.
• A signal x(t) is periodic if x(t + T) = x(t), where t is time and T is the period. The smallest such value of T is called the fundamental period, and its reciprocal 1/T is the fundamental frequency.

Aperiodic Signal or Non-periodic Signal

• A signal is considered non-periodic or aperiodic when it does not repeat its pattern over any interval of time.
Noise in Communication Systems
• Noise is an unwanted signal that affects the wanted signal.
• Noise is a random signal that exists in communication systems.
• Noise is classified as internal or external.
Internal:
• It is due to the random movement of electrons in electronic circuits.
• Major sources are resistors, diodes, transistors, etc.
• Thermal noise (Johnson noise) and shot noise are examples.
External:
• Man-made and natural sources
• Sources over which we have no control
• Examples are motors, generators, and atmospheric sources.

The noise level in a system is proportional to:
• Temperature and bandwidth
• Amount of current
• Gain of the circuit
• Resistance of the circuit

Effect of noise
• Degrades system performance (analog and digital)
• The receiver cannot distinguish the signal from the noise
• The efficiency of the communication system is reduced

Types of noise
• Thermal noise / white noise
• Shot noise
• Noise temperature
• Quantization noise

Thermal Noise:
• This noise is generated by the thermal motion (Brownian motion) of electrons inside resistors.
• This noise is zero at absolute zero (0 K) and increases as the temperature rises.
• Thermal noise is also referred to as "white noise" since it has a uniform spectral density across the EM spectrum.
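The text states that the noise level is proportional to temperature, bandwidth and resistance. As a rough numerical illustration only, the sketch below assumes the standard Johnson-noise relations P = kTB and v_rms = √(4kTRB), which are not given explicitly in the text; the temperature, bandwidth and resistance values are arbitrary examples:

```python
# Assumed relations (not from the text): P = k*T*B, v_rms = sqrt(4*k*T*R*B)
k = 1.380649e-23      # Boltzmann constant, J/K
T = 290.0             # temperature, K (room temperature)
B = 1e6               # bandwidth, Hz
R = 50.0              # resistance, ohms

p_noise = k * T * B                     # available thermal noise power, W
v_rms = (4 * k * T * R * B) ** 0.5      # open-circuit rms noise voltage, V

print(f"noise power  = {p_noise:.3e} W")      # about 4.0e-15 W
print(f"rms voltage  = {v_rms * 1e6:.2f} uV") # about 0.90 uV
```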
Shot Noise
• It is electronic noise that occurs when there is a finite number of particles that carry energy, such as electrons or photons.
• It has a uniform spectral density like thermal noise.
Additive White Gaussian Noise
Additive white Gaussian noise (AWGN) is a basic noise model used
in communication systems to mimic the effect of the many random processes that
occur in nature.
Additive, because it is added to any noise that might be intrinsic to the
information system.
White noise
• Noise typically has very low correlation across time, i.e.,

$$R_{NN}(\tau) = E[N(t)\, N(t+\tau)] = \frac{N_0}{2}\,\delta(\tau) = 0 \quad \text{for } \tau \ne 0$$

• Noise samples at any two distinct time instants are uncorrelated.
• "White" refers to the idea that the noise has a uniform power spectral density across all frequencies of interest to the communication system.
• It is an analogy to the color white, which has uniform emission at all frequencies in the visible spectrum.
Gaussian Noise
• A noise is called Gaussian noise if it follows a Gaussian random process.
• N(t) is a Gaussian random process if its statistics of all orders are jointly Gaussian.
• This means that if the joint distribution of the noise samples N(t1), N(t2), . . . , N(tk) at times t1, t2, . . . , tk is jointly Gaussian for every choice of instants, then N(t) is a Gaussian random process.

A noise that is simultaneously additive, white, and Gaussian is called
additive white Gaussian noise, and a channel corrupted by such noise is known
as an AWGN channel.
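A minimal sketch of simulating an AWGN channel: Gaussian noise samples of a chosen variance are simply added to a clean example signal (the signal, sample rate and noise power below are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 1000                                    # sample rate, Hz (example value)
t = np.arange(0, 1, 1 / fs)
clean = np.sin(2 * np.pi * 5 * t)            # clean example signal

noise_power = 0.1                            # chosen noise variance (sigma^2)
noise = rng.normal(0.0, np.sqrt(noise_power), size=t.shape)  # white Gaussian samples

received = clean + noise                     # "additive": noise is simply added

print(np.var(noise))                         # approximately 0.1
```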
Power Spectral Density (PSD)

A signal is defined as a power signal if, and only if, it has finite but nonzero
power (0 < Px < ∞ ) for all time, where

$$P_x = \lim_{T \to \infty} \frac{1}{2T} \int_{-T}^{T} |x(t)|^2\, dt$$

• The distribution of the average power of the signal in the frequency domain is called the spectral density or power spectral density (PSD).
• The area under the PSD function is equal to the average power of the signal.
Properties of the Power Spectral Density
1. The power spectral density of a power signal g(t) is a non-negative real-valued function of frequency:

$$S_g(f) \ge 0, \quad \forall f$$

2. The power spectral density of a real-valued power signal g(t) is an even function of frequency:

$$S_g(-f) = S_g(f), \quad \forall f$$

3. The total area under the power spectral density curve of a power signal g(t) equals the average power of the signal:

$$P = \int_{-\infty}^{\infty} S_g(f)\, df$$

4. When a power signal is transmitted through a linear time-invariant system, the power spectral density of the output equals the power spectral density of the input multiplied by the squared amplitude response of the system:

$$S_y(f) = |H(f)|^2\, S_x(f)$$
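Property 3 can be checked numerically: estimating the PSD of an example signal and integrating it over frequency approximately recovers the signal's average power. The sketch below uses SciPy's Welch estimator on an assumed test signal (a cosine plus white noise); the agreement is approximate:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
fs = 1000                                           # sample rate, Hz
t = np.arange(0, 10, 1 / fs)
x = np.sqrt(2) * np.cos(2 * np.pi * 50 * t) + rng.normal(0, 0.5, t.size)

f, Sxx = signal.welch(x, fs=fs, nperseg=1024)       # one-sided PSD estimate

area = np.sum(Sxx) * (f[1] - f[0])                  # area under the PSD
avg_power = np.mean(x**2)                           # time-average power

print(area, avg_power)                              # both approximately 1.25
```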
Properties of the Autocorrelation function of Power signals

1. The autocorrelation function of a real-valued power signal g(t) is a real-valued even function of the lag τ:

$$R_g(-\tau) = R_g(\tau)$$

2. The value of the autocorrelation function of a power signal g(t) at the origin is equal to the average power of the signal:

$$R_g(0) = P$$
3. The maximum value of the autocorrelation function of a power signal g(t) occurs at the origin:

$$|R_g(\tau)| \le R_g(0)$$

4. For a power signal g(t), the autocorrelation function and the power spectral density form a Fourier transform pair:

$$R_g(\tau) \leftrightarrow S_g(f)$$
Energy Spectral Density (ESD)

Energy Signals: Consider x(t) as an energy signal if, and only if, it has
nonzero but finite energy (0 < Ex < ∞) for all time, where

$$E_x = \int_{-\infty}^{\infty} |x(t)|^2\, dt$$

A signal with finite energy (0 < Ex < ∞) and zero average power is called an energy signal.

• Energy Spectral Density (ESD): The distribution of the energy of the signal in the frequency domain is called the energy spectral density.
• The area under the ESD function is equal to the total energy of the signal (a numerical check follows below).
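As a rough check of the statement that the area under the ESD equals the signal energy, the sketch below computes the ESD of an assumed decaying example pulse from its FFT and compares the two sides:

```python
import numpy as np

fs = 1000                                       # sample rate, Hz
t = np.arange(0, 2, 1 / fs)
g = np.exp(-5 * t) * np.sin(2 * np.pi * 20 * t) # decaying pulse: an energy signal

G = np.fft.fft(g) / fs                          # approximate continuous-time spectrum
f = np.fft.fftfreq(len(g), 1 / fs)
esd = np.abs(G)**2                              # energy spectral density |G(f)|^2

energy_time = np.sum(np.abs(g)**2) / fs         # integral of |g(t)|^2 dt
energy_freq = np.sum(esd) * (f[1] - f[0])       # integral of the ESD over frequency

print(energy_time, energy_freq)                 # both approximately 0.05
```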
Properties of the Energy Spectral Density
1. $\Psi_g(f)$, the energy spectral density of an energy signal g(t), is a non-negative real-valued function of frequency:

$$\Psi_g(f) \ge 0, \quad \forall f$$

2. The energy spectral density of a real-valued energy signal g(t) is an even function of frequency:

$$\Psi_g(-f) = \Psi_g(f), \quad \forall f$$

since $|G(-f)| = |G(f)|, \ \forall f$ for real-valued g(t).
3. The total area under the energy spectral density curve of a real-valued energy signal g(t) equals the signal energy:

$$E = \int_{-\infty}^{\infty} \Psi_g(f)\, df$$

4. When an energy signal is transmitted through a linear time-invariant system, the energy spectral density of the output equals the energy spectral density of the input multiplied by the squared amplitude response of the system:

$$\Psi_y(f) = |H(f)|^2\, \Psi_x(f)$$
Properties of the Autocorrelation function of energy signals

1. The autocorrelation function $R_g(\tau)$ of a real-valued energy signal g(t) is a real-valued even function of the lag τ:

$$R_g(-\tau) = R_g(\tau)$$
2. The value of the autocorrelation function of an energy signal g(t) at the origin is equal to the energy of the signal:

$$R_g(0) = E$$
3. The maximum value of the autocorrelation function of an energy signal g(t) occurs at the origin:

$$|R_g(\tau)| \le R_g(0)$$

4. For an energy signal g(t), the autocorrelation function and the energy spectral density form a Fourier transform pair:

$$R_g(\tau) \leftrightarrow \Psi_g(f)$$
Main Points:
• The energy spectral density measures the distribution of signal energy across frequency.
• The autocorrelation function of an energy signal measures signal self-similarity versus delay; it can be used for synchronization.
• A signal's autocorrelation and ESD are a Fourier transform pair.
• Power signals often do not have Fourier transforms; instead we characterize them using the PSD.
• The impact of filtering and modulation on power signals can be determined from the PSD.
Relation between ESD and Autocorrelation
The autocorrelation function R(τ) and the energy spectral density function Ψ(ω)
form a Fourier transform pair:

$$R(\tau) \leftrightarrow \Psi(\omega)$$

Relation between PSD and Autocorrelation

The autocorrelation function R(τ) and the power spectral density function S(ω)
of a power signal form a Fourier transform pair:

$$R(\tau) \leftrightarrow S(\omega)$$
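This Fourier-transform relationship can be illustrated numerically for a discrete sequence: the DFT of the circular autocorrelation equals the periodogram-style power spectrum. A minimal sketch with an arbitrary random example sequence:

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(size=256)                       # example random sequence

X = np.fft.fft(x)
psd = np.abs(X)**2 / len(x)                    # periodogram-style power spectrum

# Circular autocorrelation computed directly in the time domain
r = np.array([np.sum(x * np.roll(x, -k)) for k in range(len(x))]) / len(x)

print(np.allclose(np.fft.fft(r).real, psd))    # True: R(tau) <-> S(f)
```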

Relation between Convolution and Correlation

If one of the functions is an even function of t, say x1(t) is even, i.e.,
x1(-t) = x1(t), then cross-correlation and convolution are equivalent.
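A minimal numerical check of this statement: with a palindromic (discretely even) example sequence for x1, correlation and convolution produce identical results (the sequences below are arbitrary):

```python
import numpy as np

# x1 is even (symmetric about its centre), x2 is arbitrary
x1 = np.array([1.0, 2.0, 3.0, 2.0, 1.0])     # reversing x1 leaves it unchanged
x2 = np.array([0.5, -1.0, 2.0, 0.0, 1.5])

corr = np.correlate(x2, x1, mode='full')     # cross-correlation of x2 with x1
conv = np.convolve(x2, x1)                   # convolution of x2 with x1

print(np.allclose(corr, conv))               # True: reversing an even x1 changes nothing
```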
