4-3 Gaussian Random Vector
[Figure: sample paths $X_1(t), X_2(t), \ldots, X_N(t)$ of a random process, each sampled at the instants $t_1, t_2, \ldots, t_k$.]
$$\Phi_X(\omega_1, \omega_2, \ldots, \omega_k) = \overline{e^{j(\omega_1 X_1 + \omega_2 X_2 + \cdots + \omega_k X_k)}} \qquad (g1)$$
Using matrix notation, let $\omega = (\omega_1, \ldots, \omega_k)$ and $X = (X_1, \ldots, X_k)$. Then $\omega X^T = \omega_1 X_1 + \omega_2 X_2 + \cdots + \omega_k X_k$, and eq. (g1) is written as
$$\Phi_X(\omega) = \overline{e^{j\omega X^T}} \qquad (g2)$$
The characteristic function of any subset $(X_1, \ldots, X_\ell)$, $\ell < k$, follows from the joint one:
$$\Phi_{X_1, X_2, \ldots, X_\ell}(\omega_1, \omega_2, \ldots, \omega_\ell) = \int_{-\infty}^{\infty}\!\!\cdots\!\int_{-\infty}^{\infty} e^{j\omega_1 x_1} e^{j\omega_2 x_2} \cdots e^{j\omega_\ell x_\ell}\, f_{X_1, X_2, \ldots, X_\ell}(x_1, x_2, \ldots, x_\ell)\, dx_1 dx_2 \cdots dx_\ell$$
$$= \int_{-\infty}^{\infty}\!\!\cdots\!\int_{-\infty}^{\infty} e^{j\omega_1 x_1} e^{j\omega_2 x_2} \cdots e^{j\omega_\ell x_\ell}\, f_{X_1, \ldots, X_\ell, X_{\ell+1}, \ldots, X_k}(x_1, \ldots, x_\ell, x_{\ell+1}, \ldots, x_k)\, dx_1 dx_2 \cdots dx_\ell\, dx_{\ell+1} \cdots dx_k$$
that is, it equals the joint characteristic function with $\omega_{\ell+1} = \cdots = \omega_k = 0$.
Example
$$\Phi_{X_1, X_3}(\omega_1, \omega_3) = \Phi_{X_1, X_2, X_3}(\omega_1, \omega_2 = 0, \omega_3)$$
$$\Phi_{X_1}(\omega_1) = \Phi_{X_1, X_2, X_3}(\omega_1, \omega_2 = 0, \omega_3 = 0)$$
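A quick numerical illustration of this rule (a minimal sketch assuming NumPy is available; the covariance values and the helper name `emp_cf` are illustrative choices, not from the notes): estimate $\Phi$ by a sample average and observe that setting $\omega_2 = 0$ reproduces the marginal characteristic function.

```python
import numpy as np

rng = np.random.default_rng(0)

# Jointly Gaussian (X1, X2, X3) with an arbitrary covariance matrix.
cov = np.array([[1.0, 0.5, 0.2],
                [0.5, 2.0, 0.3],
                [0.2, 0.3, 1.5]])
X = rng.multivariate_normal(np.zeros(3), cov, size=200_000)

def emp_cf(samples, omega):
    """Monte Carlo estimate of Phi(omega) = mean of exp(j * omega . x)."""
    return np.mean(np.exp(1j * samples @ omega))

# Phi_{X1,X3}(w1, w3) should equal Phi_{X1,X2,X3}(w1, 0, w3).
w1, w3 = 0.7, -0.4
lhs = emp_cf(X[:, [0, 2]], np.array([w1, w3]))   # marginal estimate
rhs = emp_cf(X, np.array([w1, 0.0, w3]))         # omega2 set to zero
print(lhs, rhs)  # identical here, since omega2 = 0 removes X2's contribution
```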
Define
$$\lambda_{ij} = \operatorname{cov}(X_i, X_j) = \overline{X_i X_j} - \overline{X_i}\,\overline{X_j}$$
and
$$\Lambda_X = \begin{pmatrix} \lambda_{11} & \lambda_{12} & \cdots & \lambda_{1k} \\ \lambda_{21} & \lambda_{22} & \cdots & \lambda_{2k} \\ \vdots & \vdots & \ddots & \vdots \\ \lambda_{k1} & \lambda_{k2} & \cdots & \lambda_{kk} \end{pmatrix}$$
$\Lambda_X$ is referred to as the covariance matrix of the random vector $X$.
property
The diagonal elements of the covariance matrix are
$$\lambda_{jj} = \operatorname{cov}(X_j, X_j) = \sigma_j^2, \qquad j = 1, 2, \ldots, k$$
that is, the variance of $X_j$.
property
For $i \neq j$,
$$\lambda_{ij} = \operatorname{cov}(X_i, X_j) = \overline{X_i X_j} - \overline{X_i}\,\overline{X_j} = \lambda_{ji}$$
that is, $\Lambda_X$ is symmetric.
property
The correlation coefficient is
$$\rho_{ij} = \frac{\operatorname{cov}(X_i, X_j)}{\sigma_i \sigma_j} = \frac{\lambda_{ij}}{\sigma_i \sigma_j}$$
Thus
$$\lambda_{ij} = \rho_{ij}\,\sigma_i \sigma_j$$
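The three properties above are easy to see numerically. A minimal sketch, assuming NumPy; the particular $\sigma$ and $\rho$ values are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Build Lambda_X from standard deviations and correlation coefficients,
# lambda_ij = rho_ij * sigma_i * sigma_j, then check the properties on a
# sample covariance estimate.
sigma = np.array([1.0, 2.0, 0.5])
rho = np.array([[1.0, 0.3, -0.2],
                [0.3, 1.0, 0.6],
                [-0.2, 0.6, 1.0]])
Lam = rho * np.outer(sigma, sigma)

X = rng.multivariate_normal(np.zeros(3), Lam, size=500_000)
Lam_hat = np.cov(X, rowvar=False)

print(np.diag(Lam_hat))                 # approx sigma_j^2 = [1.0, 4.0, 0.25]
print(np.allclose(Lam_hat, Lam_hat.T))  # True: lambda_ij = lambda_ji
print(np.corrcoef(X, rowvar=False))     # approx rho_ij
```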
property
When $X$ is a zero-mean random vector, that is, $\overline{X_i} = 0$ for every $i = 1, 2, \ldots, k$,
$$\lambda_{ij} = \overline{X_i X_j}$$
In that case,
$$\Lambda_X = \overline{X^T X}$$
Note that
$$X^T X = \begin{pmatrix} X_1 \\ X_2 \\ \vdots \\ X_k \end{pmatrix} (X_1 \ X_2 \ \cdots \ X_k) = \begin{pmatrix} X_1 X_1 & X_1 X_2 & \cdots & X_1 X_k \\ X_2 X_1 & X_2 X_2 & \cdots & X_2 X_k \\ \vdots & \vdots & \ddots & \vdots \\ X_k X_1 & X_k X_2 & \cdots & X_k X_k \end{pmatrix}$$
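Numerically, for zero-mean draws the average of the outer products estimates $\Lambda_X$ directly (a sketch assuming NumPy; the covariance values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
Lam = np.array([[2.0, 0.6],
                [0.6, 1.0]])

# Zero-mean samples: one row per draw of the random vector X.
X = rng.multivariate_normal(np.zeros(2), Lam, size=500_000)

# Averaging the outer products x^T x over the draws estimates
# Lambda_X = mean of X^T X (valid because X is zero mean).
print((X.T @ X) / len(X))   # approx [[2.0, 0.6], [0.6, 1.0]]
```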
[Figure: sample paths $X_1(t), X_2(t), \ldots, X_N(t)$, each sampled at the instants $t_1, t_2, \ldots, t_k$.]

Let
$$Z(t) = \frac{1}{\sqrt{N}} \sum_{i=1}^{N} X_i(t)$$
where $\{X_i(t)\}$ are independent, identically distributed zero-mean processes.
As N → ∞, Z (t ) becomes a Gaussian random process.
Proof. $\{Z(t_1), Z(t_2), \ldots, Z(t_k)\}$ are jointly Gaussian for any $k$ and for any sampling instants.
First, the covariance matrix of $Z = (Z(t_1), \ldots, Z(t_k))$: writing $X_i = (X_i(t_1), \ldots, X_i(t_k))$ and noting $Z$ is zero mean,
$$\Lambda_Z = \overline{Z^T Z} = \frac{1}{N}\,\overline{\left(\sum_{i=1}^{N} X_i^T\right)\left(\sum_{i=1}^{N} X_i\right)}$$
which is $\frac{1}{N}\,\overline{(X_1^T + X_2^T + \cdots + X_N^T)(X_1 + X_2 + \cdots + X_N)}$
$$= \frac{1}{N}\left(\sum_{i=1}^{N} \overline{X_i^T X_i} + \sum_{i=1}^{N}\sum_{\substack{j=1 \\ j \neq i}}^{N} \overline{X_i^T X_j}\right)$$
Noting $\overline{X_i^T X_j} = \overline{X_i^T}\,\overline{X_j} = 0$ for $i \neq j$ (by independence and zero mean),
$$\Lambda_Z = \frac{1}{N} \sum_{i=1}^{N} \overline{X_i^T X_i} = \frac{1}{N} \sum_{i=1}^{N} \Lambda_{X_i} = \Lambda_X \qquad (g3)$$
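A Monte Carlo sanity check of eq. (g3) (a sketch assuming NumPy; Gaussian draws are used only as a convenient stand-in, since the identity holds for any i.i.d. zero-mean $X_i$):

```python
import numpy as np

rng = np.random.default_rng(3)
Lam = np.array([[1.0, 0.4],
                [0.4, 1.0]])
N, trials = 50, 50_000

# Z = (X_1 + ... + X_N) / sqrt(N), the X_i i.i.d. zero mean with covariance Lam.
Xi = rng.multivariate_normal(np.zeros(2), Lam, size=(trials, N))
Z = Xi.sum(axis=1) / np.sqrt(N)

print(np.cov(Z, rowvar=False))   # approx Lam: Lambda_Z = Lambda_X
```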
Proof (continued).
$$\Phi_Z(\omega) = \overline{e^{j\omega Z^T}} = \overline{\exp\left(j\frac{\omega}{\sqrt{N}} \sum_{i=1}^{N} X_i^T\right)} = \overline{\prod_{i=1}^{N} e^{j\frac{\omega}{\sqrt{N}} X_i^T}} = \prod_{i=1}^{N} \overline{e^{j\frac{\omega}{\sqrt{N}} X_i^T}}$$
where the last step uses independence of the $\{X_i\}$. Thus
$$\Phi_Z(\omega) = \prod_{i=1}^{N} \Phi_{X_i}\!\left(\frac{\omega}{\sqrt{N}}\right) = \left[\Phi_X\!\left(\frac{\omega}{\sqrt{N}}\right)\right]^N \qquad (g4)$$
noting $\{X_i\}$ are identically distributed as $X$.
$$\Phi_X(\omega) = \overline{e^W} = \overline{1 + W + \frac{W^2}{2!} + \frac{W^3}{3!} + \cdots} \qquad (m1)$$
where
$$W = j\omega X^T = j(\omega_1 X_1 + \omega_2 X_2 + \cdots + \omega_k X_k)$$
Now assume $X$ is a zero-mean random vector, so that $\overline{X_j} = 0$ for all $j$. The 2nd term of eq. (m1) is then
$$\overline{W} = j\left(\omega_1 \overline{X_1} + \omega_2 \overline{X_2} + \cdots + \omega_k \overline{X_k}\right) = 0 \qquad (m2)$$
The 3rd term of eq. (m1) involves
$$\overline{W^2} = j^2\,\overline{(\omega_1 X_1 + \omega_2 X_2 + \cdots + \omega_k X_k)^2} = -\overline{(\omega_1 X_1 + \cdots + \omega_k X_k)(\omega_1 X_1 + \cdots + \omega_k X_k)}$$
$$= -\sum_{i=1}^{k} \sum_{j=1}^{k} \omega_i\,\overline{X_i X_j}\,\omega_j = -\omega\Lambda_X\omega^T \qquad (m3)$$
using $\lambda_{ij} = \overline{X_i X_j}$ for zero-mean $X$.
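Eqs. (m2) and (m3) can be checked by simulation (a sketch assuming NumPy; the values of $\Lambda_X$ and $\omega$ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
Lam = np.array([[1.5, 0.3],
                [0.3, 0.8]])
X = rng.multivariate_normal(np.zeros(2), Lam, size=500_000)
omega = np.array([0.9, -0.5])

W = 1j * (X @ omega)        # W = j(omega_1 X_1 + omega_2 X_2) per sample
print(np.mean(W))           # approx 0                    (eq. m2)
print(np.mean(W**2))        # approx -omega Lam omega^T   (eq. m3)
print(-omega @ Lam @ omega)
```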
Proof (continued).
From eqs. (m1), (m2), and (m3),
$$\Phi_X\!\left(\frac{\omega}{\sqrt{N}}\right) = 1 + \frac{\overline{W}}{\sqrt{N}} + \frac{\overline{W^2}}{2!\,N} + \frac{\overline{W^3}}{3!\,N^{3/2}} + \cdots = 1 - \frac{1}{2N}\omega\Lambda_X\omega^T + \frac{1}{N^{3/2}} f_3$$
where $f_3$ collects the higher-order moment terms (bounded in $N$). Hence
$$\ln \Phi_X\!\left(\frac{\omega}{\sqrt{N}}\right) = \ln\left(1 - \frac{1}{2N}\omega\Lambda_X\omega^T + \frac{1}{N^{3/2}} f_3\right)$$
Recalling $\ln(1+u) = u - \dfrac{u^2}{2} + \dfrac{u^3}{3} - \cdots$ for $|u| < 1$,
$$\ln \Phi_X\!\left(\frac{\omega}{\sqrt{N}}\right) = -\frac{1}{2N}\omega\Lambda_X\omega^T + \frac{1}{N^{3/2}} f_3 + \text{other terms}$$
From eq. (g4),
$$\ln \Phi_Z(\omega) = N \ln \Phi_X\!\left(\frac{\omega}{\sqrt{N}}\right) = -\frac{1}{2}\omega\Lambda_X\omega^T + \frac{1}{\sqrt{N}} f_3 + \text{other terms}$$
Finally,
$$\lim_{N\to\infty} \ln \Phi_Z(\omega) = -\frac{1}{2}\omega\Lambda_X\omega^T$$
that is,
$$\Phi_Z(\omega) = \exp\left(-\frac{1}{2}\omega\Lambda_Z\omega^T\right) \qquad (m4)$$
and from eq. (g3), $\Lambda_Z = \Lambda_X$. Eq. (m4) is the characteristic function of a zero-mean Gaussian random vector, so $Z$ is Gaussian.
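The limit can also be watched numerically: take deliberately non-Gaussian $X_i$ and compare the empirical characteristic function of $Z$ with $\exp(-\frac{1}{2}\omega\Lambda_X\omega^T)$. A minimal sketch assuming NumPy; the mixing matrix `B` is just an illustrative way to obtain correlated, $\pm 1$-based vectors:

```python
import numpy as np

rng = np.random.default_rng(5)

# Each X_i = U_i B^T with U_i a row of i.i.d. +/-1 signs: zero mean,
# covariance Lam = B B^T, clearly non-Gaussian.
B = np.array([[1.0, 0.0],
              [0.5, 0.8]])
Lam = B @ B.T
N, trials = 200, 20_000

U = rng.choice([-1.0, 1.0], size=(trials, N, 2))
Z = (U @ B.T).sum(axis=1) / np.sqrt(N)

omega = np.array([0.6, -0.3])
phi_emp = np.mean(np.exp(1j * Z @ omega))
phi_limit = np.exp(-0.5 * omega @ Lam @ omega)
print(phi_emp, phi_limit)  # close: Phi_Z(omega) -> exp(-omega Lam omega^T / 2)
```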
For a Gaussian random vector $X$ with mean $m_X$ and covariance $\Lambda_X$:
Proof.
Define $Y = X - m_X$. Then $Y$ is a zero-mean Gaussian random vector, and it is easy to see $\Lambda_Y = \Lambda_X$. From eq. (m4),
$$\Phi_Y(\omega) = \exp\left(-\frac{1}{2}\omega\Lambda_Y\omega^T\right)$$
Thus
$$\Phi_X(\omega) = \overline{\exp(j\omega X^T)} = \overline{\exp\left(j\omega(Y + m_X)^T\right)} = \overline{\exp(j\omega Y^T)}\,\exp(j\omega m_X^T) = \Phi_Y(\omega)\,\exp(j\omega m_X^T)$$
$$= \exp\left(-\frac{1}{2}\omega\Lambda_Y\omega^T + j\omega m_X^T\right) = \exp\left(-\frac{1}{2}\omega\Lambda_X\omega^T + j\omega m_X^T\right)$$
noting $\Lambda_X = \Lambda_Y$.
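A quick empirical check of this closed form (a sketch assuming NumPy; the mean and covariance values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(6)
m = np.array([1.0, -2.0])
Lam = np.array([[1.0, 0.2],
                [0.2, 0.5]])
X = rng.multivariate_normal(m, Lam, size=500_000)

# Empirical characteristic function vs. the formula just derived.
omega = np.array([0.4, 0.7])
phi_emp = np.mean(np.exp(1j * X @ omega))
phi_formula = np.exp(-0.5 * omega @ Lam @ omega + 1j * omega @ m)
print(phi_emp, phi_formula)   # agree up to Monte Carlo error
```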
The pdf of a Gaussian random vector $X$ is
$$f_X(x) = \frac{1}{(2\pi)^{k/2}\,|\Lambda_X|^{1/2}} \exp\left(-\frac{1}{2}(x - m_X)\Lambda_X^{-1}(x - m_X)^T\right)$$
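This formula can be compared against a library implementation (a sketch assuming NumPy and SciPy are available; `scipy.stats.multivariate_normal` is a standard routine, and the numbers are illustrative):

```python
import numpy as np
from scipy.stats import multivariate_normal

m = np.array([1.0, 0.0, -1.0])
Lam = np.array([[1.0, 0.3, 0.1],
                [0.3, 2.0, 0.4],
                [0.1, 0.4, 1.5]])
x = np.array([0.5, -0.2, 0.3])

# Evaluate the formula directly...
k = len(m)
d = x - m
f = np.exp(-0.5 * d @ np.linalg.inv(Lam) @ d) \
    / np.sqrt((2 * np.pi) ** k * np.linalg.det(Lam))

# ...and compare with SciPy's implementation.
print(f, multivariate_normal(mean=m, cov=Lam).pdf(x))   # identical values
```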
Example
For $k = 1$,
$$X = X, \qquad m_X = [\mu], \qquad \Lambda_X = [\sigma^2]$$
$$\Phi_X(\omega) = \exp\left(-\frac{1}{2}\omega\sigma^2\omega + j\omega\mu\right)$$
For $k = 2$, let $X = (X_1 \ X_2)$ be jointly Gaussian with correlation coefficient $\rho$, and
$$m_X = [\mu_1 \ \mu_2], \qquad \Lambda_X = \begin{pmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix}$$
recalling $\rho = \dfrac{\operatorname{cov}(X_1, X_2)}{\sigma_1\sigma_2}$. Then
$$\Phi_X(\omega) = \exp\left(-\frac{1}{2}\omega\Lambda_X\omega^T + j\omega m_X^T\right)$$
With $\omega = [\omega_1 \ \omega_2]$,
$$\omega\Lambda_X\omega^T = [\omega_1 \ \omega_2] \begin{pmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix} \begin{pmatrix} \omega_1 \\ \omega_2 \end{pmatrix} = \omega_1^2\sigma_1^2 + 2\omega_1\omega_2\rho\sigma_1\sigma_2 + \omega_2^2\sigma_2^2$$
$$\omega m_X^T = [\omega_1 \ \omega_2] \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix} = \omega_1\mu_1 + \omega_2\mu_2$$
Recall
$$f_X(x) = \frac{1}{(2\pi)^{k/2}\,|\Lambda_X|^{1/2}} \exp\left(-\frac{1}{2}(x - m_X)\Lambda_X^{-1}(x - m_X)^T\right)$$
For $k = 2$,
$$X = (X_1, X_2), \quad x = (x_1, x_2), \quad m_X = (\mu_1, \mu_2), \quad x - m_X = (x_1 - \mu_1,\ x_2 - \mu_2)$$
$$\Lambda_X = \begin{pmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix}$$
$$(x - m_X)\Lambda_X^{-1}(x - m_X)^T = \frac{\sigma_2^2(x_1 - \mu_1)^2 - 2\rho\sigma_1\sigma_2(x_1 - \mu_1)(x_2 - \mu_2) + \sigma_1^2(x_2 - \mu_2)^2}{\sigma_1^2\sigma_2^2(1 - \rho^2)}$$
$$= \frac{\left(\dfrac{x_1 - \mu_1}{\sigma_1}\right)^2 - 2\rho\left(\dfrac{x_1 - \mu_1}{\sigma_1}\right)\left(\dfrac{x_2 - \mu_2}{\sigma_2}\right) + \left(\dfrac{x_2 - \mu_2}{\sigma_2}\right)^2}{1 - \rho^2}$$
Finally we have, for $k = 2$,
$$f_{X_1, X_2}(x_1, x_2) = \frac{1}{2\pi\sigma_1\sigma_2(1 - \rho^2)^{1/2}} \exp\left(-\frac{1}{2(1 - \rho^2)}\left[\left(\frac{x_1 - \mu_1}{\sigma_1}\right)^2 - 2\rho\left(\frac{x_1 - \mu_1}{\sigma_1}\right)\left(\frac{x_2 - \mu_2}{\sigma_2}\right) + \left(\frac{x_2 - \mu_2}{\sigma_2}\right)^2\right]\right)$$
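The explicit $k = 2$ formula and the general matrix form agree, as a short computation confirms (a sketch assuming NumPy; the parameter values are arbitrary):

```python
import numpy as np

mu1, mu2, s1, s2, rho = 0.5, -1.0, 1.2, 0.7, 0.6
x1, x2 = 1.0, -0.5

# Explicit bivariate formula...
z1, z2 = (x1 - mu1) / s1, (x2 - mu2) / s2
q = (z1**2 - 2 * rho * z1 * z2 + z2**2) / (1 - rho**2)
f_explicit = np.exp(-q / 2) / (2 * np.pi * s1 * s2 * np.sqrt(1 - rho**2))

# ...versus the general matrix form with k = 2.
Lam = np.array([[s1**2, rho * s1 * s2],
                [rho * s1 * s2, s2**2]])
d = np.array([x1 - mu1, x2 - mu2])
f_matrix = np.exp(-0.5 * d @ np.linalg.inv(Lam) @ d) \
           / (2 * np.pi * np.sqrt(np.linalg.det(Lam)))
print(f_explicit, f_matrix)   # equal (up to floating point)
```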
Let
$$Y^T = AX^T + b^T$$
where $\dim X = k$, $A$ is a $k \times k$ matrix, and $b$ is a $k$-dimensional constant vector. Then $Y$ is also a Gaussian random vector with
$$m_Y^T = A m_X^T + b^T, \qquad \Lambda_Y = A\Lambda_X A^T \qquad (w1)$$
Note.
$A$ can be an $h \times k$ matrix with $h < k$. Eq. (w1) still holds true.
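Eq. (w1) is easy to verify by sampling, including the $h < k$ case of the Note (a sketch assuming NumPy; $A$, $b$, and the moment values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
m = np.array([1.0, 2.0, -1.0])
Lam = np.array([[1.0, 0.2, 0.0],
                [0.2, 1.5, 0.3],
                [0.0, 0.3, 0.8]])
A = np.array([[1.0, -1.0, 2.0],
              [0.5, 0.0, 1.0]])   # h x k with h = 2 < k = 3 (see the Note)
b = np.array([3.0, -2.0])

X = rng.multivariate_normal(m, Lam, size=500_000)
Y = X @ A.T + b                   # Y^T = A X^T + b^T, one sample per row

print(Y.mean(axis=0), A @ m + b)  # m_Y^T = A m_X^T + b^T
print(np.cov(Y, rowvar=False))    # approx A Lam A^T
print(A @ Lam @ A.T)
```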
Example
Let $X_1 \sim N(\mu_1, \sigma_1^2)$ and $X_2 \sim N(\mu_2, \sigma_2^2)$. Suppose $X_1$ and $X_2$ are jointly Gaussian with correlation coefficient $\rho$. Define
$$Y_1 = a_1 X_1 + a_2 X_2$$
$$Y_2 = X_1$$
Find the covariance matrix of $(Y_1, Y_2)$.
Solution.
In this example,
$$A = \begin{pmatrix} a_1 & a_2 \\ 1 & 0 \end{pmatrix}, \qquad b = 0$$
$Y_1$ and $Y_2$ are jointly Gaussian with
$$m_Y^T = A m_X^T = \begin{pmatrix} a_1 & a_2 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix} = \begin{pmatrix} a_1\mu_1 + a_2\mu_2 \\ \mu_1 \end{pmatrix}$$
and
$$\Lambda_Y = A\Lambda_X A^T = \begin{pmatrix} a_1 & a_2 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} \sigma_1^2 & \rho\sigma_1\sigma_2 \\ \rho\sigma_1\sigma_2 & \sigma_2^2 \end{pmatrix} \begin{pmatrix} a_1 & 1 \\ a_2 & 0 \end{pmatrix} = \begin{pmatrix} a_1^2\sigma_1^2 + 2\rho a_1 a_2 \sigma_1\sigma_2 + a_2^2\sigma_2^2 & a_1\sigma_1^2 + \rho a_2\sigma_1\sigma_2 \\ a_1\sigma_1^2 + \rho a_2\sigma_1\sigma_2 & \sigma_1^2 \end{pmatrix}$$
If instead we take $Y = Y_1$ alone, $A = (a_1 \ a_2)$ is a $1 \times 2$ matrix (the case $h < k$ in the Note above). The covariance matrix $\Lambda_Y$ is then $1 \times 1$:
$$\Lambda_Y = \mathrm{VAR}(Y_1) = a_1^2\sigma_1^2 + a_2^2\sigma_2^2 + 2\left(\overline{a_1 X_1 \cdot a_2 X_2} - \overline{a_1 X_1} \cdot \overline{a_2 X_2}\right)$$
$$= a_1^2\sigma_1^2 + a_2^2\sigma_2^2 + 2 a_1 a_2\left(\overline{X_1 X_2} - \overline{X_1} \cdot \overline{X_2}\right) = a_1^2\sigma_1^2 + 2 a_1 a_2 \rho\sigma_1\sigma_2 + a_2^2\sigma_2^2$$
which agrees with the $(1,1)$ element of $A\Lambda_X A^T$ above.
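A numerical check of the solution (a sketch assuming NumPy; the chosen $a_1, a_2$ and moment values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(8)
a1, a2 = 2.0, -1.0
mu1, mu2, s1, s2, rho = 1.0, 0.5, 1.5, 0.8, 0.4

Lam_X = np.array([[s1**2, rho * s1 * s2],
                  [rho * s1 * s2, s2**2]])
X = rng.multivariate_normal([mu1, mu2], Lam_X, size=500_000)

# Y1 = a1 X1 + a2 X2, Y2 = X1
Y = np.column_stack([a1 * X[:, 0] + a2 * X[:, 1], X[:, 0]])

A = np.array([[a1, a2],
              [1.0, 0.0]])
print(np.cov(Y, rowvar=False))    # approx A Lam_X A^T
print(A @ Lam_X @ A.T)
print(np.var(Y[:, 0]),            # VAR(Y1) matches the closed form
      a1**2 * s1**2 + 2 * a1 * a2 * rho * s1 * s2 + a2**2 * s2**2)
```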
Example
$$Z(t) = \frac{1}{\sqrt{N}} \sum_{i=1}^{N} X_i(t)$$
where $\{X_i(t)\}$ are independent random telegraph signals.

[Figure: sample paths $X_1(t), X_2(t), \ldots, X_N(t)$ and the sampling instants $t = 0$, $t_1 = 1$, $t_2 = 2$, $t_3 = 3$.]

As $N \to \infty$, $Z(t)$ becomes a Gaussian random process. In particular, $\{Z(t_1), Z(t_2), Z(t_3)\}$ are jointly Gaussian.
Each random telegraph signal has
mean $m_X(t) = \overline{X(t)} = 0$;
variance $\sigma_X^2(t) = \overline{X(t)^2} = 1$;
auto-correlation $R_X(\tau) = e^{-2\alpha|\tau|}$.
We have, for the sampling instants $t_1 = 1$, $t_2 = 2$, $t_3 = 3$,
$$\Lambda_X = \begin{pmatrix} 1 & e^{-2\alpha} & e^{-4\alpha} \\ e^{-2\alpha} & 1 & e^{-2\alpha} \\ e^{-4\alpha} & e^{-2\alpha} & 1 \end{pmatrix}$$
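These entries can be reproduced by simulating the telegraph signals themselves (a sketch assuming NumPy; it models each signal as $X(t) = X(0)\,(-1)^{\text{number of flips in }(0,\,t]}$ with Poisson flips of rate $\alpha$, which gives $R_X(\tau) = e^{-2\alpha|\tau|}$):

```python
import numpy as np

rng = np.random.default_rng(9)
alpha, trials = 0.5, 500_000
t = np.array([1.0, 2.0, 3.0])

# Random telegraph signal: X(0) = +/-1 equiprobable, and the sign flips at
# the arrivals of a Poisson process of rate alpha.
x0 = rng.choice([-1.0, 1.0], size=trials)
X1 = x0 * (-1.0) ** rng.poisson(alpha, trials)   # flips in (0, 1]
X2 = X1 * (-1.0) ** rng.poisson(alpha, trials)   # flips in (1, 2]
X3 = X2 * (-1.0) ** rng.poisson(alpha, trials)   # flips in (2, 3]

Lam_emp = np.cov(np.column_stack([X1, X2, X3]), rowvar=False)
Lam_theory = np.exp(-2 * alpha * np.abs(t[:, None] - t[None, :]))
print(Lam_emp)
print(Lam_theory)   # [[1, e^-2a, e^-4a], [e^-2a, 1, e^-2a], [e^-4a, e^-2a, 1]]
```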
$$\Phi_X(\omega_1, \omega_2, \omega_3) = \overline{e^{j(\omega_1 X_1 + \omega_2 X_2 + \omega_3 X_3)}} \qquad (e1)$$
Using matrix notation, let $\omega = (\omega_1, \omega_2, \omega_3)$ and $X = (X_1, X_2, X_3)$. Then $\omega X^T = \omega_1 X_1 + \omega_2 X_2 + \omega_3 X_3$ and eq. (e1) is written as
$$\Phi_X(\omega) = \overline{e^{j\omega X^T}}$$
The density of $Z = (Z(t_1), Z(t_2), Z(t_3))$, which is zero mean with $\Lambda_Z = \Lambda_X$ above, is
$$f_Z(z) = \frac{1}{(2\pi)^{k/2}\,|\Lambda_Z|^{1/2}} \exp\left(-\frac{1}{2} z \Lambda_Z^{-1} z^T\right) \qquad \text{with } k = 3$$
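Putting the pieces of this example together (a sketch assuming NumPy; the evaluation points are arbitrary):

```python
import numpy as np

alpha = 0.5
t = np.array([1.0, 2.0, 3.0])
Lam_Z = np.exp(-2 * alpha * np.abs(t[:, None] - t[None, :]))   # = Lambda_X

def f_Z(z, Lam):
    """Zero-mean Gaussian density with k = len(z):
    exp(-z Lam^{-1} z^T / 2) / ((2 pi)^{k/2} |Lam|^{1/2})."""
    k = len(z)
    return np.exp(-0.5 * z @ np.linalg.inv(Lam) @ z) \
           / np.sqrt((2 * np.pi) ** k * np.linalg.det(Lam))

print(f_Z(np.zeros(3), Lam_Z))                 # density at the origin
print(f_Z(np.array([0.5, -0.2, 0.1]), Lam_Z))  # density at another point
```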