Module 2 Eigen Values and Eigen Vectors, Diagonalisation


:: Module-II ::

EIGEN VALUES AND EIGEN VECTORS, DIAGONALIZATION
Index:
1. Outcomes
2. Eigen Values and Eigen Vectors
3. Eigen Values and Eigen Vectors corresponding to Non-symmetric
matrices
4. Eigen Values and Eigen Vectors corresponding to Symmetric matrices
5. Diagonalization of a matrix
6. Reduction of Quadratic forms to Canonical form by Linear
transformation method
7. Reduction of Quadratic forms to Canonical form by Orthogonal
transformation method
8. Singular Value Decomposition and its Applications
Outcomes:
 Find eigenvalues and eigenvectors, which are useful in the study of
diagonalization.
 Classify quadratic forms as definite, semi-definite, and indefinite.
Eigen Values and Eigen Vectors
Introduction:
 An eigenvector is a vector that maintains its direction after
undergoing a linear transformation, i.e. a vector that points in a
direction in which the transformation produces no rotation. Therefore,
the eigenvectors of a square matrix are the non-zero vectors that,
after being multiplied by the matrix, remain parallel to the original
vector.

 An eigenvalue is the change in length of the eigenvector relative to
its original length. Therefore, the eigenvalue is the factor by which
the eigenvector is scaled when multiplied by the matrix.
Definition:
Let A be a square matrix. If there exist a scalar λ and a nonzero
vector X such that AX = λX, then λ is called a characteristic value or
eigenvalue of A, and X is called a characteristic vector or
eigenvector of A.

The eigenvalue λ tells whether the vector X is stretched, shrunk,
reversed, or left unchanged when it is multiplied by the matrix A.
Applications in Engineering:

1. Radio System
Frequencies are used in electrical
systems. When we tune our radio,
we change the resonant frequency
until it matches the frequency at
which the station is broadcasting.
Engineers use Eigen values when
they design the radio.
2. Car Design
Car designers analyze eigenvalues in order to damp out the noise so
that occupants have a quiet ride. Eigenvalue analysis is also used in
the design of car stereo systems, so that the sounds are directed
correctly for the listening pleasure of the passengers and driver.
3. Beams and Bridges
Eigenvalues can be used to test for cracks or deformities in a solid.
When a beam is struck, its natural frequency (eigenvalue) can be
heard. If the beam rings, then it is not flawed; a dull sound will
result from a flawed beam, because the flaw causes the eigenvalues to
change. The natural frequency of a bridge is the eigenvalue of
smallest magnitude of a system that models the bridge. Engineers
exploit this knowledge to ensure the stability of their constructions.
How to Find Eigen Values:
Let X be an eigenvector of matrix A. Then there must exist an
eigenvalue λ such that
AX = λX (by definition)
⟹ AX − λX = 0
⟹ (A − λI)X = 0
This is a homogeneous system, and it has a nontrivial solution iff
|A − λI| = 0.
∴ |A − λI| = 0 is called the characteristic equation of A.
The roots of the characteristic equation give the eigenvalues of A.
Computational Approach :
We summarize the computational approach for determining
Eigen Pairs (λ, X) (Eigen values and Eigen vectors) as a two-step
procedure.

Step 1: To find the eigenvalues of matrix A, compute the roots of the
characteristic equation |A − λI| = 0.

Step 2: To find the eigenvectors corresponding to an eigenvalue λ,
compute a non-trivial solution of the homogeneous linear system
(A − λI)X = 0.
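The two-step procedure above can be checked numerically. The sketch below uses Python with numpy (an illustrative assumption; these notes do not prescribe any software): Step 1 takes the roots of the characteristic polynomial, and Step 2 extracts a null-space vector of A − λI.

```python
import numpy as np

# An example 2x2 matrix (chosen for illustration; eigenvalues 5 and 2)
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

# Step 1: roots of the characteristic equation |A - lambda*I| = 0.
# np.poly(A) returns the coefficients of the characteristic polynomial.
eigenvalues = np.roots(np.poly(A))

def eigenvector(A, lam):
    # Step 2: a nontrivial solution of (A - lambda*I)X = 0 is a
    # null-space direction of A - lambda*I; here taken from the SVD.
    _, _, Vt = np.linalg.svd(A - lam * np.eye(A.shape[0]))
    return Vt[-1]

for lam in eigenvalues:
    X = eigenvector(A, lam)
    assert np.allclose(A @ X, lam * X)   # AX = lambda*X holds
```

In practice one would call `np.linalg.eig(A)` directly; the version above mirrors the two steps of the notes.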
Characteristic Equation:
1. The characteristic equation for an n × n matrix is
   |A − λI| = 0
2. For 2 × 2 matrices the characteristic equation is
   λ² − S₁λ + |A| = 0
3. For 3 × 3 matrices the characteristic equation is
   λ³ − S₁λ² + S₂λ − |A| = 0
where S₁ = trace of the given matrix,
      S₂ = sum of the minors of the diagonal elements,
      |A| = determinant of the given matrix.
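The S₁, S₂, |A| shortcut for the 3 × 3 characteristic equation can be verified numerically. Below is a small numpy sketch (numpy being an assumption, not part of these notes), using the matrix from the worked example later in this module:

```python
import numpy as np

A = np.array([[1.0, -1.0,  4.0],
              [3.0,  2.0, -1.0],
              [2.0,  1.0, -1.0]])

S1 = np.trace(A)                  # sum of the main diagonal
# S2: sum of the 2x2 minors of the diagonal elements
minor = lambda i: np.linalg.det(np.delete(np.delete(A, i, 0), i, 1))
S2 = sum(minor(i) for i in range(3))
detA = np.linalg.det(A)

# lambda^3 - S1*lambda^2 + S2*lambda - |A| vanishes at every eigenvalue
for lam in np.linalg.eigvals(A):
    assert abs(lam**3 - S1 * lam**2 + S2 * lam - detA) < 1e-9
```

For this matrix S₁ = 2, S₂ = −5 and |A| = −6, giving the characteristic equation λ³ − 2λ² − 5λ + 6 = 0 used later.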
Note:

1. The word "eigenvalue" derives from the German "Eigenwert": "eigen"
means proper and "Wert" means value. Eigenvalues are therefore also
called proper values, characteristic roots, or latent roots.
2. The eigenvalues of a square matrix A are the roots of the
corresponding characteristic equation a₀λⁿ + a₁λⁿ⁻¹ + ⋯ + aₙ = 0.
3. An n × n matrix has at least one eigenvalue and at most n
numerically different eigenvalues.
Properties of Eigen Values:
1. Eigenvalues and eigenvectors are only defined for square matrices
(i.e. Number of rows ‘m’ = Number of Columns ‘n’)
2. The sum of eigen values of a matrix is equal to the sum of the
principal diagonal elements.
i.e. 𝜆1 + 𝜆2 + ⋯ + 𝜆𝑛 = 𝑎11 + 𝑎22 + ⋯ + 𝑎𝑛𝑛 (Trace of matrix)
3. The eigen values of an upper or lower triangular or diagonal matrix
are the elements on its main diagonal.
4. The product of the eigenvalues of a matrix is equal to the
determinant of the matrix, i.e. λ₁ × λ₂ × ⋯ × λₙ = |A|.
Note: The inverse of a matrix exists iff all its eigenvalues are
non-zero.
5. If λ₁, λ₂, …, λₙ are the eigenvalues of A, then the eigenvalues of
A⁻¹ are 1/λ₁, 1/λ₂, …, 1/λₙ.

6. If λ₁, λ₂, …, λₙ are the eigenvalues of A, then the eigenvalues of
kA are kλ₁, kλ₂, …, kλₙ.
7. If λ₁, λ₂, …, λₙ are the eigenvalues of A, then the eigenvalues of
Aᵐ are λ₁ᵐ, λ₂ᵐ, …, λₙᵐ.
8. If λ₁, λ₂, …, λₙ are the eigenvalues of A, then the eigenvalues of
A − kI are λ₁ − k, λ₂ − k, …, λₙ − k.
9. The eigenvalues of A and Aᵀ are the same.
10. The eigenvalues of a symmetric matrix are real.
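Several of these properties are easy to check numerically. A minimal numpy sketch (numpy is an assumption of this illustration, not part of the notes):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lams = np.linalg.eigvals(A)     # eigenvalues of A: 1 and 3

assert np.isclose(lams.sum(), np.trace(A))         # property 2: sum = trace
assert np.isclose(lams.prod(), np.linalg.det(A))   # property 4: product = det
# property 5: eigenvalues of A^-1 are the reciprocals
assert np.allclose(np.sort(np.linalg.eigvals(np.linalg.inv(A))),
                   np.sort(1.0 / lams))
# property 9: A and A^T share the same eigenvalues
assert np.allclose(np.sort(np.linalg.eigvals(A.T)), np.sort(lams))
```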
Examples:
1. If A = [ 4 0 0
            0 5 0
            0 0 6 ] then the eigenvalues are 4, 5, 6.
2. If A = [ 1 -2 -1
            0  3  2
            0  0  5 ] then the eigenvalues are 1, 3, 5.
3. If A = [ 1 0 0
           -1 2 0
            4 0 3 ] then the eigenvalues are 1, 2, 3.
4. If A = [ 1 0 0
            0 1 0
            0 0 1 ] then the eigenvalues are 1, 1, 1.
Properties of Eigen Vectors:
1. Eigenvectors are not unique.
(i.e. if X is an eigenvector, then kX is also an eigenvector for any
nonzero constant k)
2. Zero vector is a trivial solution to the Eigenvalue equation for any
number λ and is not considered as an eigenvector.
3. If 𝜆1 , 𝜆2 , … , 𝜆𝑛 be distinct eigen values of 𝑛 × 𝑛 matrix then
corresponding eigen vectors 𝑋1 , 𝑋2 , … , 𝑋𝑛 form a linearly
independent set.
i.e. Eigen vectors corresponding to different eigen values are
linearly independent but when two or more eigen values are
equal then their corresponding eigen vectors may or may not be
linearly independent.
4. An 𝑛 × 𝑛 matrix may have ‘n’ linearly independent eigen vectors or it
may have fewer than n.
5. An eigenvector cannot correspond to two distinct eigenvalues.
6. Eigenvectors of a symmetric matrix corresponding to different
eigenvalues are orthogonal.
e.g. If X₁ = [1, 2, 3]ᵀ and X₂ = [3, 0, -1]ᵀ are eigenvectors of a
symmetric matrix corresponding to λ₁ and λ₂ respectively, with
λ₁ ≠ λ₂, then
X₁ᵀ·X₂ = 1(3) + 2(0) + 3(−1) = 0.
7. An eigenvalue may be zero, but an eigenvector cannot be the zero vector.
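Property 6 (orthogonality for symmetric matrices) can be seen numerically. A small numpy sketch (numpy being an illustrative assumption):

```python
import numpy as np

# A symmetric matrix with distinct eigenvalues -1 and 3
A = np.array([[1.0, 2.0],
              [2.0, 1.0]])
lams, V = np.linalg.eigh(A)    # eigh is the routine for symmetric matrices

x1, x2 = V[:, 0], V[:, 1]      # eigenvectors appear as columns of V
dot = x1 @ x2                  # property 6: should be (numerically) zero
assert abs(dot) < 1e-12
```

`eigh` in fact always returns an orthonormal set of eigenvectors for a symmetric matrix, which is exactly what property 6 guarantees for distinct eigenvalues.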
Cramer's Rule:
Consider a₁x + b₁y + c₁z = 0 and
         a₂x + b₂y + c₂z = 0.
Then, by Cramer's rule,
x / (b₁c₂ − b₂c₁) = −y / (a₁c₂ − a₂c₁) = z / (a₁b₂ − a₂b₁) = k (say)
Eigen Values and Eigen Vectors
corresponding to Non-
symmetric matrices
What is a Non-Symmetric Matrix?
A non-symmetric matrix is a matrix for which A ≠ Aᵀ,
i.e. a matrix which is not symmetric.

e.g.
a) [ 1 2; -2 0 ]
b) [ 1 -2 3; 4 3 -4; 5 5 7 ]
c) [ 2 4 -7; -4 0 10; 7 -10 3 ]
d) [ 0 7; -7 0 ]
Eigen Values and Eigen Vectors corresponding
to Non-Symmetric matrices:

There are two cases for eigenvalues and eigenvectors corresponding to
non-symmetric matrices:
a) Non-symmetric matrix with non-repeated eigenvalues
b) Non-symmetric matrix with repeated eigenvalues
Non Symmetric matrix with Non Repeated
Eigen Values :
Ex. Find all the eigenvalues and eigenvectors of the matrix
[ 1 -1  4
  3  2 -1
  2  1 -1 ]
Solution: Given,
A = [ 1 -1  4
      3  2 -1
      2  1 -1 ]
Now we find the characteristic equation of A:
λ³ − S₁λ² + S₂λ − S₃ = 0
where S₁ = sum of the main diagonal (trace of the matrix) = 1 + 2 − 1 = 2
S₂ = sum of the minors of the main diagonal elements = m₁₁ + m₂₂ + m₃₃
   = (−2+1) + (−1−8) + (2+3) = −5
S₃ = det(A) = |A| = 1(−2+1) + 1(−3+2) + 4(3−4) = −6
∴ The characteristic equation is
λ³ − 2λ² − 5λ + 6 = 0
To solve the characteristic equation we use the synthetic division
method. One root of the characteristic equation is 1
(∵ 1³ − 2(1)² − 5(1) + 6 = 0).

1 |  1  -2  -5   6
  |      1  -1  -6
  ----------------
     1  -1  -6   0

Therefore λ = 1 is a root, and the other roots are given by
λ² − λ − 6 = 0
∴ (λ + 2)(λ − 3) = 0 ⇒ λ = −2, 3
∴ The eigenvalues are 1, −2, 3. (Here A is a non-symmetric matrix and
the eigenvalues are non-repeated, i.e. distinct.)
Now we find the eigenvectors:
(A − λI)X = 0
[ 1-λ  -1    4
  3    2-λ  -1
  2    1   -1-λ ] [x₁ x₂ x₃]ᵀ = [0 0 0]ᵀ

(1−λ)x₁ − x₂ + 4x₃ = 0
3x₁ + (2−λ)x₂ − x₃ = 0      ……. (1)
2x₁ + x₂ − (1+λ)x₃ = 0
Case 1: Put λ = 1 in equations (1); we get
−x₂ + 4x₃ = 0            …. (2)
3x₁ + x₂ − x₃ = 0        …. (3)
2x₁ + x₂ − 2x₃ = 0       …. (4)
Solving equations (2), (3), (4) we get
∴ X₁ = [-1, 4, 1]ᵀ
Case 2: Put λ = −2 in equations (1); we get
3x₁ − x₂ + 4x₃ = 0       …. (5)
3x₁ + 4x₂ − x₃ = 0       …. (6)
2x₁ + x₂ + x₃ = 0        …. (7)
Solving equations (5), (6), (7) we get
∴ X₂ = [-1, 1, 1]ᵀ
Case 3: Put λ = 3 in equations (1); we get
−2x₁ − x₂ + 4x₃ = 0      …. (8)
3x₁ − x₂ − x₃ = 0        …. (9)
2x₁ + x₂ − 4x₃ = 0       …. (10)
Solving equations (8), (9), (10) we get
∴ X₃ = [1, 2, 1]ᵀ
∴ The eigenvalues of A are 1, −2 and 3, and the respective
eigenvectors are [-1, 4, 1]ᵀ, [-1, 1, 1]ᵀ and [1, 2, 1]ᵀ.
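The example above can be double-checked numerically with numpy (used here purely as an illustration; the hand computation is the method the notes teach):

```python
import numpy as np

A = np.array([[1.0, -1.0,  4.0],
              [3.0,  2.0, -1.0],
              [2.0,  1.0, -1.0]])

# eigenvalues found above: 1, -2, 3
assert np.allclose(np.sort(np.linalg.eigvals(A).real), [-2.0, 1.0, 3.0])

# each hand-computed eigenvector satisfies AX = lambda*X
pairs = [(1.0,  np.array([-1.0, 4.0, 1.0])),
         (-2.0, np.array([-1.0, 1.0, 1.0])),
         (3.0,  np.array([ 1.0, 2.0, 1.0]))]
for lam, X in pairs:
    assert np.allclose(A @ X, lam * X)
```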
Non Symmetric matrix with Repeated Eigen
Values
Ex. Find all the eigenvalues and eigenvectors of the matrix
[ -2  2 -3
   2  1 -6
  -1 -2  0 ]
Solution: Given,
A = [ -2  2 -3
       2  1 -6
      -1 -2  0 ]
Now we find the characteristic equation of A:
λ³ − S₁λ² + S₂λ − S₃ = 0
where S₁ = sum of the main diagonal (trace of the matrix) = −2 + 1 + 0 = −1
S₂ = sum of the minors of the main diagonal elements = m₁₁ + m₂₂ + m₃₃
   = (0−12) + (0−3) + (−2−4) = −21
S₃ = det(A) = |A| = 45
∴ The characteristic equation is
λ³ + λ² − 21λ − 45 = 0
To solve the characteristic equation we use the synthetic division
method. One root of the characteristic equation is −3
(∵ (−3)³ + (−3)² − 21(−3) − 45 = −27 + 9 + 63 − 45 = 0).

-3 |  1   1  -21  -45
   |     -3    6   45
   ------------------
      1  -2  -15    0

Therefore λ = −3 is a root, and the other roots are given by
λ² − 2λ − 15 = 0
∴ (λ − 5)(λ + 3) = 0 ⇒ λ = 5, −3
∴ The eigenvalues are 5, −3, −3. (Here A is a non-symmetric matrix and
the eigenvalue −3 is repeated.)
Now we find the eigenvectors:
(A − λI)X = 0
[ -2-λ   2   -3
   2    1-λ  -6
  -1   -2    -λ ] [x₁ x₂ x₃]ᵀ = [0 0 0]ᵀ

(−2−λ)x₁ + 2x₂ − 3x₃ = 0
2x₁ + (1−λ)x₂ − 6x₃ = 0      ……. (1)
−x₁ − 2x₂ − λx₃ = 0
Case 1: Put λ = 5 in equations (1); we get
−7x₁ + 2x₂ − 3x₃ = 0     …. (2)
2x₁ − 4x₂ − 6x₃ = 0      …. (3)
−x₁ − 2x₂ − 5x₃ = 0      …. (4)
Solving equations (2), (3), (4) we get
∴ X₁ = [1, 2, -1]ᵀ
Case 2: Put λ = −3 in equations (1); we get
x₁ + 2x₂ − 3x₃ = 0       …. (5)
2x₁ + 4x₂ − 6x₃ = 0      …. (6)
x₁ + 2x₂ − 3x₃ = 0       …. (7)
These three equations reduce to the single equation x₁ + 2x₂ − 3x₃ = 0,
which has two independent solutions:
∴ X₂ = [0, 3, 2]ᵀ and X₃ = [3, 0, 1]ᵀ
∴ The eigenvalues of matrix A are 5, −3 and −3, and the corresponding
eigenvectors are [1, 2, -1]ᵀ, [0, 3, 2]ᵀ and [3, 0, 1]ᵀ.
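Even though the eigenvalue −3 is repeated, this matrix still yields two independent eigenvectors for it, because A + 3I has rank 1. A numpy check (numpy being an illustrative assumption):

```python
import numpy as np

A = np.array([[-2.0,  2.0, -3.0],
              [ 2.0,  1.0, -6.0],
              [-1.0, -2.0,  0.0]])

# eigenvalues 5, -3, -3 (the eigenvalue -3 is repeated)
assert np.allclose(np.sort(np.linalg.eigvals(A).real), [-3.0, -3.0, 5.0])

# rank(A + 3I) = 1, so the eigenvalue -3 has a 2-dimensional eigenspace
assert np.linalg.matrix_rank(A + 3 * np.eye(3)) == 1

for lam, X in [(5.0,  np.array([1.0, 2.0, -1.0])),
               (-3.0, np.array([0.0, 3.0,  2.0])),
               (-3.0, np.array([3.0, 0.0,  1.0]))]:
    assert np.allclose(A @ X, lam * X)
```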
EXERCISE
1) Find all the eigenvalues and eigenvectors of the matrix
   [ 1 0 -1; 1 2 1; 2 2 3 ]     (eigenvalues are 1, 3, 2)
2) Find all the eigenvalues and eigenvectors of the matrix
   [ 3 2 4; 2 0 2; 4 2 3 ]      (eigenvalues are -1, -1, 8)
3) Find all the eigenvalues and eigenvectors of the matrix
   [ 1 -3; 5 4 ]                (eigenvalues are (5 ± i√51)/2)
4) Find all the eigenvalues and eigenvectors of the matrix
   [ 1 1 3; 1 5 1; 3 1 1 ]      (eigenvalues are -2, 3, 6)
Eigen values and Eigen Vectors of
Symmetric Matrices
Revisiting Symmetric Matrices

 A matrix A is said to be symmetric if A = Aᵀ.

For example:
A = [ 5  3  7
      3 26  2
      7  2 10 ] is a symmetric matrix because A = Aᵀ.

 Symmetric matrices have special properties regarding their
eigenvalues and eigenvectors.
Properties of eigen values and eigen vectors of a
symmetric matrix
1. λ is real for a real symmetric matrix A.
Proof:
Let A be a real symmetric matrix, λ an eigenvalue and x a
corresponding eigenvector. Then we have
Ax = λx
Taking the complex conjugate (A is real, so Ā = A):
Ax̄ = λ̄x̄
Taking the transpose:
x̄ᵀAᵀ = λ̄x̄ᵀ
As A is a symmetric matrix, A = Aᵀ:
x̄ᵀA = λ̄x̄ᵀ
⇒ x̄ᵀλx = λ̄x̄ᵀx    (multiplying both sides on the right by x and using Ax = λx)
⇒ λx̄ᵀx = λ̄x̄ᵀx
∴ λ = λ̄    (since x̄ᵀx ≠ 0)
Therefore, λ is real for a real symmetric matrix A.
2. Eigenvectors corresponding to distinct eigenvalues of a symmetric
matrix are orthogonal.
Proof:
Let A be a real symmetric matrix, λ₁ ≠ λ₂ distinct eigenvalues, and
x₁, x₂ their corresponding real eigenvectors. Then
Ax₁ = λ₁x₁ …(1) and Ax₂ = λ₂x₂ …(2)
Transposing (1): x₁ᵀAᵀ = λ₁x₁ᵀ
∴ x₁ᵀAᵀx₂ = λ₁x₁ᵀx₂    (multiplying both sides on the right by x₂)
∴ x₁ᵀAx₂ = λ₁x₁ᵀx₂     (since A = Aᵀ)
∴ x₁ᵀλ₂x₂ = λ₁x₁ᵀx₂    (from (2))
∴ (λ₁ − λ₂)x₁ᵀx₂ = 0
Since λ₁ ≠ λ₂, this gives x₁ᵀx₂ = 0. Therefore, x₁ and x₂ are
orthogonal.

Note: Vectors 𝑥 and 𝑦 are orthogonal when 𝑥 T 𝑦 = 0.


Example: Eigenvalues are non-repeated

Q. Find the eigenvalues and eigenvectors of the following matrix.
[ 1 2
  2 1 ]
Verify that eigenvectors corresponding to distinct eigenvalues of this
matrix are orthogonal.
Ans.
Step 1: Write the characteristic equation |A − λI| = 0 and find λ:
| 1-λ   2
  2    1-λ | = 0
λ² − 2λ − 3 = 0
⇒ λ = −1, 3
Step 2: Find the eigenvectors for each eigenvalue using (A − λI)x = 0.
For λ = −1,
[ 2 2; 2 2 ] [x y]ᵀ = [0 0]ᵀ
We can solve this using the augmented form
[ 2 2 : 0
  2 2 : 0 ]
Doing the row operation R₂ → R₂ − R₁:
[ 2 2 : 0
  0 0 : 0 ]
Solving this system we get the eigenvector [-1, 1]ᵀ.
For λ = 3,
[ -2 2; 2 -2 ] [x y]ᵀ = [0 0]ᵀ
We can solve this using the augmented form
[ -2  2 : 0
   2 -2 : 0 ]
Doing the row operation R₂ → R₂ + R₁:
[ -2  2 : 0
   0  0 : 0 ]
Solving this system we get the eigenvector [1, 1]ᵀ.
Thus, we can verify that the two eigenvectors are orthogonal to each
other, since [-1, 1]ᵀ · [1, 1]ᵀ = −1 + 1 = 0.
Example: Eigenvalues are repeated
Que. Find the eigenvalues and eigenvectors of the following matrix.
[ 1 2 3
  2 4 6
  3 6 9 ]
Verify that eigenvectors corresponding to distinct eigenvalues of this
matrix are orthogonal.
Ans.
Step 1: Write the characteristic equation |A − λI| = 0 and find λ:
| 1-λ   2    3
  2    4-λ   6
  3    6    9-λ | = 0
−λ³ + 14λ² = 0
⇒ λ = 0, 0, 14
Step 2: Find the eigenvectors for each eigenvalue using (A − λI)x = 0.
For λ = 0,
[ 1 2 3; 2 4 6; 3 6 9 ] [x y z]ᵀ = [0 0 0]ᵀ
We can solve this using the augmented form
[ 1 2 3 : 0
  2 4 6 : 0
  3 6 9 : 0 ]
Doing the row operations R₂ → R₂ − 2R₁, R₃ → R₃ − 3R₁:
[ 1 2 3 : 0
  0 0 0 : 0
  0 0 0 : 0 ]
Solving this system we get the eigenvectors [-3, 0, 1]ᵀ and [-2, 1, 0]ᵀ.
For λ = 14,
[ -13   2   3
    2 -10   6
    3   6  -5 ] [x y z]ᵀ = [0 0 0]ᵀ
We can solve this using the augmented form
[ -13   2   3 : 0
    2 -10   6 : 0
    3   6  -5 : 0 ]
Doing the row operation R₁ → −R₁/13:
[ 1  -2/13  -3/13 : 0
  2  -10     6    : 0
  3   6     -5    : 0 ]
Doing the row operations R₂ → R₂ − 2R₁, R₃ → R₃ − 3R₁:
[ 1   -2/13    -3/13  : 0
  0 -126/13    84/13  : 0
  0   84/13   -56/13  : 0 ]
Doing the row operations R₂ → −13R₂/126, R₃ → 13R₃/4:
[ 1  -2/13  -3/13 : 0
  0   1     -2/3  : 0
  0  21    -14    : 0 ]
Doing the row operation R₃ → R₃ − 21R₂:
[ 1  -2/13  -3/13 : 0
  0   1     -2/3  : 0
  0   0      0    : 0 ]
Solving this system we get the eigenvector [1, 2, 3]ᵀ.
Thus, the eigenvectors corresponding to λ = 0 are [-3, 0, 1]ᵀ and
[-2, 1, 0]ᵀ, and the eigenvector corresponding to λ = 14 is [1, 2, 3]ᵀ.
We can verify that
[-3, 0, 1]·[1, 2, 3] = −3 + 0 + 3 = 0 and
[-2, 1, 0]·[1, 2, 3] = −2 + 2 + 0 = 0.
Thus, for this example, eigenvectors corresponding to distinct
eigenvalues are orthogonal.
Exercise Questions:
1. Find the eigenvalues and eigenvectors of the matrix
   [ 8 -6 2; -6 7 -4; 2 -4 3 ]
2. Find the eigenvalues and eigenvectors of the matrix
   [ 2 0 1; 0 2 0; 1 0 2 ]
3. Find the eigenvalues and eigenvectors of the matrix
   [ 6 -2 2; -2 3 -1; 2 -1 3 ]
4. Find the eigenvalues and eigenvectors of the matrix
   [ 2 1 -1; 1 1 -2; -1 -2 1 ]
In each case, verify that eigenvectors corresponding to distinct
eigenvalues are orthogonal.
Answers:
1. 0, 3, 15; (1,2,2), (2,1,-2), (2,-2,1)
2. 1, 2, 3; (1,0,-1), (0,1,0), (1,0,1)
3. 8, 2, 2; (2,-1,1), (1,0,-2), (1,2,0)
4. 4, 1, -1; (1,1,-1), (-2,1,-1), (0,5,5)
Applications of eigen values and eigen vectors
Eigenvalues and eigenvectors have significant applications in various
branches of engineering, such as:
1. Mechanical Engineering:
- In mechanical systems, eigenvalues correspond to the system's natural
frequencies.
- Eigenvalue analysis is used in vibration analysis to avoid resonance, which
can lead to failure.
2. Control Systems:
- In control systems, eigenvalues determine the stability of the system.
- They are used to analyze the system’s dynamic behavior by examining the
eigenvalues of the system's state matrix.
3. Signal Processing:
- In signal processing, eigenvalue decomposition is used in data
compression and noise reduction, such as in principal component analysis
(PCA).

 Eigenvalues and eigenvectors play a critical role in Singular Value
Decomposition (SVD); the two concepts are related but distinct. We
will learn about this in the upcoming topics.
Diagonalization
of a Matrix
Diagonal Matrix
 An n × n matrix D is called a diagonal matrix if all its entries off
the main diagonal are zero, that is, if D has the form
D = [ λ₁  0  ⋯  0
       0 λ₂  ⋯  0
       ⋮   ⋮  ⋱  ⋮
       0  0  ⋯ λₙ ] = diag(λ₁, λ₂, ⋯, λₙ)
 Diagonal matrices simplify computations, making operations like
multiplication and inversion efficient. For example:
If D = diag(λ₁, λ₂, …, λₙ) and E = diag(µ₁, µ₂, …, µₙ), then
D + E = diag(λ₁+µ₁, λ₂+µ₂, …, λₙ+µₙ),
DE = diag(λ₁µ₁, λ₂µ₂, …, λₙµₙ) and Dᵏ = diag(λ₁ᵏ, λ₂ᵏ, …, λₙᵏ).
Because of the simplicity of these formulas we make another definition:
Diagonalizable Matrix
 Diagonalizable Matrix:
An 𝑛 × 𝑛 matrix 𝐴 is called diagonalizable if 𝑃−1 𝐴𝑃 is a diagonal matrix
for some invertible 𝑛 × 𝑛 matrix 𝑃.
i.e., 𝑃−1 𝐴𝑃 = 𝐷 where 𝐷 is a diagonal matrix.
 The matrix P which diagonalizes 𝐴 is called the modal matrix and the
resulting matrix 𝐷 is called the spectral matrix of 𝐴.
 Similarity of Matrices:
A square matrix B of order n is called similar to a square matrix A of
order n if B = P⁻¹AP for some invertible n × n matrix P.
 Thus, a matrix A is diagonalizable if and only if A is similar to a
diagonal matrix.
Powers of a Matrix
 Let A be diagonalizable, i.e. P⁻¹AP = D where D is a diagonal matrix.
∴ A = PDP⁻¹
A² = (PDP⁻¹)(PDP⁻¹) = PD²P⁻¹
A³ = (PDP⁻¹)(PDP⁻¹)(PDP⁻¹) = PD³P⁻¹
⋮
Aⁿ = PDⁿP⁻¹
 Hence computing Aⁿ comes down to finding an invertible matrix P.
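The identity Aⁿ = PDⁿP⁻¹ translates directly into code. A numpy sketch (numpy is an illustrative assumption; the matrix is a made-up diagonalizable example):

```python
import numpy as np

def matrix_power_by_diagonalization(A, n):
    """Compute A^n as P D^n P^-1, assuming A is diagonalizable."""
    lams, P = np.linalg.eig(A)     # columns of P are eigenvectors of A
    Dn = np.diag(lams ** n)        # D^n: just raise the diagonal entries
    return P @ Dn @ np.linalg.inv(P)

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
# agrees with repeated multiplication
assert np.allclose(matrix_power_by_diagonalization(A, 5),
                   np.linalg.matrix_power(A, 5))
```

The point of the formula is that Dⁿ costs only n-th powers of scalars, so one eigendecomposition replaces n − 1 matrix multiplications.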
How do we find the modal matrix P?
Modal Matrix & Spectral Matrix

 Theorem:
Let 𝐴 be an 𝑛 × 𝑛 matrix.
i. 𝐴 is diagonalizable if and only if 𝐴 has 𝑛 linearly independent
eigenvectors 𝑋1 , 𝑋2 , ⋯ , 𝑋𝑛 such that the matrix 𝑃 = [𝑋1 , 𝑋2 , ⋯ , 𝑋𝑛 ] is
invertible.
ii. If 𝐴 is diagonalizable, 𝑃−1 𝐴𝑃 = 𝑑𝑖𝑎𝑔(𝜆1 , 𝜆2 , … , 𝜆𝑛 ) where, for each 𝑖,
𝜆𝑖 is the eigenvalue of A corresponding to 𝑋𝑖 .
 Remarks:
i. The matrix P which diagonalizes A has the eigenvectors of A as its
columns.
ii. The diagonal matrix D has the eigenvalues of A as its diagonal
entries.
iii. An n × n matrix with n distinct eigenvalues is diagonalizable.
Multiplicity of an Eigenvalue

 Multiplicity of an eigenvalue:
An eigenvalue λ of a square matrix A is said to have multiplicity m if
it occurs m times as a root of the characteristic equation.

 A square matrix A is diagonalizable if and only if every eigenvalue λ
of multiplicity m yields exactly m linearly independent eigenvectors.
Example
 (Example 1) Find the modal matrix P which diagonalizes the matrix
A = [ 3 -1  1
     -1  5 -1
      1 -1  3 ] and hence calculate A⁴.
 Solution:
The characteristic equation of 𝐴 is 𝜆3 − 𝑆1 𝜆2 + 𝑆2 𝜆 − |𝐴| = 0 where
𝑆1 = 𝑎11 + 𝑎22 + 𝑎33 = 3 + 5 + 3 = 11
𝑆2 = 𝑀11 + 𝑀22 + 𝑀33 = 14 + 8 + 14 = 36 and 𝐴 = 36
∴ The characteristic equation is 𝜆3 − 11𝜆2 + 36𝜆 − 36 = 0
The eigenvalues are 𝜆1 = 2, 𝜆2 = 3 and 𝜆3 = 6 which are distinct and
hence 𝐴 is diagonalizable.
The matrix equation of A is (A − λI)X = 0, i.e.
[ 3-λ  -1    1
  -1   5-λ  -1
   1   -1   3-λ ] [x₁ x₂ x₃]ᵀ = [0 0 0]ᵀ
Let X₁ = [x₁ x₂ x₃]ᵀ be an eigenvector corresponding to the eigenvalue
λ₁ = 2; then
[ 1 -1 1; -1 3 -1; 1 -1 1 ] X₁ = 0
Applying R₂ → R₂ + R₁ and R₃ → R₃ − R₁:
[ 1 -1 1
  0  2 0
  0  0 0 ]

ρ(A − 2I) = 2 < 3, thus the system has infinitely many solutions with
3 − 2 = 1 free variable t. (Let x₁ = t be the free variable.)
By R₁: x₁ − x₂ + x₃ = 0 and by R₂: 2x₂ = 0 ⇒ x₂ = 0
Thus x₁ − x₂ + x₃ = 0 ⇒ t − 0 + x₃ = 0 ⇒ x₃ = −t
∴ The eigenvector corresponding to λ₁ = 2 is X₁ = [1, 0, -1]ᵀ (with t = 1).
Let X₂ = [x₁ x₂ x₃]ᵀ be an eigenvector corresponding to the eigenvalue
λ₂ = 3; then
[ 0 -1 1; -1 2 -1; 1 -1 0 ] X₂ = 0
Applying R₁₃ (swap R₁ and R₃), then R₂ → R₂ + R₁, then R₃ → R₃ + R₂:
[ 1 -1  0
  0  1 -1
  0  0  0 ]
ρ(A − 3I) = 2 < 3, thus the system has infinitely many solutions with
3 − 2 = 1 free variable t. (Let x₁ = t be the free variable.)
By R₁: x₁ − x₂ = 0 and by R₂: x₂ − x₃ = 0
⇒ x₁ = x₂ = x₃ = t
∴ The eigenvector corresponding to λ₂ = 3 is X₂ = [1, 1, 1]ᵀ (with t = 1).
Let X₃ = [x₁ x₂ x₃]ᵀ be an eigenvector corresponding to the eigenvalue
λ₃ = 6; then
[ -3 -1 1; -1 -1 -1; 1 -1 -3 ] X₃ = 0
Applying R₁₃, then R₂ → R₂ + R₁ and R₃ → R₃ + 3R₁, then R₃ → R₃ − 2R₂,
then R₂ → −R₂/2:
[ 1 -1 -3
  0  1  2
  0  0  0 ]
ρ(A − 6I) = 2 < 3, thus the system has infinitely many solutions with
3 − 2 = 1 free variable t. (Let x₃ = t be the free variable.)
By R₂: x₂ + 2x₃ = 0 ⇒ x₂ = −2t
By R₁: x₁ − x₂ − 3x₃ = 0 ⇒ x₁ = t
∴ The eigenvector corresponding to λ₃ = 6 is X₃ = [1, -2, 1]ᵀ (with t = 1).
Thus the modal matrix P = [X₁, X₂, X₃] = [ 1 1  1
                                           0 1 -2
                                          -1 1  1 ], which is invertible,
with
P⁻¹ = [ 1/2   0  -1/2
        1/3  1/3  1/3
        1/6 -1/3  1/6 ]   and   P⁻¹AP = [ 2 0 0
                                          0 3 0
                                          0 0 6 ]
A⁴ = PD⁴P⁻¹ = [ 1 1  1     [ 16  0    0      [ 1/2   0  -1/2
                0 1 -2  ×    0  81    0   ×    1/3  1/3  1/3
               -1 1  1 ]     0   0 1296 ]      1/6 -1/3  1/6 ]

Thus A⁴ = [ 251 -405  235
           -405  891 -405
            235 -405  251 ]
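The whole of Example 1 can be verified numerically with numpy (an illustrative assumption, not part of the notes):

```python
import numpy as np

A = np.array([[ 3.0, -1.0,  1.0],
              [-1.0,  5.0, -1.0],
              [ 1.0, -1.0,  3.0]])
P = np.array([[ 1.0, 1.0,  1.0],
              [ 0.0, 1.0, -2.0],
              [-1.0, 1.0,  1.0]])
D = np.diag([2.0, 3.0, 6.0])

# P^-1 A P = D, hence A^4 = P D^4 P^-1
assert np.allclose(np.linalg.inv(P) @ A @ P, D)
A4 = P @ np.diag([2.0**4, 3.0**4, 6.0**4]) @ np.linalg.inv(P)
expected = np.array([[ 251.0, -405.0,  235.0],
                     [-405.0,  891.0, -405.0],
                     [ 235.0, -405.0,  251.0]])
assert np.allclose(A4, expected)
assert np.allclose(A4, np.linalg.matrix_power(A, 4))
```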
Example
 (Example 2) Check if A = [ 3 -2 0
                           -2  3 0
                            0  0 5 ] is diagonalizable, and if so,
diagonalize it.
 Solution: The characteristic equation of 𝐴 is 𝜆3 − 𝑆1 𝜆2 + 𝑆2 𝜆 − |𝐴| = 0
where 𝑆1 = 𝑎11 + 𝑎22 + 𝑎33 = 3 + 3 + 5 = 11
𝑆2 = 𝑀11 + 𝑀22 + 𝑀33 = 15 + 15 + 5 = 35 and 𝐴 = 25
∴ The characteristic equation is 𝜆3 − 11𝜆2 + 35𝜆 − 25 = 0
The eigenvalues are 𝜆1 = 5 with multiplicity 2 and 𝜆2 = 1.
Since eigenvalue 5 has multiplicity 2, 𝐴 is diagonalizable only if the
eigenvalue 𝜆 = 5 yields 2 linearly independent eigenvectors
The matrix equation of A is (A − λI)X = 0, i.e.
[ 3-λ  -2    0
  -2   3-λ   0
   0    0   5-λ ] [x₁ x₂ x₃]ᵀ = [0 0 0]ᵀ
Let X = [x₁ x₂ x₃]ᵀ be an eigenvector corresponding to the eigenvalue
λ = 5; then
[ -2 -2 0; -2 -2 0; 0 0 0 ] X = 0
Applying R₂ → R₂ − R₁, then R₁ → −R₁/2:
[ 1 1 0
  0 0 0
  0 0 0 ]

ρ(A − 5I) = 1 < 3, thus the system has infinitely many solutions with
3 − 1 = 2 free variables s, t. (Let x₂ = s and x₃ = t be the free
variables.)
By R₁: x₁ + x₂ = 0 ⇒ x₁ = −s
The eigenvectors corresponding to λ₁ = 5 are
X = [-s, s, t]ᵀ = s[-1, 1, 0]ᵀ + t[0, 0, 1]ᵀ
∴ X₁ = [-1, 1, 0]ᵀ and X₂ = [0, 0, 1]ᵀ
Let X₃ = [x₁ x₂ x₃]ᵀ be an eigenvector corresponding to the eigenvalue
λ = 1.
Since A is symmetric, its eigenvectors are pairwise orthogonal:
X₁ᵀX₃ = 0 ⇒ −x₁ + x₂ = 0 ⇒ x₁ = x₂, and X₂ᵀX₃ = 0 ⇒ x₃ = 0
∴ The eigenvector corresponding to λ₂ = 1 is X₃ = [1, 1, 0]ᵀ
Thus the modal matrix P = [X₁, X₂, X₃] = [ -1 0 1
                                            1 0 1
                                            0 1 0 ], which is invertible,
with
P⁻¹ = [ -1/2 1/2 0
          0   0  1
         1/2 1/2 0 ]   and   P⁻¹AP = [ 5 0 0
                                       0 5 0
                                       0 0 1 ] = D
Example
 (Example 3) Show that A = [ 1 1
                             0 1 ] is not diagonalizable.
 Solution:
The characteristic equation of A is (1 − λ)² = 0.
So A has only one eigenvalue, λ₁ = 1, of multiplicity 2, corresponding
to which there is only one linearly independent eigenvector,
x₁ = [1, 0]ᵀ.
(But for A to be diagonalizable, the eigenvalue λ₁ = 1 of multiplicity
2 must yield 2 linearly independent eigenvectors.)
Hence A is not diagonalizable.
Example
 (Example 4) Calculate A²⁰²⁴ if A = [ -1 2
                                       0 1 ] by diagonalization.
 Solution:
A has λ₁ = −1 and λ₂ = 1 as its eigenvalues.
Since A has distinct eigenvalues, A is diagonalizable.
x₁ = [1, 0]ᵀ is the eigenvector corresponding to λ₁ = −1 and
x₂ = [1, 1]ᵀ is the eigenvector corresponding to λ₂ = 1.
Since A is diagonalizable, P⁻¹AP = D, where
P = [ 1 1      P⁻¹ = [ 1 -1      D = [ -1 0
      0 1 ],           0  1 ],          0 1 ]
∴ A = PDP⁻¹ and A²⁰²⁴ = PD²⁰²⁴P⁻¹
Since D²⁰²⁴ = diag((−1)²⁰²⁴, 1²⁰²⁴) = I,
A²⁰²⁴ = PIP⁻¹ = [ 1 0
                  0 1 ] = I
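Example 4 can be confirmed numerically with numpy (an illustrative assumption):

```python
import numpy as np

A = np.array([[-1.0, 2.0],
              [ 0.0, 1.0]])
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])       # columns: eigenvectors for -1 and 1
D = np.diag([-1.0, 1.0])

assert np.allclose(np.linalg.inv(P) @ A @ P, D)    # P^-1 A P = D
# D^2024 = diag((-1)^2024, 1^2024) = I, so A^2024 = P I P^-1 = I
A2024 = P @ np.linalg.matrix_power(D, 2024) @ np.linalg.inv(P)
assert np.allclose(A2024, np.eye(2))
```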
Exercises
 Is A = [ 2 1 1; 1 2 1; 0 -1 1 ] diagonalizable?
 Check if the following matrices are diagonalizable and, if so,
diagonalize them.
i) [ 1 0 -1; 0 1 0; 0 0 2 ]
ii) [ 1 -3 3; 0 -5 6; 0 -3 4 ]
iii) [ 2 1 1; 1 2 1; 0 -1 1 ]
 Find the modal matrix which diagonalizes the matrix A and hence find
A⁸, where A = [ 1 6 1; 1 2 0; 0 0 3 ]
Reduction of Quadratic form to
Canonical form by Linear Transformation
Quadratic form

 A homogeneous polynomial of second degree in any number of variables
is called a quadratic form.

 The general quadratic form in n variables x₁, x₂, ……, xₙ can be
expressed as
Q(x) = a₁₁x₁² + 2a₁₂x₁x₂ + 2a₁₃x₁x₃ + ……. + 2a₁ₙx₁xₙ
       + a₂₂x₂² + 2a₂₃x₂x₃ + ……. + 2a₂ₙx₂xₙ
       + ……………………
       + aₙₙxₙ²
     = Σᵢ₌₁ⁿ Σⱼ₌₁ⁿ aᵢⱼxᵢxⱼ
Examples of quadratic forms

 Q(x) = 2x₁² + 3x₂² + 5x₃² + 7x₁x₂ + 9x₂x₃ + 5x₃x₁ in three variables.
 Q(x) = 2x₁² + 3x₂² + 7x₁x₂ in two variables.

Matrix expression of a quadratic form:

 A quadratic form can be expressed as a product of matrices in the form
Q(x) = X′AX
where X′ is the transpose of the column matrix X and A is the
coefficient matrix of the form, which can always be arranged to be
symmetric, since we take the off-diagonal terms together in pairs and
write the result as the sum of two equal terms.
Consider
A = [ a₁₁ a₁₂ a₁₃          X = [ x₁
      a₂₁ a₂₂ a₂₃                x₂
      a₃₁ a₃₂ a₃₃ ]              x₃ ]
Then
X′AX = [x₁ x₂ x₃] A [x₁ x₂ x₃]ᵀ
     = a₁₁x₁² + a₂₂x₂² + a₃₃x₃² + 2a₁₂x₁x₂ + 2a₁₃x₁x₃ + 2a₂₃x₂x₃
     = Q(x)
Remarks:

1. In matrix A, aᵢⱼ = aⱼᵢ = ½ × (coefficient of xᵢxⱼ).

2. The law of formation of Q(x) when the symmetric matrix A is given,
and of the symmetric matrix A when Q(x) is given, should be noted. A
similar law holds for a symmetric matrix of any order.

3. The rank of the symmetric matrix A is called the rank of the
quadratic form Q(x). The number of non-zero eigenvalues of A also
gives the rank of the quadratic form Q(x).
Qn. Write the following quadratic form in matrix notation.
1. Q(x) = x₁² + 2x₂² + 3x₃² + 6x₁x₂ − 4x₂x₃ + 2x₃x₁
Ans: Q(x) = X′AX = [x₁ x₂ x₃] [ 1  3  1
                                3  2 -2
                                1 -2  3 ] [x₁ x₂ x₃]ᵀ
Qn. Write the quadratic form corresponding to the given matrix.
A = [ 1 3 2
      3 0 4
      2 4 2 ]
Ans. Q(x) = X′AX = x₁² + 2x₃² + 6x₁x₂ + 8x₂x₃ + 4x₃x₁
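The rule "diagonal entries are the squared-term coefficients, off-diagonal entries are half the cross-term coefficients" is easy to check numerically. A numpy sketch (numpy is an illustrative assumption) for the first question above:

```python
import numpy as np

# Symmetric matrix of Q(x) = x1^2 + 2x2^2 + 3x3^2 + 6x1x2 - 4x2x3 + 2x3x1
A = np.array([[1.0,  3.0,  1.0],
              [3.0,  2.0, -2.0],
              [1.0, -2.0,  3.0]])

def Q(x):
    return x @ A @ x            # X'AX

x = np.array([1.0, 2.0, -1.0])  # an arbitrary test point
x1, x2, x3 = x
direct = x1**2 + 2*x2**2 + 3*x3**2 + 6*x1*x2 - 4*x2*x3 + 2*x3*x1
assert np.isclose(Q(x), direct)  # both equal 30.0 at this point
```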


Linear transformation of a quadratic form:

 The linear transformation X = PY, where P is a non-singular matrix,
transforms the quadratic form Q(x) into another quadratic form Q′(x):
Q(x) = X′AX = (PY)′A(PY)
            = (Y′P′)A(PY)
            = Y′(P′AP)Y = Y′BY = Q′(x)

Note:
1. Here B = P′AP is the matrix of the transformed quadratic form Q′(x).
Reduction of quadratic form to canonical form
If the quadratic form Q(X) = X′AX is reduced to another quadratic form
Q′(X) = Y′BY by a non-singular transformation X = PY, then the reduced
quadratic form is called the canonical form, the sum of squares form,
or the principal axes form.
 Note: In this case, the matrix B of the reduced quadratic form Q′(X)
will be a diagonal matrix.

 Let Q(X) = X′AX be a quadratic form in the n variables x₁, x₂, ……, xₙ,
and let r be the rank of the matrix A. Then the canonical form of Q(x)
will contain r terms:
Q′(X) = Y′BY = c₁y₁² + c₂y₂² + ……. + cᵣyᵣ²
Index and signature

 The number of positive terms in the canonical form is called the
index and is denoted by p; the rank of B equals the rank r of the
quadratic form.

 The difference between the number of positive terms (p) and the
number of negative terms (r − p) is known as the signature of the
quadratic form and is denoted by s.

 Thus the signature is s = p − (r − p) = 2p − r.
Definite and semi-definite forms:

 A quadratic form Q(x) = X′AX in n variables, where A ≠ 0, is called
positive definite if r = p = n, i.e. it reduces to the form
Q′(X) = c₁y₁² + c₂y₂² + ……. + cₙyₙ²
where all the cᵣ's are positive.

 Positive semi-definite: r = p < n
 Negative definite: p = 0 and r = n
 Negative semi-definite: p = 0 and r < n
Q. Express the following quadratic form as a "sum of squares" by a
congruent transformation. Write down the corresponding linear
transformation. Also find the rank, index and signature.

Q(x) = 6x₁² + 3x₂² + 3x₃² − 4x₁x₂ − 2x₂x₃ + 4x₃x₁

Solution:
The quadratic form Q(x) can be expressed in matrix form as
Q(x) = X′AX = [x₁ x₂ x₃] [ 6 -2  2
                          -2  3 -1
                           2 -1  3 ] [x₁ x₂ x₃]ᵀ
The matrix of the quadratic form is
A = [ 6 -2  2
     -2  3 -1
      2 -1  3 ]
To reduce the given quadratic form to the sum of squares Q′(x) = Y′BY
and to find the matrix P of the linear transformation X = PY, we write
A = IA:
[ 6 -2  2        [ 1 0 0
 -2  3 -1    =     0 1 0   A
  2 -1  3 ]        0 0 1 ]

We perform row operations together with the corresponding column
operations on the matrix on the L.H.S. to obtain a diagonal matrix B
congruent to A, while performing only the corresponding row operations
on the matrix I on the R.H.S.

We get
B = [ 6   0    0          P′ = [ 1    0    0
      0  7/3   0                 1/3  1    0
      0   0  16/7 ]             -2/7  1/7  1 ]
so that B = P′AP.
Thus the matrix A is reduced to the diagonal matrix B, and the
transformation matrix P is given by
P = (P′)′ = [ 1  1/3  -2/7
              0   1    1/7
              0   0     1  ]

The canonical form is
Q′(x) = Y′BY = [y₁ y₂ y₃] [ 6   0    0
                            0  7/3   0
                            0   0  16/7 ] [y₁ y₂ y₃]ᵀ
      = 6y₁² + (7/3)y₂² + (16/7)y₃²

The rank of the quadratic form (r) = 3
The index of the quadratic form (p) = 3
The signature of the quadratic form (s) = 2p − r = 3
∴ The quadratic form is positive definite (∵ r = p = n = 3)
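The congruent reduction above can be verified by computing B = P′AP directly. A numpy sketch (numpy being an illustrative assumption):

```python
import numpy as np

A = np.array([[ 6.0, -2.0,  2.0],
              [-2.0,  3.0, -1.0],
              [ 2.0, -1.0,  3.0]])
P = np.array([[1.0, 1.0/3.0, -2.0/7.0],
              [0.0, 1.0,      1.0/7.0],
              [0.0, 0.0,      1.0]])

B = P.T @ A @ P              # congruent transformation B = P'AP
assert np.allclose(B, np.diag([6.0, 7.0/3.0, 16.0/7.0]))

# all three diagonal entries of B are positive: r = p = n = 3,
# confirming that Q is positive definite
assert np.all(np.diag(B) > 0)
```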


Exercise Problems:

Reduce the following quadratic forms to canonical form and find the
rank and signature.
1) 3x² + 3y² + 3z² − 2yz + 2zx + 6xy          (Ans: rank = 3, signature = 3)
2) x₁² + 2x₂² + 3x₃² + 2x₁x₂ + 2x₂x₃ − 2x₃x₁  (Ans: rank = 3, signature = 1)


Reduction of Quadratic forms to Canonical form
by Orthogonal transformation method
 Introduction:
Let Q(x) = Σᵢ₌₁ⁿ Σⱼ₌₁ⁿ aᵢⱼxᵢxⱼ be a given quadratic form. The
coefficient matrix A is real symmetric and therefore has a linearly
independent, orthogonal set of n eigenvectors corresponding to its n
eigenvalues (which need not be distinct).

 Let P = [x₁, x₂, x₃] be the modal matrix whose columns are the
normalized eigenvectors of the matrix A. Then P is an orthogonal
matrix, and X = PY is an orthogonal transformation.

 Under this orthogonal transformation X = PY, the quadratic form Q(X)
is reduced to the canonical form Q′(X):
Q(x) = X′AX = (PY)′A(PY)
            = Y′P′APY
            = Y′(P′AP)Y
            = Y′(P⁻¹AP)Y     (∵ P′ = P⁻¹ for orthogonal P)
            = Y′BY
            = Q′(x)
 Remark:
The orthogonal matrix P diagonalizes A, with the eigenvalues of A
appearing on the main diagonal: P⁻¹AP = B, a diagonal matrix. Hence
the quadratic form Q(x) is reduced to the canonical form, or "sum of
squares form",
Q′(x) = Y′BY = Σᵢ₌₁ⁿ λᵢyᵢ²
Note:
If [l₁ m₁ n₁]′, [l₂ m₂ n₂]′, [l₃ m₃ n₃]′ are the normalized
eigenvectors corresponding to the eigenvalues λ₁, λ₂, λ₃ respectively,
then the orthogonal transformation matrix (modal matrix) P of X = PY
for the quadratic form Q(x) = X′AX is given by
P = [ l₁ l₂ l₃
      m₁ m₂ m₃
      n₁ n₂ n₃ ]
and the diagonal matrix is
B = [ λ₁ 0  0
      0  λ₂ 0
      0  0  λ₃ ]
Examples:
1) Reduce the quadratic form 3x² + 5y² + 3z² − 2xy + 2xz − 2yz to
canonical form by an orthogonal transformation. State the
transformation matrix.

Solution: The quadratic form can be expressed as Q(x) = X′AX, so the
matrix of the quadratic form is
A = [ 3 -1  1
     -1  5 -1
      1 -1  3 ]

The characteristic equation of A is given by |A − λI| = 0:
| 3-λ  -1    1
  -1   5-λ  -1
   1   -1   3-λ | = 0
λ³ − 11λ² + 36λ − 36 = 0
(λ − 2)(λ − 3)(λ − 6) = 0
λ₁ = 2, λ₂ = 3, λ₃ = 6
i) For λ₁ = 2,
the corresponding eigenvector X₁ = [x₁, y₁, z₁]′ is given by the
matrix equation (A − λ₁I)X₁ = 0:
[ 1 -1  1
 -1  3 -1
  1 -1  1 ] [x₁ y₁ z₁]ᵀ = [0 0 0]ᵀ
By back substitution:
y₁ = 0
x₁ − y₁ + z₁ = 0 ⇒ x₁ + z₁ = 0
Put x₁ = t, so z₁ = −t.
∴ X₁ = [1, 0, -1]ᵀ
ii) For λ₂ = 3,
the corresponding eigenvector X₂ = [x₂, y₂, z₂]′ is given by the
matrix equation (A − λ₂I)X₂ = 0:
[ 0 -1  1
 -1  2 -1
  1 -1  0 ] [x₂ y₂ z₂]ᵀ = [0 0 0]ᵀ
Performing R₁ ↔ R₂, then R₃ → R₃ + R₁, then R₃ → R₃ + R₂ gives
[ -1  2 -1
   0 -1  1
   0  0  0 ] [x₂ y₂ z₂]ᵀ = [0 0 0]ᵀ
By back substitution:
y₂ − z₂ = 0 ⇒ y₂ = z₂ = t
−x₂ + 2y₂ − z₂ = 0 ⇒ x₂ = t
∴ X₂ = [1, 1, 1]ᵀ
iii) For λ₃ = 6,
the corresponding eigenvector X₃ = [x₃, y₃, z₃]′ is given by the matrix
equation (A − λ₃I)X₃ = 0

[ -3 -1  1 ] [x₃]   [0]
[ -1 -1 -1 ] [y₃] = [0]
[  1 -1 -3 ] [z₃]   [0]

Perform R₁ ↔ R₃

[  1 -1 -3 ]
[ -1 -1 -1 ]
[ -3 -1  1 ]

Perform R₂ → R₂ + R₁, R₃ → R₃ + 3R₁

[  1 -1 -3 ]
[  0 -2 -4 ]
[  0 -4 -8 ]

Perform R₃ → R₃ − 2R₂

[  1 -1 -3 ]
[  0 -2 -4 ]
[  0  0  0 ]

By back substitution,
−2y₃ − 4z₃ = 0, so y₃ = −2z₃. Put y₃ = t; then z₃ = −t/2.
From x₃ − y₃ − 3z₃ = 0: x₃ − t + (3/2)t = 0, so x₃ = −t/2.
Taking t = −2,

X₃ = [x₃, y₃, z₃]′ = [1, −2, 1]′
The matrix of eigenvectors (modal matrix) is

M = [  1  1  1 ]
    [  0  1 -2 ]
    [ -1  1  1 ]

and the diagonal matrix is

B = [ 2 0 0 ]
    [ 0 3 0 ]
    [ 0 0 6 ]

Normalizing the columns of M to unit magnitude gives the orthogonal
transformation matrix P, so the orthogonal transformation X = PY is

[x₁]   [ 1/√2   1/√3    1/√6 ] [y₁]
[x₂] = [  0     1/√3   −2/√6 ] [y₂]
[x₃]   [−1/√2   1/√3    1/√6 ] [y₃]

The canonical form of the quadratic form Q(x) under the orthogonal
transformation X = PY is

Q′(Y) = Y′BY = 2y₁² + 3y₂² + 6y₃²
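The eigen computation above can be cross-checked with NumPy (a verification sketch; numpy.linalg.eigh returns eigenvalues in ascending order with orthonormal eigenvectors as columns):

```python
import numpy as np

# Matrix of the quadratic form 3x^2 + 5y^2 + 3z^2 - 2xy + 2xz - 2yz.
A = np.array([[3.0, -1.0, 1.0],
              [-1.0, 5.0, -1.0],
              [1.0, -1.0, 3.0]])
lam, P = np.linalg.eigh(A)   # ascending order: 2, 3, 6
print(np.round(lam, 6))

# P is orthogonal, and P'AP is the diagonal matrix B = diag(2, 3, 6).
assert np.allclose(P.T @ P, np.eye(3))
assert np.allclose(P.T @ A @ P, np.diag(lam))
```

The eigenvector signs returned by eigh may differ from the hand-computed modal matrix, but P′AP is the same diagonal matrix either way.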
2) Find an orthogonal transformation which transforms the quadratic
form Q(x) = x₁² + 3x₂² + 3x₃² − 2x₂x₃ to canonical form. Determine
the index, signature and nature of the quadratic form.

Solution- The coefficient matrix A of the quadratic form is

A = [ 1  0  0 ]
    [ 0  3 -1 ]
    [ 0 -1  3 ]

The characteristic equation of A is given by |A − λI| = 0

| 1−λ   0    0  |
|  0   3−λ  −1  | = 0
|  0   −1   3−λ |

λ³ − 7λ² + 14λ − 8 = 0
(λ − 1)(λ − 2)(λ − 4) = 0
λ₁ = 1, λ₂ = 2, λ₃ = 4
Next, we find the orthogonal matrix P having the three normalized
eigenvectors X₁, X₂, X₃ as its column vectors.

i) For λ₁ = 1, the matrix equation (A − λ₁I)X₁ = 0 gives X₁ = [1, 0, 0]′
ii) For λ₂ = 2, the matrix equation (A − λ₂I)X₂ = 0 gives X₂ = [0, 1, 1]′
iii) For λ₃ = 4, the matrix equation (A − λ₃I)X₃ = 0 gives X₃ = [0, 1, −1]′

The modal matrix is

M = [ 1  0  0 ]
    [ 0  1  1 ]
    [ 0  1 -1 ]

and the diagonal matrix is

B = [ 1 0 0 ]
    [ 0 2 0 ]
    [ 0 0 4 ]
Normalizing the columns of M to unit magnitude gives the orthogonal
transformation matrix P, so the orthogonal transformation X = PY is

[x₁]   [ 1    0       0   ] [y₁]
[x₂] = [ 0   1/√2    1/√2 ] [y₂]
[x₃]   [ 0   1/√2   −1/√2 ] [y₃]

The canonical form of the quadratic form Q(x) under the orthogonal
transformation X = PY is

Q′(Y) = Y′BY = y₁² + 2y₂² + 4y₃²

Here the rank is r = 3 and the index (number of positive terms) is p = 3,
the signature is s = 2p − r = 3,
and the quadratic form is positive definite because r = n = p = 3.
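The nature of a quadratic form can be read off directly from the signs of the eigenvalues of its coefficient matrix. A small helper sketch (the function name classify is ours, not standard):

```python
import numpy as np

def classify(A, tol=1e-10):
    """Classify the quadratic form X'AX by the eigenvalue signs of A."""
    lam = np.linalg.eigvalsh(A)          # real eigenvalues, A symmetric
    if np.all(lam > tol):
        return "positive definite"
    if np.all(lam < -tol):
        return "negative definite"
    if np.all(lam >= -tol):
        return "positive semi-definite"
    if np.all(lam <= tol):
        return "negative semi-definite"
    return "indefinite"

# Example 2's matrix: eigenvalues 1, 2, 4, all positive.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 3.0, -1.0],
              [0.0, -1.0, 3.0]])
print(classify(A))  # → positive definite
```

With mixed signs (some λᵢ > 0 and some λᵢ < 0) the same function reports "indefinite", matching exercise 3(i) below.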
3) Reduce the following quadratic forms to canonical form and find
their rank and signature.
i) 2x² + y² − 3z² + 12xy − 4xz − 8yz
Ans. Rank = 3, index = 1, indefinite
ii) x₁² + 2x₂² + 3x₃² + 2x₂x₃ − 2x₁x₃ + 2x₂x₁
Ans. Rank = 3, signature = 1
Singular Value Decomposition (SVD) :
• The Singular Value Decomposition (SVD) provides another way to
factorize a matrix, into singular vectors and singular values.
• The process of Singular Value Decomposition (SVD) involves breaking
down an m × n matrix A into the form A = UΣVᵀ where,
a. U is an m × m orthogonal matrix with orthonormal columns,
b. Σ is an m × n diagonal matrix (not necessarily square),
c. V is an n × n orthogonal matrix with orthonormal columns.
• The elements along the diagonal of Σ are known as the singular values of
the matrix 𝐴, the columns of 𝑈 are known as the left-singular vectors of A
and the columns of 𝑉 are known as the right-singular vectors of A.
• Note that AAᵀ and AᵀA are symmetric matrices since:
(AAᵀ)ᵀ = (Aᵀ)ᵀAᵀ = AAᵀ
(AᵀA)ᵀ = Aᵀ(Aᵀ)ᵀ = AᵀA
Thus, AAᵀ and AᵀA have real eigenvalues with orthonormal
eigenvectors.
• The columns of 𝑈 or the left-singular vectors of 𝐴 are the eigenvectors of
𝐴𝐴T .
• The columns of 𝑉 or the right-singular vectors of 𝐴 are the eigenvectors
of 𝐴𝑇 𝐴.
• The nonzero singular values of A are the square roots of the nonzero
eigenvalues of AAᵀ (the same is true for AᵀA).
• The Singular Value Decomposition is so named due to the singular values
that are identified and isolated from matrix 𝐴
• Note that every matrix has a singular value decomposition whether it's
symmetric or not.
• For a symmetric matrix A with non-negative eigenvalues, the SVD of A:
A = UΣVᵀ ⇒ U = V = P (eigenvector matrix) and Σ = D (diagonal
matrix of eigenvalues).
Thus A = PDPᵀ (orthogonal diagonalization).
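These facts can be confirmed with numpy.linalg.svd, which returns U, the singular values, and Vᵀ directly (a sketch on a random matrix):

```python
import numpy as np

A = np.random.default_rng(0).standard_normal((4, 3))
U, s, Vt = np.linalg.svd(A)      # full SVD: U is 4x4, Vt is 3x3
assert U.shape == (4, 4) and Vt.shape == (3, 3)

# Rebuild the m x n Sigma from the singular values.
Sigma = np.zeros((4, 3))
Sigma[:3, :3] = np.diag(s)
assert np.allclose(A, U @ Sigma @ Vt)

# The singular values are the square roots of the eigenvalues of A^T A.
lam = np.sort(np.linalg.eigvalsh(A.T @ A))[::-1]
assert np.allclose(s, np.sqrt(np.clip(lam, 0, None)))
```

Note that svd returns only the vector of singular values, so Σ must be reshaped to m × n by hand when reconstructing A.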
Examples :
1) Apply SVD to rewrite the matrix A = [ 3 2 1 ; 2 1 4 ] in the
decomposed form A = UΣVᵀ.

Solution:
Form AᵀA:

AᵀA = [ 3 2 ]   [ 3 2 1 ]   [ 13  8 11 ]
      [ 2 1 ] × [ 2 1 4 ] = [  8  5  6 ]
      [ 1 4 ]               [ 11  6 17 ]
Determine the eigenvalues of AᵀA:
The eigenvalues of AᵀA are λ₁ = 30, λ₂ = 5, and λ₃ = 0 (arranged in
decreasing magnitude).
Form the matrix Vᵀ:
Compute the corresponding eigenvectors and normalize them to produce
the matrix V. Then compute Vᵀ.

The normalized eigenvectors are

v₁ = [ 17/(5√30) ]    v₂ = [ −6/(5√5) ]    v₃ = [   7/(5√6) ]
     [ 10/(5√30) ]         [ −5/(5√5) ]         [ −10/(5√6) ]
     [ 19/(5√30) ]         [  8/(5√5) ]         [  −1/(5√6) ]

Thus

V = [v₁ v₂ v₃] = [ 17/(5√30)  −6/(5√5)    7/(5√6) ]
                 [ 10/(5√30)  −5/(5√5)  −10/(5√6) ]
                 [ 19/(5√30)   8/(5√5)   −1/(5√6) ]

and

Vᵀ = [ 17/(5√30)  10/(5√30)  19/(5√30) ]
     [ −6/(5√5)   −5/(5√5)    8/(5√5)  ]
     [  7/(5√6)  −10/(5√6)   −1/(5√6)  ]

Note that the columns of V are orthonormal.
Form the matrix Σ:
To determine the matrix Σ, list the nonzero singular values σᵢ = √λᵢ in
decreasing magnitude down the main diagonal of Σ, adding rows and
columns of zeros as needed so that Σ retains the original dimension
of A. In our example the three singular values are √30, √5 and 0.
Retaining only the nonzero values, we form the matrix

Σ = [ √30   0   0 ]
    [  0   √5   0 ]

Note that Σ has the same dimension as our original matrix A.
Form the matrix U:
Form the matrix U by considering the modified form A = UΣVᵀ and
isolating each column of U. Because of the diagonal nature of Σ, this
results in uᵢ = (1/σᵢ)Avᵢ for each nonzero σᵢ.

u₁ = (1/σ₁)Av₁ = (1/√30) [ 3 2 1 ] [ 17/(5√30) ]   [ 3/5 ]
                         [ 2 1 4 ] [ 10/(5√30) ] = [ 4/5 ]
                                   [ 19/(5√30) ]

and

u₂ = (1/σ₂)Av₂ = (1/√5) [ 3 2 1 ] [ −6/(5√5) ]   [ −4/5 ]
                        [ 2 1 4 ] [ −5/(5√5) ] = [  3/5 ]
                                  [  8/(5√5) ]

Thus

U = [u₁ u₂] = [ 3/5  −4/5 ]
              [ 4/5   3/5 ]

Note that the columns of U are orthonormal.
Rewrite matrix A as A = UΣVᵀ:

A = [ 3/5  −4/5 ] [ √30   0   0 ] [ 17/(5√30)  10/(5√30)  19/(5√30) ]
    [ 4/5   3/5 ] [  0   √5   0 ] [ −6/(5√5)   −5/(5√5)    8/(5√5)  ]
                                  [  7/(5√6)  −10/(5√6)   −1/(5√6)  ]
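The hand computation above can be cross-checked with NumPy (a verification sketch; signs of individual singular vectors may differ between implementations, so we compare only the singular values and the reconstruction):

```python
import numpy as np

A = np.array([[3.0, 2.0, 1.0],
              [2.0, 1.0, 4.0]])
U, s, Vt = np.linalg.svd(A)

# Squared singular values equal the nonzero eigenvalues of A^T A: 30 and 5.
assert np.allclose(s**2, [30.0, 5.0])

# Rebuild the 2x3 Sigma and confirm A = U Sigma V^T.
Sigma = np.zeros((2, 3))
Sigma[:2, :2] = np.diag(s)
assert np.allclose(A, U @ Sigma @ Vt)
```

Any sign flips in a column of U are matched by flips in the corresponding row of Vᵀ, so the product UΣVᵀ is unchanged.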
2) Apply SVD to rewrite the matrix A = [ 3 2 2 ; 2 3 −2 ] in the
decomposed form A = UΣVᵀ.

Solution: (Alternate method, starting from AAᵀ)
Form AAᵀ:

AAᵀ = [ 3 2  2 ] [ 3  2 ]   [ 17  8 ]
      [ 2 3 −2 ] [ 2  3 ] = [  8 17 ]
                 [ 2 −2 ]
Determine the eigenvalues of AAᵀ:
The eigenvalues of AAᵀ are λ₁ = 25 and λ₂ = 9 (arranged in decreasing
magnitude).

Form the matrix U:
Compute the corresponding eigenvectors and normalize them to
produce the matrix U.
The normalized eigenvectors are

u₁ = [ 1/√2 ]    u₂ = [  1/√2 ]
     [ 1/√2 ]         [ −1/√2 ]

Thus

U = [u₁ u₂] = [ 1/√2   1/√2 ]
              [ 1/√2  −1/√2 ]
Form the matrix Σ:
To determine the matrix Σ, list the nonzero singular values σᵢ = √λᵢ in
decreasing magnitude down the main diagonal (adding zero rows and
columns as needed to retain the original dimension of A):

Σ = [ 5 0 0 ]
    [ 0 3 0 ]
Form AᵀA:

AᵀA = [ 3  2 ] [ 3 2  2 ]   [ 13 12  2 ]
      [ 2  3 ] [ 2 3 −2 ] = [ 12 13 −2 ]
      [ 2 −2 ]              [  2 −2  8 ]

Determine the eigenvalues of AᵀA:
The eigenvalues of AᵀA are λ₁ = 25, λ₂ = 9, and λ₃ = 0 (arranged in
decreasing magnitude).
Form the matrix Vᵀ:
Compute the corresponding eigenvectors and normalize them to produce
the matrix V. Then compute Vᵀ.
The normalized eigenvectors are

v₁ = [ 1/√2 ]    v₂ = [  1/√18 ]    v₃ = [  2/3 ]
     [ 1/√2 ]         [ −1/√18 ]         [ −2/3 ]
     [  0   ]         [  4/√18 ]         [ −1/3 ]

Thus

V = [v₁ v₂ v₃] = [ 1/√2   1/√18   2/3 ]
                 [ 1/√2  −1/√18  −2/3 ]
                 [  0     4/√18  −1/3 ]

and

Vᵀ = [ 1/√2    1/√2    0    ]
     [ 1/√18  −1/√18  4/√18 ]
     [ 2/3    −2/3   −1/3   ]
Rewrite matrix A as A = UΣVᵀ:

A = [ 1/√2   1/√2 ] [ 5 0 0 ] [ 1/√2    1/√2    0    ]
    [ 1/√2  −1/√2 ] [ 0 3 0 ] [ 1/√18  −1/√18  4/√18 ]
                              [ 2/3    −2/3   −1/3   ]
Note: If two of the eigenvectors are known, the third can be found
using the orthogonality property.
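For the 3 × 3 case this orthogonality trick amounts to a cross product; a sketch using the vectors from Example 2:

```python
import numpy as np

# Two known orthonormal eigenvectors of A^T A from Example 2.
v1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
v2 = np.array([1.0, -1.0, 4.0]) / np.sqrt(18)

# The third eigenvector must be orthogonal to both,
# i.e. it lies along their cross product.
v3 = np.cross(v1, v2)
v3 /= np.linalg.norm(v3)

# v3 is (up to sign) the null eigenvector (2, -2, -1)/3 of A^T A.
AtA = np.array([[13.0, 12.0, 2.0],
                [12.0, 13.0, -2.0],
                [2.0, -2.0, 8.0]])
assert np.allclose(AtA @ v3, np.zeros(3))
```

Either sign of v₃ is a valid eigenvector; since its singular value is 0, the choice does not affect UΣVᵀ.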
Application to Image Processing :
• The process of Singular Value Decomposition can be used in many
applications, including watermarking an image, computing weighted
least squares, and optimal prediction.
• Here we will consider how this process could be used to produce
reduced image sizes.
• Digital images require large amounts of memory, and often it is desirable
to reduce the required memory storage and still retain as much of the
image quality as possible.
• In such scenario one can consider using SVD to manipulate these large
sets of data, which allows one to identify the components of the image
which contribute the least to overall image quality.
• Digital images can be interpreted as a matrix with pixels represented as
individual numerical entries.
• Rows and columns of the matrix hold the position of each pixel, and
each value in the matrix represents the corresponding saturation level.
Together these components can require a large amount of memory to
produce a single image.
• Singular Value Decomposition rewrites the image (matrix) in its
decomposed form, allowing one to retain the important singular values
that the image requires while discarding the values that contribute
little to the quality of the image.
• With each loss of a singular value some refinement of the image will be
lost, but the overall image features will be retained.
• The following figures depict that as more singular values are included
in the image matrix, the clarity of the image improves.