Math (P) Refresher Lecture 6: Linear Algebra I
September 2006
Today's Topics: Working with Vectors, Linear Independence, Matrix Algebra, Square Matrices, Systems of Linear Equations, Method of Substitution, Gaussian Elimination, Gauss-Jordan Elimination
A vector may be written as a row or as a column:
$$v = (v_1, v_2, \ldots, v_n), \qquad v = \begin{pmatrix} v_1 \\ v_2 \\ \vdots \\ v_n \end{pmatrix}$$
We can also think of a vector as defining a point in n-dimensional space, usually Rn ; each
element of the vector defines the coordinate of the point in a particular direction.
Vector Addition: Vector addition is defined for two vectors u and v iff they have the same
number of elements:
$$u + v = (u_1 + v_1, u_2 + v_2, \ldots, u_n + v_n)$$
Scalar Multiplication: The product of a scalar c and vector v is:
$$cv = (cv_1, cv_2, \ldots, cv_n)$$
Dot Product: The dot product of two n-vectors u and v is the scalar
$$u \cdot v = \sum_{i=1}^{n} u_i v_i$$
In particular, $v \cdot v = v_1 v_1 + v_2 v_2 + \cdots + v_n v_n$, the sum of squared elements of v.
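These operations translate directly into code. The following NumPy sketch (NumPy and the example vectors are additions for illustration, not part of the notes) checks each operation on small vectors:

```python
import numpy as np

u = np.array([1, 2, 3])
v = np.array([4, 5, 6])

# Vector addition is element-wise and requires equal-length vectors.
print(u + v)        # [5 7 9]

# Scalar multiplication scales every element.
print(2 * v)        # [ 8 10 12]

# Dot product: sum of element-wise products.
print(u @ v)        # 1*4 + 2*5 + 3*6 = 32

# v . v is the sum of squared elements.
print(v @ v)        # 16 + 25 + 36 = 77
```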
Much of the material and examples for this lecture are taken from Gill (2006) Essential Mathematics for Political
and Social Scientists, Simon & Blume (1994) Mathematics for Economists and Kolman (1993) Introductory Linear
Algebra with Applications.
Linear Dependence
Linear combinations: The vector u is a linear combination of the vectors v1, v2, . . . , vk if
$$u = c_1 v_1 + c_2 v_2 + \cdots + c_k v_k$$
Linear independence: A set of vectors v1, v2, . . . , vk is linearly independent if the only solution to the equation
$$c_1 v_1 + c_2 v_2 + \cdots + c_k v_k = 0$$
is $c_1 = c_2 = \cdots = c_k = 0$. If another solution exists, the set of vectors is linearly dependent.
A set S of vectors is linearly dependent iff at least one of the vectors in S can be written as
a linear combination of the other vectors in S.
Linear independence is only defined for sets of vectors with the same number of elements;
any linearly independent set of vectors in n-space contains at most n vectors.
Exercises: Are the following sets of vectors linearly independent?
1. $$v_1 = \begin{pmatrix} 1 \\ 0 \\ 0 \end{pmatrix}, \qquad v_2 = \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix}, \qquad v_3 = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}$$
2. $$v_1 = \begin{pmatrix} 3 \\ 2 \\ 1 \end{pmatrix}, \qquad v_2 = \begin{pmatrix} 2 \\ 2 \\ 4 \end{pmatrix}, \qquad v_3 = \begin{pmatrix} 2 \\ 3 \\ 1 \end{pmatrix}$$
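One way to check such exercises numerically: stack each set of vectors as the columns of a matrix; the set is linearly independent iff the matrix has full column rank. A NumPy sketch (assuming the vectors as printed above):

```python
import numpy as np

# Exercise 1: columns are v1, v2, v3.
V1 = np.column_stack(([1, 0, 0], [1, 0, 1], [1, 1, 1]))
# Exercise 2.
V2 = np.column_stack(([3, 2, 1], [2, 2, 4], [2, 3, 1]))

# A set of k vectors is linearly independent iff the rank equals k.
print(np.linalg.matrix_rank(V1))   # 3 -> independent
print(np.linalg.matrix_rank(V2))   # 3 -> independent
```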
Matrix Algebra
Matrix: A matrix is an array of mn real numbers arranged in m rows by n columns.
$$A = \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix}$$
Note that you can think of vectors as special cases of matrices; a column vector of length k is a k × 1 matrix, while a row vector of the same length is a 1 × k matrix. You can also think of larger matrices as being made up of a collection of row or column vectors. For example, writing ai for the ith row of A,
$$A = \begin{pmatrix} a_1 \\ a_2 \\ \vdots \\ a_m \end{pmatrix}$$
Matrix Addition: The sum of two m × n matrices A and B is
$$A + B = \begin{pmatrix} a_{11}+b_{11} & a_{12}+b_{12} & \cdots & a_{1n}+b_{1n} \\ a_{21}+b_{21} & a_{22}+b_{22} & \cdots & a_{2n}+b_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1}+b_{m1} & a_{m2}+b_{m2} & \cdots & a_{mn}+b_{mn} \end{pmatrix}$$
Note that matrices A and B must be the same size, in which case they are conformable for
addition.
Example:
$$A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}, \qquad B = \begin{pmatrix} 1 & 2 & 1 \\ 2 & 1 & 2 \end{pmatrix}, \qquad A + B = \begin{pmatrix} 2 & 4 & 4 \\ 6 & 6 & 8 \end{pmatrix}$$
Scalar Multiplication: The product of a scalar s and an m × n matrix A is
$$sA = s \begin{pmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ \vdots & \vdots & \ddots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{pmatrix} = \begin{pmatrix} sa_{11} & sa_{12} & \cdots & sa_{1n} \\ \vdots & \vdots & \ddots & \vdots \\ sa_{m1} & sa_{m2} & \cdots & sa_{mn} \end{pmatrix}$$
Example:
$$s = 2, \qquad A = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \end{pmatrix}, \qquad sA = \begin{pmatrix} 2 & 4 & 6 \\ 8 & 10 & 12 \end{pmatrix}$$
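Both operations are element-wise, so they map directly onto NumPy arrays; a quick sketch reproducing the two examples above (NumPy itself is an addition for illustration):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])
B = np.array([[1, 2, 1],
              [2, 1, 2]])

# Addition requires conformable (same-size) matrices.
print(A + B)   # [[2 4 4]
               #  [6 6 8]]

# Scalar multiplication scales every entry.
print(2 * A)   # [[ 2  4  6]
               #  [ 8 10 12]]
```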
Matrix Multiplication: The (i, j)th entry of the product AB is the dot product of the ith row of A and the jth column of B. Examples:
1. $$\begin{pmatrix} a & b \\ c & d \\ e & f \end{pmatrix} \begin{pmatrix} A & B \\ C & D \end{pmatrix} = \begin{pmatrix} aA + bC & aB + bD \\ cA + dC & cB + dD \\ eA + fC & eB + fD \end{pmatrix}$$
2. $$\begin{pmatrix} 1 & 2 & -1 \\ 3 & 1 & 4 \end{pmatrix} \begin{pmatrix} -2 & 5 \\ 4 & -3 \\ 2 & 1 \end{pmatrix} = \begin{pmatrix} 1(-2)+2(4)-1(2) & 1(5)+2(-3)-1(1) \\ 3(-2)+1(4)+4(2) & 3(5)+1(-3)+4(1) \end{pmatrix} = \begin{pmatrix} 4 & -2 \\ 6 & 16 \end{pmatrix}$$
Note that the number of columns of the first matrix must equal the number of rows of the
second matrix, in which case they are conformable for multiplication. The sizes of the
matrices (including the resulting product) must be
$$(m \times k)(k \times n) = (m \times n)$$
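As a numerical check of the worked product (the matrices below follow example 2 above):

```python
import numpy as np

A = np.array([[1, 2, -1],
              [3, 1,  4]])    # 2 x 3
B = np.array([[-2,  5],
              [ 4, -3],
              [ 2,  1]])      # 3 x 2

# (2 x 3)(3 x 2) -> 2 x 2: entry (i, j) is the dot product of
# row i of A with column j of B.
print(A @ B)   # [[ 4 -2]
               #  [ 6 16]]
```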
Laws of Matrix Algebra:
1. Associative:
(A + B) + C = A + (B + C)
(AB)C = A(BC)
2. Commutative:
A+B=B+A
3. Distributive:
A(B + C) = AB + AC
(A + B)C = AC + BC
The commutative law for multiplication does not hold: the order of multiplication matters, and in general
$$AB \neq BA$$
Example:
$$A = \begin{pmatrix} 1 & 2 \\ -1 & 3 \end{pmatrix}, \qquad B = \begin{pmatrix} 2 & 1 \\ 0 & 1 \end{pmatrix}$$
$$AB = \begin{pmatrix} 2 & 3 \\ -2 & 2 \end{pmatrix}, \qquad BA = \begin{pmatrix} 1 & 7 \\ -1 & 3 \end{pmatrix}$$
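The same point can be made numerically (a NumPy sketch, not part of the original notes, using the pair of matrices from the example above):

```python
import numpy as np

A = np.array([[ 1, 2],
              [-1, 3]])
B = np.array([[2, 1],
              [0, 1]])

print(A @ B)                          # [[ 2  3]
                                      #  [-2  2]]
print(B @ A)                          # [[ 1  7]
                                      #  [-1  3]]
print(np.array_equal(A @ B, B @ A))   # False: AB != BA in general
```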
Transpose: The transpose of an m × n matrix A is the n × m matrix AT (also written A′) obtained by interchanging the rows and columns of A. Examples:
1. $$A = \begin{pmatrix} 4 & 2 & 3 \\ 0 & 5 & 1 \end{pmatrix}, \qquad A^T = \begin{pmatrix} 4 & 0 \\ 2 & 5 \\ 3 & 1 \end{pmatrix}$$
2. $$B = \begin{pmatrix} 2 \\ 1 \\ 3 \end{pmatrix}, \qquad B^T = \begin{pmatrix} 2 & 1 & 3 \end{pmatrix}$$
An important property of the transpose is that $(AB)^T = B^T A^T$. Example:
$$A = \begin{pmatrix} 1 & 3 & 2 \\ 2 & -1 & 3 \end{pmatrix}, \qquad B = \begin{pmatrix} 0 & 1 \\ 2 & 2 \\ 3 & -1 \end{pmatrix}$$
$$(AB)^T = \left[ \begin{pmatrix} 1 & 3 & 2 \\ 2 & -1 & 3 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ 2 & 2 \\ 3 & -1 \end{pmatrix} \right]^T = \begin{pmatrix} 12 & 7 \\ 5 & -3 \end{pmatrix}$$
$$B^T A^T = \begin{pmatrix} 0 & 2 & 3 \\ 1 & 2 & -1 \end{pmatrix} \begin{pmatrix} 1 & 2 \\ 3 & -1 \\ 2 & 3 \end{pmatrix} = \begin{pmatrix} 12 & 7 \\ 5 & -3 \end{pmatrix}$$
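The identity $(AB)^T = B^T A^T$ is easy to spot-check numerically (a NumPy sketch using the matrices from the example above):

```python
import numpy as np

A = np.array([[1,  3, 2],
              [2, -1, 3]])
B = np.array([[0,  1],
              [2,  2],
              [3, -1]])

# Transposing a product reverses the order of the factors.
print((A @ B).T)                              # [[12  7]
                                              #  [ 5 -3]]
print(np.array_equal((A @ B).T, B.T @ A.T))   # True
```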
Square Matrices
Square matrices have the same number of rows and columns; a k k square matrix is referred
to as a matrix of order k.
The diagonal of a square matrix is the vector of matrix elements that have the same subscripts. If A is a square matrix of order k, then its diagonal is [a11, a22, . . . , akk]′.
Trace: The trace of a square matrix A is the sum of the diagonal elements:
$$\operatorname{tr}(A) = a_{11} + a_{22} + \cdots + a_{kk}$$
Properties of the trace operator: If A and B are square matrices of order k and c is a scalar, then
1. tr(cA) = c tr(A)
2. tr(A + B) = tr(A) + tr(B)
3. tr(A′) = tr(A)
4. tr(AB) = tr(BA)
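The standard trace identities (linearity in scalars and sums, invariance under transpose, and tr(AB) = tr(BA)) can be spot-checked numerically; the matrices here are arbitrary illustrations:

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[0, 5],
              [6, 7]])
c = 3

print(np.trace(A))                                    # 1 + 4 = 5
print(np.trace(c * A) == c * np.trace(A))             # True
print(np.trace(A + B) == np.trace(A) + np.trace(B))   # True
print(np.trace(A.T) == np.trace(A))                   # True
print(np.trace(A @ B) == np.trace(B @ A))             # True
```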
Special types of square matrices:
1. Symmetric Matrix: A square matrix A is symmetric if A = A′, i.e. aij = aji for all i and j. Examples:
$$A = \begin{pmatrix} 1 & 2 \\ 2 & 1 \end{pmatrix} = A', \qquad B = \begin{pmatrix} 4 & 2 & -1 \\ 2 & 1 & 3 \\ -1 & 3 & 1 \end{pmatrix} = B'$$
2. Diagonal Matrix: A square matrix A is diagonal if all of its off-diagonal entries are zero (aij = 0 for all i ≠ j). Examples:
$$A = \begin{pmatrix} 1 & 0 \\ 0 & 2 \end{pmatrix}, \qquad B = \begin{pmatrix} 4 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$
3. Triangular Matrix: A matrix is triangular in one of two cases. If all entries below the diagonal are zero (aij = 0 for all i > j), it is upper triangular. Conversely, if all entries above the diagonal are zero (aij = 0 for all i < j), it is lower triangular.
Examples:
$$A_{LT} = \begin{pmatrix} 1 & 0 & 0 \\ 4 & 2 & 0 \\ 3 & 2 & 5 \end{pmatrix}, \qquad A_{UT} = \begin{pmatrix} 1 & 7 & 4 \\ 0 & 3 & 9 \\ 0 & 0 & 3 \end{pmatrix}$$
4. Identity Matrix: The n n identity matrix In is the matrix whose diagonal elements
are 1 and all off-diagonal elements are 0. Examples:
$$I_2 = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}, \qquad I_3 = \begin{pmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}$$
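The defining feature of the identity is that multiplying by a conformable identity leaves a matrix unchanged: $I_m A = A$ and $A I_n = A$. A short NumPy sketch (the matrix A is an arbitrary illustration):

```python
import numpy as np

A = np.array([[4, 2, 3],
              [0, 5, 1]])       # 2 x 3

# np.eye(n) builds the n x n identity matrix I_n.
I2 = np.eye(2, dtype=int)
I3 = np.eye(3, dtype=int)

print(np.array_equal(I2 @ A, A))   # True: I_2 A = A
print(np.array_equal(A @ I3, A))   # True: A I_3 = A
```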
Linear Equations
Linear Equation: $a_1 x_1 + a_2 x_2 + \cdots + a_n x_n = b$
The ai are parameters or coefficients; the xi are variables or unknowns. The equation is linear because each term contains at most one variable, and each variable appears to at most the first power.
The set of solutions of a linear equation forms:
1. In R2: a line,
$$x_2 = \frac{b}{a_2} - \frac{a_1}{a_2} x_1$$
2. In R3: a plane,
$$x_3 = \frac{b}{a_3} - \frac{a_1}{a_3} x_1 - \frac{a_2}{a_3} x_2$$
3. In Rn: a hyperplane
System of m linear equations in n unknowns:
$$\begin{aligned} a_{11} x_1 + a_{12} x_2 + \cdots + a_{1n} x_n &= b_1 \\ a_{21} x_1 + a_{22} x_2 + \cdots + a_{2n} x_n &= b_2 \\ &\;\;\vdots \\ a_{m1} x_1 + a_{m2} x_2 + \cdots + a_{mn} x_n &= b_m \end{aligned}$$
A system of linear equations has one of three kinds of solution sets:
1. One solution
2. No solution
3. Infinite solutions
Method of Substitution
Procedure:
1. Solve one equation for one variable, say x1 , in terms of the other variables in the equation.
2. Substitute the expression for x1 into the other m − 1 equations, resulting in a new system of m − 1 equations in n − 1 unknowns.
3. Repeat steps 1 and 2 until we are left with one equation in one unknown, say xn. We now have a value for xn.
4. Back-substitute the value of xn to recover the values of the remaining variables.
Gaussian Elimination
Multiplying Equations by a Constant: Starting from the true equation 4 = 4 and multiplying both sides by 2 yields another true equation, 8 = 8. More generally, we can multiply both sides of any equation by a nonzero constant and maintain an equivalent equation. For example, the following two equations are equivalent:
a11 x1 + a12 x2 = b1
ca11 x1 + ca12 x2 = cb1
Adding Equations: Suppose we had the following two very simple equations:
3 = 3
7 = 7
If we add these two equations to each other, we get
7+3=7+3
10 = 10
Repeatedly applying these operations reduces a system of linear equations to the following form:
$$\begin{aligned} a'_{11} x_1 + a'_{12} x_2 + a'_{13} x_3 + \cdots + a'_{1n} x_n &= b'_1 \\ a'_{22} x_2 + a'_{23} x_3 + \cdots + a'_{2n} x_n &= b'_2 \\ a'_{33} x_3 + \cdots + a'_{3n} x_n &= b'_3 \\ &\;\;\vdots \\ a'_{mn} x_n &= b'_m \end{aligned}$$
where a′ij denotes the coefficient of the jth unknown in the ith equation after the above transformation. Note that at each stage of the elimination process, we want to change some coefficient of our system to 0 by adding a multiple of an earlier equation to the given equation.
The coefficients a′11, a′22, etc. are referred to as pivots, since they are the terms used to eliminate the variables in the rows below them in their respective columns. Once the linear system is in the above reduced form, we then use back substitution to find the values of the xj's.
As we'll see, pivots don't need to be on the diagonal (i = j). Additionally, sometimes when we pivot, we will eliminate variables in rows above a pivot.
Exercises:
1. Using Gaussian elimination, solve
$$\begin{aligned} x - 3y &= -3 \\ 2x + y &= 8 \end{aligned}$$
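The procedure above, forward elimination followed by back substitution, can be sketched in a few lines of NumPy; this is a minimal sketch assuming a square system with nonzero pivots (no row swaps), applied to the system x − 3y = −3, 2x + y = 8:

```python
import numpy as np

def gaussian_solve(A, b):
    """Solve Ax = b by Gaussian elimination with back substitution.

    Minimal sketch: assumes a square system whose pivots are nonzero,
    so no row interchanges are needed.
    """
    M = np.hstack([A.astype(float), b.reshape(-1, 1).astype(float)])
    n = len(b)
    # Forward elimination: zero out the entries below each pivot.
    for i in range(n):
        for j in range(i + 1, n):
            M[j] -= (M[j, i] / M[i, i]) * M[i]
    # Back substitution: solve from the last equation upward.
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        x[i] = (M[i, -1] - M[i, i + 1:n] @ x[i + 1:]) / M[i, i]
    return x

# Exercise:  x - 3y = -3,  2x + y = 8
A = np.array([[1, -3],
              [2,  1]])
b = np.array([-3, 8])
print(gaussian_solve(A, b))   # [3. 2.]
```

Eliminating x from the second equation gives 7y = 14, so y = 2 and, back-substituting, x = 3.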
Gauss-Jordan Elimination
Instead of stopping at the reduced form and back-substituting, we can continue eliminating entries above the pivots until each equation contains exactly one variable:
$$\begin{aligned} x_1 &= b''_1 \\ x_2 &= b''_2 \\ x_3 &= b''_3 \\ &\;\;\vdots \\ x_n &= b''_m \end{aligned}$$
where the b′′i denote the transformed right-hand sides, so the solution can be read off directly.
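The full reduction can be sketched as code; this is a minimal NumPy sketch assuming a square system with nonzero pivots, reusing the exercise system x − 3y = −3, 2x + y = 8:

```python
import numpy as np

def gauss_jordan(A, b):
    """Gauss-Jordan elimination: reduce the augmented matrix until each
    variable appears alone with coefficient 1.

    Minimal sketch: assumes a square system with nonzero pivots.
    """
    M = np.hstack([A.astype(float), b.reshape(-1, 1).astype(float)])
    n = len(b)
    for i in range(n):
        M[i] /= M[i, i]                 # scale the pivot row so the pivot is 1
        for j in range(n):
            if j != i:
                M[j] -= M[j, i] * M[i]  # eliminate above AND below the pivot
    return M[:, -1]                     # last column now holds the solution

A = np.array([[1, -3],
              [2,  1]])
b = np.array([-3, 8])
print(gauss_jordan(A, b))   # [3. 2.]
```

Because elimination proceeds both above and below each pivot, no back substitution step is needed: the final augmented column is the solution itself.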