EE263, Stanford University Stephen Boyd and Sanjay Lall

Linear algebra review

- norm, inner product
- subspaces, halfspaces
- independence, basis, dimension
- invertible matrices
(Euclidean) norm

for x ∈ R^n we define the (Euclidean) norm as

  ‖x‖ = √(x_1^2 + x_2^2 + ⋯ + x_n^2) = √(x^T x)

‖x‖ measures length of vector (from origin)

important properties:

- ‖αx‖ = |α| ‖x‖ (homogeneity)
- ‖x + y‖ ≤ ‖x‖ + ‖y‖ (triangle inequality)
- ‖x‖ ≥ 0 (nonnegativity)
- ‖x‖ = 0 ⟺ x = 0 (definiteness)
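a quick numerical check of the definition and the properties above (a minimal NumPy sketch; the example vectors are arbitrary):

```python
import numpy as np

# an arbitrary vector in R^3
x = np.array([3.0, -4.0, 12.0])

# Euclidean norm: sqrt(x_1^2 + ... + x_n^2) = sqrt(x^T x)
norm_x = np.sqrt(x @ x)
print(norm_x)                       # 13.0
print(np.linalg.norm(x))            # same value, via the library routine

# homogeneity: ||a x|| = |a| ||x||
a = -2.5
assert np.isclose(np.linalg.norm(a * x), abs(a) * norm_x)

# triangle inequality: ||x + y|| <= ||x|| + ||y||
y = np.array([1.0, 2.0, 2.0])
assert np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y)
```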
RMS value and (Euclidean) distance

root-mean-square (RMS) value of vector x ∈ R^n:

  rms(x) = ((1/n) Σ_{i=1}^n x_i^2)^{1/2} = ‖x‖/√n

norm defines distance between vectors: dist(x, y) = ‖x − y‖

(figure: vectors x and y drawn from the origin, with x − y the displacement between them)
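the same quantities in NumPy (a sketch; the example vectors are arbitrary):

```python
import numpy as np

x = np.array([1.0, -1.0, 1.0, -1.0])
n = x.size

# rms(x) = ((1/n) * sum of x_i^2)^(1/2) = ||x|| / sqrt(n)
rms = np.sqrt(np.mean(x**2))
assert np.isclose(rms, np.linalg.norm(x) / np.sqrt(n))
print(rms)    # 1.0 for this +/-1 vector

# distance between vectors: dist(x, y) = ||x - y||
y = np.array([1.0, 1.0, 1.0, 1.0])
dist = np.linalg.norm(x - y)
print(dist)   # ||(0, -2, 0, -2)|| = sqrt(8)
```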
Inner product

⟨x, y⟩ := x_1 y_1 + x_2 y_2 + ⋯ + x_n y_n = x^T y

important properties:

- ⟨αx, y⟩ = α⟨x, y⟩
- ⟨x + y, z⟩ = ⟨x, z⟩ + ⟨y, z⟩
- ⟨x, y⟩ = ⟨y, x⟩
- ⟨x, x⟩ ≥ 0
- ⟨x, x⟩ = 0 ⟺ x = 0

f(y) = ⟨x, y⟩ is a linear function f : R^n → R, with the linear map defined by the row vector x^T
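a sketch of the inner product and the properties above (arbitrary example vectors):

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([4.0, -5.0, 6.0])
z = np.array([1.0, 1.0, 1.0])

# <x, y> = x_1 y_1 + ... + x_n y_n = x^T y
ip = x @ y
print(ip)    # 4 - 10 + 18 = 12.0

# symmetry and additivity
assert np.isclose(x @ y, y @ x)
assert np.isclose((x + y) @ z, x @ z + y @ z)

# f(y) = <x, y> is a linear function of y
a = 2.0
assert np.isclose(x @ (a * y + z), a * (x @ y) + x @ z)
```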
Cauchy-Schwarz inequality and angle between vectors

- for any x, y ∈ R^n, |x^T y| ≤ ‖x‖ ‖y‖

- (unsigned) angle between vectors in R^n defined as

  θ = ∠(x, y) = cos^{-1}( x^T y / (‖x‖ ‖y‖) )

(figure: x, y, and the projection (x^T y / ‖y‖^2) y of x onto the line through y, with θ the angle between x and y)

thus x^T y = ‖x‖ ‖y‖ cos θ


special cases:

- x and y are aligned: θ = 0; x^T y = ‖x‖ ‖y‖;
  (if x ≠ 0) y = αx for some α ≥ 0
- x and y are opposed: θ = π; x^T y = −‖x‖ ‖y‖;
  (if x ≠ 0) y = αx for some α ≤ 0
- x and y are orthogonal: θ = π/2; x^T y = 0,
  denoted x ⊥ y
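the angle formula and the special cases can be checked numerically (a NumPy sketch; the vectors are chosen for convenience):

```python
import numpy as np

def angle(x, y):
    """Unsigned angle between vectors: arccos(x^T y / (||x|| ||y||))."""
    return np.arccos((x @ y) / (np.linalg.norm(x) * np.linalg.norm(y)))

x = np.array([1.0, 0.0])
y = np.array([1.0, 1.0])

# Cauchy-Schwarz: |x^T y| <= ||x|| ||y||
assert abs(x @ y) <= np.linalg.norm(x) * np.linalg.norm(y)

theta = angle(x, y)
print(np.degrees(theta))    # approximately 45 degrees

# aligned, opposed, and orthogonal special cases
assert np.isclose(angle(x, 3.0 * x), 0.0)                       # aligned: theta = 0
assert np.isclose(angle(x, -2.0 * x), np.pi)                    # opposed: theta = pi
assert np.isclose(angle(x, np.array([0.0, 1.0])), np.pi / 2)    # orthogonal
```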
interpretation of x^T y > 0 and x^T y < 0:

- x^T y > 0 means ∠(x, y) is acute
- x^T y < 0 means ∠(x, y) is obtuse

(figure: two pairs of vectors x, y, one with x^T y > 0 (acute angle) and one with x^T y < 0 (obtuse angle))
Linear functionals

- a function f : R^n → R is called a functional

- for a ∈ R^n, the function f(x) = a^T x is a linear functional
- every linear functional has this form
- if a ≠ 0, then for any α ∈ R the set H_α = {x ∈ R^n | a^T x = α} is called a hyperplane
- hyperplanes H_α and H_β are parallel; H_0 passes through the origin
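a small sketch of a linear functional and its hyperplanes (the vector a and the example points are arbitrary choices):

```python
import numpy as np

# linear functional f(x) = a^T x on R^3
a = np.array([1.0, 2.0, -1.0])
def f(x):
    return a @ x

# hyperplane H_alpha = {x | a^T x = alpha}
alpha = 3.0
x1 = np.array([3.0, 1.0, 2.0])
assert np.isclose(f(x1), alpha)        # x1 lies on H_3

# H_0 passes through the origin, and H_alpha is a parallel shift of it
v = np.array([2.0, -1.0, 0.0])
assert np.isclose(f(v), 0.0)           # v lies on H_0
assert np.isclose(f(v + x1), alpha)    # shifting a point of H_0 by x1 lands on H_alpha
```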
Halfspaces

- define a halfspace with outward normal vector y, and boundary passing through 0, by

  H = {x | x^T y ≤ 0}

- H is all vectors x that make an obtuse or right angle with y

(figure: the halfspace {x | x^T y ≤ 0}, bounded by the hyperplane through 0 with normal y)
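a membership test for such a halfspace (a sketch; the in_halfspace helper is illustrative, not a library function):

```python
import numpy as np

# halfspace H = {x | x^T y <= 0} with outward normal y
y = np.array([1.0, 1.0])

def in_halfspace(x):
    return x @ y <= 0

# vectors making an obtuse or right angle with y are in H
assert in_halfspace(np.array([-1.0, -2.0]))   # obtuse angle with y
assert in_halfspace(np.array([1.0, -1.0]))    # right angle: x^T y = 0
assert not in_halfspace(y)                    # y itself points outward
```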
Subspaces

A set S ⊆ R^n is called a subspace if

  x + y ∈ S for all x, y ∈ S

and

  αx ∈ S for all α ∈ R and x ∈ S

- we say S is closed under addition and closed under scalar multiplication

- geometrically, S is a flat set which passes through the origin
Examples of subspaces

- S_1 = R^n, i.e., the entire vector space is considered a subspace of itself

- S_2 = {0}, the origin, is the smallest subspace of R^n
- the span of a set of vectors v_1, v_2, …, v_k ∈ R^n is a subspace

  span(v_1, v_2, …, v_k) = {α_1 v_1 + ⋯ + α_k v_k | α_i ∈ R}

  the set of all linear combinations of the vectors

- the sum of two subspaces is a subspace

  S + T = {x + y | x ∈ S, y ∈ T}
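one way to test span membership numerically is via least squares (a sketch; the in_span helper and its tolerance are illustrative choices):

```python
import numpy as np

# span of two vectors in R^3 (a plane through the origin)
v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
V = np.column_stack([v1, v2])

def in_span(x, V, tol=1e-10):
    # least-squares coefficients c for V c ~ x; x is in span(V) iff the residual is 0
    c, *_ = np.linalg.lstsq(V, x, rcond=None)
    return np.linalg.norm(V @ c - x) < tol

assert in_span(2 * v1 - 3 * v2, V)               # any linear combination is in the span
assert not in_span(np.array([0.0, 0.0, 1.0]), V)
```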
Orthogonal subspaces

- two subspaces S, T ⊆ R^n are called orthogonal if

  x^T y = 0 for all x ∈ S, y ∈ T

- for any set S ⊆ R^n, the orthogonal complement is

  S^⊥ = {x | x^T y = 0 for all y ∈ S}

- S^⊥ is always a subspace, even if S is not

- S^⊥ is the set of all vectors x, each of which is orthogonal to every vector in S
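a numerical basis for S^⊥ can be read off from the SVD (a sketch, assuming S is given as the span of the columns of a matrix V; the left singular vectors beyond the rank span the complement):

```python
import numpy as np

# S = span of the columns of V (a plane in R^3)
V = np.column_stack([[1.0, 0.0, 1.0], [0.0, 1.0, 1.0]])

# left singular vectors with zero singular value span S-perp
U, s, _ = np.linalg.svd(V)
rank = int(np.sum(s > 1e-10))
S_perp = U[:, rank:]           # columns orthogonal to every column of V

# every vector in S-perp is orthogonal to every vector in S
assert np.allclose(V.T @ S_perp, 0)
print(S_perp.ravel())
```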
Independent set of vectors

a set of vectors {v_1, v_2, …, v_k} is independent if

  α_1 v_1 + α_2 v_2 + ⋯ + α_k v_k = 0  ⟹  α_1 = α_2 = ⋯ = α_k = 0

some equivalent conditions:

- coefficients of v = α_1 v_1 + α_2 v_2 + ⋯ + α_k v_k are uniquely determined, i.e.,

  α_1 v_1 + α_2 v_2 + ⋯ + α_k v_k = β_1 v_1 + β_2 v_2 + ⋯ + β_k v_k

  implies α_1 = β_1, α_2 = β_2, …, α_k = β_k

- no vector v_i can be expressed as a linear combination of the other vectors v_1, …, v_{i−1}, v_{i+1}, …, v_k
- no one vector v_i is in the span of the others
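numerically, independence can be tested by stacking the vectors as columns and checking for full column rank (a sketch; the example vectors are arbitrary):

```python
import numpy as np

v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([1.0, 1.0, 0.0])
v3 = v1 + 2 * v2               # dependent on v1, v2 by construction

def independent(*vs):
    # the vectors are independent iff the matrix with them as columns has full column rank
    A = np.column_stack(vs)
    return np.linalg.matrix_rank(A) == len(vs)

assert independent(v1, v2)
assert not independent(v1, v2, v3)
```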
Basis and dimension

set of vectors {v_1, v_2, …, v_k} is called a basis for a subspace S if

  S = span(v_1, v_2, …, v_k)

and

  {v_1, v_2, …, v_k} is independent

- equivalently, every v ∈ S can be uniquely expressed as

  v = α_1 v_1 + ⋯ + α_k v_k

- for a given subspace S, the number of vectors in any basis is the same; it is called the dimension of S, denoted d = dim S

- any set of independent vectors in S has no more than d elements

- any set of vectors that spans S has at least d elements
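the dimension of a span can be computed as a matrix rank (a sketch; the columns are an arbitrary example):

```python
import numpy as np

# dim span(v_1, ..., v_k) = rank of the matrix with columns v_1, ..., v_k
V = np.column_stack([
    [1.0, 0.0, 1.0],
    [0.0, 1.0, 1.0],
    [1.0, 1.0, 2.0],   # sum of the first two columns, so it adds nothing to the span
])
d = np.linalg.matrix_rank(V)
print(d)   # 2: any two of these three columns form a basis of the span
```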
Invertibility

A square matrix A ∈ R^{n×n} is called invertible if there is a matrix B ∈ R^{n×n} such that

  BA = I

- B is called the inverse of A, written A^{-1}

- we have AA^{-1} = A^{-1}A = I
- A is invertible iff it has linearly independent columns
Interpretations of inverse

suppose A ∈ R^{n×n} has inverse B = A^{-1}

- mapping associated with B undoes mapping associated with A (applied either before or after!)
- x = By is a perfect (pre- or post-) equalizer for the channel y = Ax
- x = By is the unique solution of Ax = y
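a quick check of these facts in NumPy (a sketch with an arbitrary invertible A; in practice one solves Ax = y directly rather than forming the inverse):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 1.0]])     # independent columns, so invertible

B = np.linalg.inv(A)
assert np.allclose(B @ A, np.eye(2))
assert np.allclose(A @ B, np.eye(2))   # the inverse works on both sides

# x = By is the unique solution of Ax = y
y = np.array([3.0, 2.0])
x = B @ y
assert np.allclose(A @ x, y)
print(x)   # np.linalg.solve(A, y) gives the same result more accurately
```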
Change of coordinates

- standard basis vectors in R^n are e_1, e_2, …, e_n, where e_i = (0, …, 0, 1, 0, …, 0) has a 1 in the ith component and 0 elsewhere

- for any x we have

  x = x_1 e_1 + x_2 e_2 + ⋯ + x_n e_n

  where the x_i are called the coordinates of x (in the standard basis)

- if t_1, t_2, …, t_n is another basis for R^n, we have

  x = x̃_1 t_1 + x̃_2 t_2 + ⋯ + x̃_n t_n

  where the x̃_i are the coordinates of x in the basis t_1, t_2, …, t_n

- with T = [t_1 t_2 ⋯ t_n], the matrix whose columns are the basis vectors, we then have x = T x̃ and x̃ = T^{-1} x
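a numerical example of the coordinate change (the basis t_1, t_2 is an arbitrary choice):

```python
import numpy as np

# another basis t1, t2 for R^2, stacked as the columns of T
t1 = np.array([1.0, 1.0])
t2 = np.array([1.0, -1.0])
T = np.column_stack([t1, t2])

x = np.array([3.0, 1.0])

# coordinates of x in the basis t1, t2: x_tilde = T^{-1} x
x_tilde = np.linalg.solve(T, x)
print(x_tilde)    # [2. 1.]: x = 2*t1 + 1*t2

# and back: x = T x_tilde
assert np.allclose(T @ x_tilde, x)
assert np.allclose(x_tilde[0] * t1 + x_tilde[1] * t2, x)
```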
Similarity transformation

consider the linear transformation y = Ax, A ∈ R^{n×n}

express y and x in terms of t_1, t_2, …, t_n, so x = T x̃ and y = T ỹ; then

  ỹ = (T^{-1} A T) x̃

- A → T^{-1} A T is called a similarity transformation

- similarity transformation by T expresses the linear transformation y = Ax in the coordinates t_1, t_2, …, t_n
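a numerical check of the similarity transformation (arbitrary A and invertible T):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
T = np.array([[1.0, 1.0],
              [1.0, -1.0]])        # columns are the new basis t1, t2

A_tilde = np.linalg.inv(T) @ A @ T  # similarity transformation of A

# the same linear map, expressed in the t-coordinates
x = np.array([2.0, -1.0])
x_tilde = np.linalg.solve(T, x)     # coordinates of x in the new basis
y = A @ x
y_tilde = np.linalg.solve(T, y)     # coordinates of y = Ax in the new basis
assert np.allclose(A_tilde @ x_tilde, y_tilde)
```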
