
Linear Transformations

Linear Algebra
MATH 2010

• Functions in College Algebra: Recall in college algebra, functions are denoted by

f (x) = y

where f : dom(f ) → range(f ).


• Mappings: In Linear Algebra, we have a similar notion, called a map:

T :V →W

where V is the domain of T and W is the codomain of T where both V and W are vector spaces.

• Terminology: If
T (v) = w
then
– w is called the image of v under the mapping T
– v is called the preimage of w
– the set of all images of vectors in V is called the range of T
• Example: Let
T ([v1 , v2 ]) = [2v2 − v1 , v1 , v2 ]
then T : R^2 → R^3.
– Find the image of v = [0, 6].

T ([0, 6]) = [2(6) − 0, 0, 6] = [12, 0, 6]

– Find the preimage of w = [3, 1, 2].

[3, 1, 2] = [2v2 − v1, v1, v2]

which means
2v2 − v1 = 3
v1 = 1
v2 = 2
So, v = [1, 2].
• Example: Let
T ([v1 , v2 , v3 ]) = [2v1 + v2 , v1 − v2 ]
Then T : R^3 → R^2.
– Find the image of v = [2, 1, 4]:

T ([2, 1, 4]) = [2(2) + 1, 2 − 1] = [5, 1]

– Find the preimage of w = [−1, 2]

[−1, 2] = [2v1 + v2 , v1 − v2 ]

This leads to
2v1 + v2 = −1
v1 − v2 = 2
Recall that you are looking for v = [v1 , v2 , v3 ]. So, there are really 3 unknowns in the system:

2v1 + v2 + 0v3 = −1
v1 − v2 + 0v3 = 2

This leads to the solution


v = [1/3, −5/3, k]

where k is any real number.
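As a quick check, the preimage system can be solved with a computer algebra system. The sketch below is illustrative only (SymPy is assumed to be available, and the symbol names are arbitrary); v3 never appears in the equations, which is exactly why it remains the free parameter k.

```python
# Solve 2*v1 + v2 = -1 and v1 - v2 = 2 symbolically; v3 does not appear,
# so it stays a free parameter (the k in the solution above).
import sympy as sp

v1, v2 = sp.symbols('v1 v2')
sol = sp.solve([sp.Eq(2*v1 + v2, -1), sp.Eq(v1 - v2, 2)], [v1, v2], dict=True)
print(sol)  # [{v1: 1/3, v2: -5/3}]
```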
• Definition: Let V and W be vector spaces. The function T : V → W is called a linear transformation
of V into W if the following 2 properties are true for all u and v in V and for any scalar c:
1. T (u + v) = T (u) + T (v)
2. T (cu) = cT (u)
• Example: Determine whether T : R^3 → R^3 defined by

T ([x, y, z]) = [x + y, x − y, z]

is a linear transformation.
1. Let u = [x1 , y1 , z1 ] and v = [x2 , y2 , z2 ]. Then we want to prove T (u + v) = T (u) + T (v).

T (u + v) = T ([x1 , y1 , z1 ] + [x2 , y2 , z2 ])
= T ([x1 + x2 , y1 + y2 , z1 + z2 ])
= [x1 + x2 + y1 + y2 , x1 + x2 − (y1 + y2 ), z1 + z2 ]

and
T (u) + T (v) = T ([x1 , y1 , z1 ]) + T ([x2 , y2 , z2 ])
= [x1 + y1 , x1 − y1 , z1 ] + [x2 + y2 , x2 − y2 , z2 ]
= [x1 + y1 + x2 + y2 , x1 − y1 + x2 − y2 , z1 + z2 ]
= [x1 + x2 + y1 + y2 , x1 + x2 − (y1 + y2 ), z1 + z2 ]
Therefore, T (u + v) = T (u) + T (v).
2. We want to prove T (cu) = cT (u).

T (cu) = T (c[x1 , y1 , z1 ])
= T ([cx1 , cy1 , cz1 ])
= [cx1 + cy1 , cx1 − cy1 , cz1 ]

and
cT (u) = cT ([x1 , y1 , z1 ])
= c[x1 + y1 , x1 − y1 , z1 ]
= [c(x1 + y1 ), c(x1 − y1 ), cz1 ]
= [cx1 + cy1 , cx1 − cy1 , cz1 ]
So, T (cu) = cT (u).
Therefore, T is a linear transformation.
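A numerical spot-check can complement the proof above. The NumPy sketch below is an illustration, not a proof (random samples can only expose a violation, never establish linearity), but it tests both properties for this T.

```python
import numpy as np

def T(v):
    # T([x, y, z]) = [x + y, x - y, z]
    x, y, z = v
    return np.array([x + y, x - y, z])

rng = np.random.default_rng(0)
u, v, c = rng.standard_normal(3), rng.standard_normal(3), 2.5
print(np.allclose(T(u + v), T(u) + T(v)))  # True: additivity holds on this sample
print(np.allclose(T(c * u), c * T(u)))     # True: homogeneity holds on this sample
```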
• Example: Determine whether T : R^2 → R^2 defined by

T ([x, y]) = [x^2, y]

is a linear transformation.
1. Let u = [x1 , y1 ] and v = [x2 , y2 ]. Then we want to prove T (u + v) = T (u) + T (v).

T (u + v) = T ([x1 , y1 ] + [x2 , y2 ])
= T ([x1 + x2 , y1 + y2 ])
= [(x1 + x2)^2, y1 + y2]
= [x1^2 + 2x1x2 + x2^2, y1 + y2]

and
T (u) + T (v) = T ([x1 , y1 ]) + T ([x2 , y2 ])
= [x1^2, y1] + [x2^2, y2]
= [x1^2 + x2^2, y1 + y2]
Since T (u + v) ≠ T (u) + T (v), T is not a linear transformation. There is no need to test the
second criterion. However, you could have proved the same thing using the second criterion:
2. We would want to prove T (cu) = cT (u).

T (cu) = T (c[x1 , y1 ])
= T ([cx1 , cy1 ])
= [(cx1)^2, cy1]
= [c^2 x1^2, cy1]

and
cT (u) = cT ([x1 , y1 ])
= c[x1^2, y1]
= [cx1^2, cy1]
So, T (cu) ≠ cT (u) either. Thus, again, we would have shown that T was not a linear transformation.
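For the non-linear map, a single concrete counterexample also suffices. A minimal sketch (same NumPy setup as above; the test vector is arbitrary):

```python
import numpy as np

def T(v):
    # T([x, y]) = [x^2, y]
    x, y = v
    return np.array([x**2, y])

u = np.array([1.0, 0.0])
print(T(u + u))     # [4. 0.]
print(T(u) + T(u))  # [2. 0.] -> additivity fails, so T is not linear
```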
• Two Simple Linear Transformations:
– Zero Transformation: T : V → W such that T (v) = 0 for all v in V
– Identity Transformation: T : V → V such that T (v) = v for all v in V
• Theorem: Let T be a linear transformation from V into W , where u and v are in V . Then
1. T (0) = 0
2. T (−v) = −T (v)
3. T (u − v) = T (u) − T (v)
4. If
v = c1 v1 + c2 v2 + ... + cn vn
then
T (v) = c1 T (v1 ) + c2 T (v2 ) + ... + cn T (vn )
• Example: Let T : R^3 → R^3 such that

T ([1, 0, 0]) = [2, 4, −1] T ([0, 1, 0]) = [1, 3, −2] T ([0, 0, 1]) = [0, −2, 2]

Find T ([−2, 4, −1]). Since

[−2, 4, −1] = −2[1, 0, 0] + 4[0, 1, 0] − 1[0, 0, 1]

we can say

T ([−2, 4, −1]) = −2T ([1, 0, 0]) + 4T ([0, 1, 0]) − 1T ([0, 0, 1])
                = −2[2, 4, −1] + 4[1, 3, −2] − [0, −2, 2]
                = [0, 6, −8]
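The same computation can be phrased as a single matrix product: stack the images of the basis vectors and multiply by the coordinates. A short NumPy sketch (illustrative only; the coordinates of [−2, 4, −1] in the standard basis are just its entries):

```python
import numpy as np

# Rows are T([1,0,0]), T([0,1,0]), T([0,0,1]).
images = np.array([[2, 4, -1],
                   [1, 3, -2],
                   [0, -2, 2]])
coords = np.array([-2, 4, -1])
print(coords @ images)  # [ 0  6 -8]
```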
• Theorem: Let A be an m×n matrix. The function T defined by

T (v) = Av

is a linear transformation from R^n to R^m.


• Examples:

– If T (v) = Av where

A = [  1   2 ]
    [ −2   4 ]
    [ −2   2 ]

then T : R^2 → R^3.
– If T (v) = Av where

A = [ −1   2   1   3   4 ]
    [  0   0   2  −1   0 ]

then T : R^5 → R^2.
• Standard Matrix: Every linear transformation T : R^n → R^m has an m×n standard matrix A associated
with it where
T (v) = Av
To find the standard matrix, apply T to the standard basis elements of R^n. This produces vectors in R^m
which become the columns of A:

For example, let

T ([x1 , x2 , x3 ]) = [2x1 + x2 − x3 , −x1 + 3x2 − 2x3 , 3x2 + 4x3 ]

Then
T ([1, 0, 0]) = [2, −1, 0] T ([0, 1, 0]) = [1, 3, 3] T ([0, 0, 1]) = [−1, −2, 4]
These vectors become the columns of A:

A = [  2   1  −1 ]
    [ −1   3  −2 ]
    [  0   3   4 ]
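This recipe is easy to automate: apply T to each standard basis vector and stack the results as columns. A NumPy sketch (the function name T is only for illustration; the test vector is arbitrary):

```python
import numpy as np

def T(v):
    x1, x2, x3 = v
    return np.array([2*x1 + x2 - x3, -x1 + 3*x2 - 2*x3, 3*x2 + 4*x3])

# Columns of A are the images of the standard basis vectors of R^3.
A = np.column_stack([T(e) for e in np.eye(3)])
print(A)                         # [[ 2.  1. -1.] [-1.  3. -2.] [ 0.  3.  4.]]
v = np.array([1.0, 2.0, 3.0])
print(np.allclose(A @ v, T(v)))  # True: A reproduces T
```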
• Shortcut Method for Finding the Standard Matrix: Two examples:

1. Let T be the linear transformation from above, i.e.,

T ([x1 , x2 , x3 ]) = [2x1 + x2 − x3 , −x1 + 3x2 − 2x3 , 3x2 + 4x3 ]

Then the first, second and third components of the resulting vector w can be written respectively
as
w1 = 2x1 + x2 − x3
w2 = −x1 + 3x2 − 2x3
w3 = 3x2 + 4x3
Then the standard matrix A is given by the coefficient matrix of the right-hand side:

A = [  2   1  −1 ]
    [ −1   3  −2 ]
    [  0   3   4 ]

So,

[ w1 ]   [  2   1  −1 ] [ x1 ]
[ w2 ] = [ −1   3  −2 ] [ x2 ]
[ w3 ]   [  0   3   4 ] [ x3 ]
2. Example: Let
T ([x, y, z]) = [x − 2y, 2x + y]
Since T : R^3 → R^2, A is a 2×3 matrix:

w1 = x − 2y + 0z
w2 = 2x + y + 0z

So,

A = [ 1  −2   0 ]
    [ 2   1   0 ]

• Geometric Operators:

– Reflection Operators:
∗ Reflection about the y-axis: The schematic of reflection about the y-axis is given below. The
transformation is given by
w1 = −x
w2 = y
with standard matrix

A = [ −1   0 ]
    [  0   1 ]
∗ Reflection about the x-axis: The schematic of reflection about the x-axis is given below. The
transformation is given by
w1 = x
w2 = −y
with standard matrix

A = [ 1   0 ]
    [ 0  −1 ]

∗ Reflection about the line y = x: The schematic of reflection about the line y = x is given
below. The transformation is given by

w1 = y
w2 = x

with standard matrix

A = [ 0   1 ]
    [ 1   0 ]

– Projection Operators:
∗ Projected onto x-axis: The schematic of projection onto the x-axis is given below. The
transformation is given by
w1 = x
w2 = 0
with standard matrix

A = [ 1   0 ]
    [ 0   0 ]
∗ Projected onto y-axis: The schematic of projection onto the y-axis is given below. The
transformation is given by
w1 = 0
w2 = y
with standard matrix

A = [ 0   0 ]
    [ 0   1 ]

∗ In R^3, you can project onto a plane. The standard matrices for these projections are given below.
· Projection onto xy-plane:

A = [ 1   0   0 ]
    [ 0   1   0 ]
    [ 0   0   0 ]

· Projection onto xz-plane:

A = [ 1   0   0 ]
    [ 0   0   0 ]
    [ 0   0   1 ]

· Projection onto yz-plane:

A = [ 0   0   0 ]
    [ 0   1   0 ]
    [ 0   0   1 ]
– Rotation Operator: We can consider rotating through an angle θ.

If we look at a more detailed depiction of the rotation, we see how we can use trigonometric
identities to recover the standard matrix.

Using trigonometric identities, we have

x = r cos(φ)
y = r sin(φ)

and
w1 = r cos(θ + φ)
w2 = r sin(θ + φ)
Using trigonometric identities on w1 and w2 , we have

w1 = r cos(θ) cos(φ) − r sin(θ) sin(φ)


w2 = r sin(θ) cos(φ) + r cos(θ) sin(φ)

which equals
w1 = x cos(θ) − y sin(θ)
w2 = x sin(θ) + y cos(θ)
if we plug in the x and y formulas from above. Therefore, the standard matrix is given by

A = [ cos(θ)  −sin(θ) ]
    [ sin(θ)   cos(θ) ]
– Dilation and Contraction Operators: We can consider the geometric process of dilating
or contracting vectors. For example, in R^2, the contraction of a vector is given below where
0 < k < 1.

If
∗ 0 < k < 1, we have contraction and
∗ k > 1, we have dilation
In each case, the standard matrix is given by

A = [ k   0 ]
    [ 0   k ]

In R^3, we have the standard matrix

A = [ k   0   0 ]
    [ 0   k   0 ]
    [ 0   0   k ]
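All of the geometric operators above are just matrix products. The NumPy sketch below is illustrative only (the sample vector, angle, and scale factor are arbitrary choices, not from the notes); it applies a rotation, the reflection about y = x, the projection onto the x-axis, and a contraction to the vector [2, 1].

```python
import numpy as np

theta = np.pi / 2                                   # rotate by 90 degrees
rotate = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
reflect = np.array([[0, 1], [1, 0]])                # reflection about the line y = x
project = np.array([[1, 0], [0, 0]])                # projection onto the x-axis
contract = 0.5 * np.eye(2)                          # contraction with k = 0.5

v = np.array([2.0, 1.0])
print(rotate @ v)    # approximately [-1, 2]
print(reflect @ v)   # [1. 2.]
print(project @ v)   # [2. 0.]
print(contract @ v)  # [1.  0.5]
```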

• One-to-One linear transformations: In college algebra, we could perform a horizontal line test to
determine if a function was one-to-one, i.e., to determine if an inverse function exists. Similarly, we
say a linear transformation T : R^n → R^m is one-to-one if T maps distinct vectors in R^n into distinct
vectors in R^m. In other words, a linear transformation T : R^n → R^m is one-to-one if for every w in
the range of T, there is exactly one v in R^n such that T (v) = w.
• Examples:
1. The rotation operator is one-to-one, because there is only one vector v which can be rotated
through an angle θ to get any vector w.
2. The projection operator is not one-to-one. For example, both [2, 4] and [2, −1] can be projected
onto the x-axis and result in the vector [2, 0].
• Linear system equivalent statements: Recall that for a linear system, the following are equivalent
statements:
1. A is invertible
2. Ax = b is consistent for every n×1 matrix b
3. Ax = b has exactly one solution for every n×1 matrix b

• Recall that for every linear transformation T : R^n → R^m, we can represent the linear transformation
as
T (v) = Av
where A is the m×n standard matrix associated with T. Using the above equivalent statements with
this form of the linear transformation, we have the following theorem.
• Theorem: If A is an n×n matrix and T : R^n → R^n is given by
T (v) = Av
then the following are equivalent.
1. A is invertible
2. For every w in R^n, there is some vector v in R^n such that T (v) = w, i.e., the range of T is R^n.
3. For every w in R^n, there is a unique vector v in R^n such that T (v) = w, i.e., T is one-to-one.
• Examples:
1. Rotation Operator: The standard matrix for the rotation operator is given by

A = [ cos(θ)  −sin(θ) ]
    [ sin(θ)   cos(θ) ]
To determine if A is invertible, we can find the determinant of A:
|A| = cos^2(θ) + sin^2(θ) = 1 ≠ 0
so A is invertible. Therefore, the range of the rotation operator in R^2 is all of R^2 and it is
one-to-one.
2. Projection Operators: For each projection operator, we can easily show that |A| = 0. Therefore,
the projection operator is not one-to-one.
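These determinant checks are easy to reproduce numerically. A brief NumPy sketch (the angle is an arbitrary choice for illustration):

```python
import numpy as np

theta = 0.7
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
projection = np.array([[1.0, 0.0],
                       [0.0, 0.0]])
print(np.linalg.det(rotation))    # 1.0 (up to rounding) -> invertible, so the rotation is one-to-one
print(np.linalg.det(projection))  # 0.0 -> singular, so the projection is not one-to-one
```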
• Inverse Operator: If T : R^n → R^n is a one-to-one transformation given by
T (v) = Av
where A is the standard matrix, then there exists an inverse operator T⁻¹ : R^n → R^n given by
T⁻¹(w) = A⁻¹w

• Examples:
1. The standard matrix for the rotation operator through an angle θ is

A = [ cos(θ)  −sin(θ) ]
    [ sin(θ)   cos(θ) ]

The inverse operator can be found by rotating back through an angle −θ, i.e.,

A⁻¹ = [ cos(−θ)  −sin(−θ) ]
      [ sin(−θ)   cos(−θ) ]

Using trigonometric identities, we can see this is the same as

A⁻¹ = [  cos(θ)   sin(θ) ]
      [ −sin(θ)   cos(θ) ]

2. Let
T ([x, y]) = [2x + y, 3x + 4y]
Then T has the standard matrix

A = [ 2   1 ]
    [ 3   4 ]

Thus, |A| = 5 ≠ 0, so T is one-to-one and has an inverse operator with standard matrix

A⁻¹ = (1/5) [  4  −1 ]  =  [  4/5  −1/5 ]
            [ −3   2 ]     [ −3/5   2/5 ]

So, the inverse operator is given by

T⁻¹(w) = A⁻¹ [ w1 ]  =  [ (4/5)w1 − (1/5)w2 , −(3/5)w1 + (2/5)w2 ]
             [ w2 ]
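A quick NumPy check of this example (illustrative; the test vector is arbitrary):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [3.0, 4.0]])
A_inv = np.linalg.inv(A)
print(A_inv)                            # [[ 0.8 -0.2] [-0.6  0.4]], i.e., [[4/5, -1/5], [-3/5, 2/5]]
v = np.array([1.0, 2.0])
print(np.allclose(A_inv @ (A @ v), v))  # True: the inverse operator undoes T
```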
• Kernel of T : One of the properties of linear transformations is that
T (0) = 0
There may be other vectors v in V such that T (v) = 0. The kernel of T is the set of all vectors v in V
such that
T (v) = 0
It is denoted ker(T ).
• Example: Let T : <2 → <3 be given by
T ([x1 , x2 ]) = [x1 − 2x2 , 0, −x1 ]
To find ker(T ), we need to find all vectors v = [x1 , x2 ] in R^2 such that T (v) = 0 = [0, 0, 0] in R^3. In
other words,
x1 − 2x2 = 0
0 = 0
−x1 = 0
The only solution to this system is [0, 0]. Thus
ker(T ) = {[0, 0]} = {0}

• Example: Let T : R^3 → R^2 be given by T (x) = Ax where

A = [  1  −1  −2 ]
    [ −1   2   3 ]
To find ker(T ), we need to find all v = [x1 , x2 , x3 ] such that T (v) = [0, 0]. In other words, we need to
solve the system

[  1  −1  −2 ] [ x1 ]     [ 0 ]
[ −1   2   3 ] [ x2 ]  =  [ 0 ]
               [ x3 ]
Putting this in augmented form, we have

[  1  −1  −2 | 0 ]
[ −1   2   3 | 0 ]

which reduces to

[ 1   0  −1 | 0 ]
[ 0   1   1 | 0 ]
Therefore, x3 = t is a free parameter, so the solution is given by

[ x1 ]   [  t ]   [  1 ]
[ x2 ] = [ −t ] = [ −1 ] t
[ x3 ]   [  t ]   [  1 ]
Therefore, ker(T ) = span({[1, −1, 1]}).
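The kernel computation above is exactly a nullspace computation, which a computer algebra system can do directly. A SymPy sketch (not part of the notes):

```python
import sympy as sp

A = sp.Matrix([[ 1, -1, -2],
               [-1,  2,  3]])
print(A.nullspace())  # [Matrix([[1], [-1], [1]])], i.e., ker(T) = span({[1, -1, 1]})
```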
• Corollary: If T : R^n → R^m is given by
T (v) = Av
then ker(T ) is equal to the nullspace of A.
• Example: Given T (v) = Av where

A = [ 1  −2   1 ]
    [ 0   2   1 ]
find a basis for ker(T ).

Solving the system, we have

[ 1  −2   1 ]      [ 1   0    2  ]
[ 0   2   1 ]  →   [ 0   1   1/2 ]
Therefore, a basis for ker(T ) is given by a basis for the nullspace of A: {[−2, −1/2, 1]}.
• Example: Given T (v) = Av where

A = [  1   2   0   1  −1 ]
    [  2   1   3   1   0 ]
    [ −1   0  −2   0   1 ]
    [  0   0   0   2   8 ]

find a basis for ker(T ).

Ans: {[−2, 1, 1, 0, 0], [1, 2, 0, −4, 1]}
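The stated answer can be double-checked by verifying that A sends both basis vectors to the zero vector. A short NumPy sketch:

```python
import numpy as np

A = np.array([[ 1, 2,  0, 1, -1],
              [ 2, 1,  3, 1,  0],
              [-1, 0, -2, 0,  1],
              [ 0, 0,  0, 2,  8]])
b1 = np.array([-2, 1, 1, 0, 0])
b2 = np.array([ 1, 2, 0, -4, 1])
print(A @ b1, A @ b2)  # both [0 0 0 0], so both vectors lie in ker(T)
```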


• Terminology: The dimension of ker(T ) is called the nullity of T . In the previous example, the nullity
of T is 2.

• Range of T : The range of T is the set of all vectors w such that T (v) = w for some v in the domain. If T : R^n → R^m is given
by
T (v) = Av
then the range of T is the column space of A.
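For the matrix from the earlier kernel example, the range can be read off as the column space. A SymPy sketch (illustrative only):

```python
import sympy as sp

A = sp.Matrix([[1, -2, 1],
               [0,  2, 1]])
print(A.columnspace())  # two independent columns, so the range of T is all of R^2
```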
• Onto: If T : V → W is a linear transformation from a vector space V to a vector space W , then T
is said to be onto (or onto W ) if every vector in W is the image of at least one vector in V , i.e., the
range of T = W .

• Equivalence Statements for One-to-One, Kernel: If T : V → W is a linear transformation, then
the following are equivalent:
1. T is one-to-one
2. ker(T ) = {0}

• Equivalence Statements for One-to-One, Kernel, and Onto: If T : V → V is a linear transformation
and V is finite-dimensional, then the following are equivalent:
1. T is one-to-one
2. ker(T ) = {0}
3. T is onto
• Isomorphism: If a linear transformation T : V → W is both one-to-one and onto, then T is said to
be an isomorphism and the vector spaces V and W are said to be isomorphic.
