1.9. Rank of a Matrix

Row space and Column space of a matrix
Suppose $A = [a_{ij}]$ is an arbitrary $m \times n$ matrix over a field $K$. The rows of $A$,
$R_1 = (a_{11}, a_{12}, \ldots, a_{1n}), \; R_2 = (a_{21}, a_{22}, \ldots, a_{2n}), \; \ldots, \; R_m = (a_{m1}, a_{m2}, \ldots, a_{mn})$,
may be viewed as vectors in $K^n$, and the columns of $A$,
$C_1 = (a_{11}, a_{21}, \ldots, a_{m1}), \; C_2 = (a_{12}, a_{22}, \ldots, a_{m2}), \; \ldots, \; C_n = (a_{1n}, a_{2n}, \ldots, a_{mn})$,
may be viewed as vectors in $K^m$. Then

i) The subspace of $K^n$ spanned by the row vectors of $A$ is called the row space of $A$,
denoted by $\mathrm{rowsp}(A)$.
ii) The subspace of $K^m$ spanned by the column vectors of $A$ is called the
column space of $A$, denoted by $\mathrm{colsp}(A)$.

Example 1: Consider a $2 \times 3$ matrix $A$ over $K$. Then the row vectors of $A$ are
$R_1, R_2 \in K^3$, which span a subspace of $K^3$ called the row space of $A$, and the column vectors
of $A$ are $C_1, C_2, C_3 \in K^2$, which span a subspace of $K^2$ called the column space of $A$.
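
As an illustration of these definitions (the matrix entries below are a hypothetical example, not the matrix from the original example), a computer algebra system such as SymPy can produce bases for both spaces:

    from sympy import Matrix

    # Hypothetical 2x3 matrix used only for illustration.
    A = Matrix([[1, 2, 3],
                [4, 5, 6]])

    # rowspace() returns a basis of the row space (vectors in K^3);
    # columnspace() returns a basis of the column space (vectors in K^2).
    print("Row space basis:", A.rowspace())
    print("Column space basis:", A.columnspace())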

Note: The collection of rows (or columns) of $A$ may not form a basis of $\mathrm{rowsp}(A)$ (or
$\mathrm{colsp}(A)$) because the collection of rows (or columns) may not be linearly
independent. However, a maximal linearly independent subset of the rows (or columns) gives a basis for
$\mathrm{rowsp}(A)$ (or $\mathrm{colsp}(A)$).

Theorem 2: Prove that the row space and the column space of a matrix $A$
have the same dimension.
Proof: Let $R_1, R_2, \ldots, R_m$ be the row vectors of $A = [a_{ij}]$. Then the $i$-th row vector of $A$ is

$R_i = (a_{i1}, a_{i2}, \ldots, a_{in})$, for $i = 1, 2, \ldots, m$.

Let $\dim(\mathrm{rowsp}(A)) = r$. Then assume that the set of vectors $\{S_1, S_2, \ldots, S_r\}$ forms
a basis for $\mathrm{rowsp}(A)$. Then the $k$-th vector of the basis is $S_k = (b_{k1}, b_{k2}, \ldots, b_{kn})$.

Since $\{S_1, S_2, \ldots, S_r\}$ is a basis of $\mathrm{rowsp}(A)$, for suitable scalars $k_{i1}, k_{i2}, \ldots, k_{ir}$ we have

$R_i = k_{i1}S_1 + k_{i2}S_2 + \cdots + k_{ir}S_r$, for $i = 1, 2, \ldots, m$.

Equating the components of the above vectors on both sides, we get

$a_{ij} = k_{i1}b_{1j} + k_{i2}b_{2j} + \cdots + k_{ir}b_{rj}$, for $i = 1, 2, \ldots, m$ and $j = 1, 2, \ldots, n$.

The above equations can be written, for $j = 1, 2, \ldots, n$, as

$(a_{1j}, a_{2j}, \ldots, a_{mj}) = b_{1j}(k_{11}, k_{21}, \ldots, k_{m1}) + b_{2j}(k_{12}, k_{22}, \ldots, k_{m2}) + \cdots + b_{rj}(k_{1r}, k_{2r}, \ldots, k_{mr})$.

This implies that each column vector of $A$ lies in a subspace spanned by the $r$ vectors
$(k_{11}, k_{21}, \ldots, k_{m1}), \ldots, (k_{1r}, k_{2r}, \ldots, k_{mr})$.

Since $\mathrm{colsp}(A)$ is contained in a subspace spanned by $r$ vectors, we have

$\dim(\mathrm{colsp}(A)) \le r = \dim(\mathrm{rowsp}(A))$.  ... (1)

With similar arguments (applied to the columns of $A$ in place of the rows) one can show that

$\dim(\mathrm{rowsp}(A)) \le \dim(\mathrm{colsp}(A))$.  ... (2)

From (1) and (2), we have

$\dim(\mathrm{rowsp}(A)) = \dim(\mathrm{colsp}(A))$.

Hence proved.
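
A quick numerical check of this theorem (a sketch only; the matrix is an arbitrary choice, not taken from these notes) compares the dimensions of the two spaces with SymPy:

    from sympy import Matrix

    # Arbitrary illustrative matrix; its third row equals the sum of the
    # first two rows, so neither the rows nor the columns are all independent.
    A = Matrix([[1, 2, 0, 3],
                [0, 1, 1, 1],
                [1, 3, 1, 4]])

    row_rank = len(A.rowspace())      # dimension of the row space
    col_rank = len(A.columnspace())   # dimension of the column space
    print(row_rank, col_rank)         # both are 2, as the theorem asserts
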
Rank of a matrix
The dimension of the row space (or, equivalently, the column space) of a matrix $A$ is called the
rank of the matrix $A$, denoted by $\mathrm{rank}(A)$.

Theorem 1: If the nonzero row vectors in a row canonical form $R$ of a matrix $A$
form a basis for the row space of $A$, then the rank of $A$ is the number of
nonzero row vectors in $R$.
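
In practice the rank is often computed numerically; the sketch below (not part of the original notes) uses NumPy's matrix_rank, which estimates the rank from the singular values rather than from a row canonical form, but returns the same number for exact examples like this one:

    import numpy as np

    # Illustrative matrix: the second row is twice the first, so the rank is 2.
    A = np.array([[1.0, 2.0, 3.0],
                  [2.0, 4.0, 6.0],
                  [0.0, 1.0, 1.0]])

    print(np.linalg.matrix_rank(A))   # prints 2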

Example 2: Find a basis for the row space of the following matrix $A$, and
determine its rank:

Solution: Using the Gauss-Jordan elimination method, the row canonical form of the
given matrix $A$ is computed. Notice that the set of nonzero row vectors of this row
canonical form is a maximal linearly independent subset of $\mathrm{rowsp}(A)$. Thus, these
row vectors form a basis for the row space of $A$, and hence $\mathrm{rank}(A)$ is the number
of nonzero rows of the row canonical form.
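
Since the matrix of this example is not reproduced above, the sketch below applies the same procedure to a stand-in matrix of my own choosing: SymPy's rref() returns the row canonical form (and the pivot columns), and the nonzero rows of that form give a basis for the row space.

    from sympy import Matrix

    # Stand-in matrix for Example 2 (hypothetical entries).
    A = Matrix([[1, 2, 1, 3],
                [2, 4, 3, 7],
                [1, 2, 2, 4]])

    R, pivots = A.rref()   # R is the row canonical form of A
    basis = [R.row(i) for i in range(R.rows) if any(R.row(i))]
    print("Basis of rowsp(A):", basis)
    print("rank(A) =", len(basis))    # 2 for this stand-in matrix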

Example 3: Find a basis for the column space of the following matrix $A$ and
then determine its rank:

Solution: The transpose of $A$ is $A^T$. Notice that the column
space of $A$ becomes the row space of $A^T$. Using the Gauss-Jordan elimination
method, the row canonical form of the matrix $A^T$ is computed.
Notice that the set of nonzero row vectors of this row canonical form is a maximal
linearly independent subset of $\mathrm{rowsp}(A^T)$. Thus, these row vectors form a basis for
the row space of $A^T$.

Now, writing these row vectors as columns gives a basis for the column
space of $A$. That is, the corresponding column vectors form a basis for $\mathrm{colsp}(A)$, and
hence $\mathrm{rank}(A)$ is the number of nonzero rows in the row canonical form of $A^T$.
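
The transpose trick can be scripted the same way; the matrix below is again only a stand-in for the one in the original example:

    from sympy import Matrix

    # Stand-in matrix for Example 3 (hypothetical entries).
    A = Matrix([[1, 0, 2],
                [2, 1, 5],
                [3, 1, 7]])

    # The column space of A is the row space of A^T.
    R, _ = A.T.rref()
    rows = [R.row(i) for i in range(R.rows) if any(R.row(i))]
    # Writing those nonzero rows back as columns gives a basis for colsp(A).
    basis_of_colsp = [r.T for r in rows]
    print(basis_of_colsp)
    print("rank(A) =", len(basis_of_colsp))   # 2 for this stand-in matrix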

Example 4: Suppose that $W$ is the subspace of the vector space $\mathbb{R}^n$ spanned
by a given set of row vectors $u_1, u_2, \ldots, u_k$. Then find a
basis of $W$.

Solution: Form the matrix $A$ whose rows are the given row vectors.

Using the Gauss-Jordan elimination method, the row canonical form of $A$ is computed.

Therefore, the nonzero row vectors of the row canonical form constitute a basis for the
subspace $W$ of $\mathbb{R}^n$ spanned by the given row vectors.

Theorem 3: If $A$ and $B$ are matrices of the same size, then prove that
$\mathrm{rank}(A + B) \le \mathrm{rank}(A) + \mathrm{rank}(B)$.

Proof: Let the matrices $A$ and $B$ be of size $m \times n$. Then $A + B$ is also a matrix of size $m \times n$.
Assume that the columns of $A$ and $B$ are $a_1, a_2, \ldots, a_n$ and $b_1, b_2, \ldots, b_n$ respectively.
Then the columns of $A + B$ are $a_1 + b_1, a_2 + b_2, \ldots, a_n + b_n$.
By the definition, the ranks of $A$ and $B$ are the dimensions of the spaces spanned by their column
vectors. That is, the columns $a_1, a_2, \ldots, a_n$ span $\mathrm{colsp}(A)$ and the columns
$b_1, b_2, \ldots, b_n$ span $\mathrm{colsp}(B)$.

Also, the rank of $A + B$ is the dimension of the space spanned by the column vectors of
$A + B$. That is, the columns $a_1 + b_1, a_2 + b_2, \ldots, a_n + b_n$ span $\mathrm{colsp}(A + B)$.

Therefore, any vector $v$ in $\mathrm{colsp}(A + B)$ can be expressed as a
linear combination of the column vectors $a_1 + b_1, a_2 + b_2, \ldots, a_n + b_n$.

That is, there exist scalars $c_1, c_2, \ldots, c_n$ such that

$v = c_1(a_1 + b_1) + c_2(a_2 + b_2) + \cdots + c_n(a_n + b_n) = (c_1 a_1 + \cdots + c_n a_n) + (c_1 b_1 + \cdots + c_n b_n)$.

Thus, every vector of $\mathrm{colsp}(A + B)$ lies in $\mathrm{colsp}(A) + \mathrm{colsp}(B)$, and for subspaces

$\dim(\mathrm{colsp}(A) + \mathrm{colsp}(B)) \le \dim(\mathrm{colsp}(A)) + \dim(\mathrm{colsp}(B))$.

That is, $\mathrm{rank}(A + B) \le \mathrm{rank}(A) + \mathrm{rank}(B)$.

Hence proved.
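
A numerical illustration of the inequality (the two matrices are arbitrary rank-one matrices chosen for this sketch, not data from the notes):

    import numpy as np

    rng = np.random.default_rng(0)

    # Two random rank-1 matrices; their sum typically has rank 2, and in
    # every case rank(A + B) <= rank(A) + rank(B).
    A = np.outer(rng.standard_normal(4), rng.standard_normal(5))
    B = np.outer(rng.standard_normal(4), rng.standard_normal(5))

    rA = np.linalg.matrix_rank(A)
    rB = np.linalg.matrix_rank(B)
    print(rA, rB, np.linalg.matrix_rank(A + B))
    assert np.linalg.matrix_rank(A + B) <= rA + rB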

Note:
i) The rank of a matrix $A$ of order $m \times n$ is at most $\min\{m, n\}$.
ii) If $A$ is of order $m \times n$ and $B$ is of order $n \times p$, then $\mathrm{rank}(AB) \le \mathrm{rank}(A)$ and
$\mathrm{rank}(AB) \le \mathrm{rank}(B)$.
iii) If $A$ is of order $m \times n$, and $P$ and $Q$ are invertible matrices of orders $m$ and $n$ respectively, then $\mathrm{rank}(PAQ) = \mathrm{rank}(A)$.
Row Equivalent Matrices
Two matrices $A$ and $B$ are said to be row equivalent if the rows of $B$ can be obtained
from the rows of $A$ through a sequence of elementary row operations, in which case
$\mathrm{rowsp}(A) = \mathrm{rowsp}(B)$, that is, $\mathrm{rank}(A) = \mathrm{rank}(B)$.

Theorem 4: Let $R$ be a row canonical form of a matrix $A$. Then the nonzero row
vectors of $R$ form a basis for the row space of $A$, and hence the rank of $A$ is the
number of nonzero row vectors in $R$.

Result: Let $A$ be an $n \times n$ invertible matrix. Prove that the columns of $A$ are
linearly independent if and only if $\mathrm{rank}(A) = n$.

Proof:
Necessary Condition

Given that $A$ is an invertible matrix of order $n$, its row canonical form is $I_n$, the identity matrix of
order $n$. Notice that the columns of $I_n$ are linearly independent since $I_n$ is in row
canonical form with $n$ nonzero rows, and hence $\mathrm{rank}(A) = \mathrm{rank}(I_n)$, that is, $\mathrm{rank}(A) = n$.

Sufficient Condition

Assume that $\mathrm{rank}(A) = n$. If $A$ is invertible, then $A$ is column equivalent to $I_n$ and
hence the columns of $A$ are linearly independent. But if $A$ is not invertible, then $A$ is equivalent to a matrix with a
zero column vector and hence $\mathrm{rank}(A) < n$, which is a contradiction to our
assumption that $\mathrm{rank}(A) = n$.

Thus, the columns of $A$ are linearly independent if and only if $\mathrm{rank}(A) = n$.

Hence proved.

Note: The same proof will be applicable to show that the rows of $A$ are linearly independent if and only if
$\mathrm{rank}(A) = n$.
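
The equivalence can be checked on a concrete matrix (the entries below are illustrative only): a nonzero determinant confirms invertibility, and the rank then equals the order $n$.

    from sympy import Matrix

    # Illustrative 3x3 matrix with nonzero determinant, hence invertible.
    A = Matrix([[2, 1, 0],
                [0, 1, 1],
                [1, 0, 1]])

    n = A.rows
    print("det(A) =", A.det())            # 3, nonzero, so A is invertible
    print("rank(A) =", A.rank())          # equals n = 3
    print(len(A.columnspace()) == n)      # True: the columns are independent
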
Consider the following system of $m$ linear equations in $n$ unknowns $x_1, x_2, \ldots, x_n$:

$a_{11}x_1 + a_{12}x_2 + \cdots + a_{1n}x_n = b_1$
$a_{21}x_1 + a_{22}x_2 + \cdots + a_{2n}x_n = b_2$
$\vdots$
$a_{m1}x_1 + a_{m2}x_2 + \cdots + a_{mn}x_n = b_m$

where $a_{ij}$ and $b_i$ are real numbers for $i = 1, 2, \ldots, m$ and $j = 1, 2, \ldots, n$.

The system can be represented by the matrix equation $AX = B$,

where
$A = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} \\ a_{21} & a_{22} & \cdots & a_{2n} \\ \vdots & \vdots & & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} \end{bmatrix}$, $X = \begin{bmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{bmatrix}$, and $B = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_m \end{bmatrix}$.

The matrix $A$ is called the coefficient matrix and the matrix $[A \mid B]$ is called the
augmented matrix, i.e.,
$[A \mid B] = \begin{bmatrix} a_{11} & a_{12} & \cdots & a_{1n} & b_1 \\ a_{21} & a_{22} & \cdots & a_{2n} & b_2 \\ \vdots & \vdots & & \vdots & \vdots \\ a_{m1} & a_{m2} & \cdots & a_{mn} & b_m \end{bmatrix}$.

(a) If $\mathrm{rank}(A) = \mathrm{rank}([A \mid B]) = n$, then the system $AX = B$ has a unique solution.
(b) If $\mathrm{rank}(A) = \mathrm{rank}([A \mid B]) < n$, then the system $AX = B$ has infinitely many solutions.
(c) If $\mathrm{rank}(A) \ne \mathrm{rank}([A \mid B])$, then the system $AX = B$ has no solution.
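
These three cases can be checked directly by comparing the ranks of the coefficient matrix and the augmented matrix; the sketch below does exactly that for a small hypothetical system (the helper classify and the sample entries are mine, not from the notes):

    from sympy import Matrix

    def classify(A, b):
        """Classify AX = b by comparing rank(A) with rank([A | b])."""
        aug = A.row_join(b)               # augmented matrix [A | b]
        if A.rank() != aug.rank():
            return "no solution"
        if A.rank() == A.cols:
            return "unique solution"
        return "infinitely many solutions"

    # Hypothetical 3x3 system used only for illustration.
    A = Matrix([[1, 1, 1],
                [0, 1, 2],
                [1, 2, 3]])    # third row = first + second, so rank(A) = 2
    b = Matrix([6, 5, 11])     # consistent: rank([A|b]) = 2 < 3 unknowns

    print(classify(A, b))                     # infinitely many solutions
    print(classify(A, Matrix([6, 5, 12])))    # ranks differ -> no solution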
