Exercise 2.4.2. For each of the two matrices below give the dimension and find a basis for each of their four subspaces:

\[
A = \begin{bmatrix} 0 & 1 & 4 & 0 \\ 0 & 2 & 8 & 0 \end{bmatrix}
\qquad
U = \begin{bmatrix} 0 & 1 & 4 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}
\]
Answer: The echelon matrix $U$ has only a single pivot, in the second column. As discussed on page 93, the second column

\[
\begin{bmatrix} 1 \\ 0 \end{bmatrix}
\]

is therefore a basis for the column space $\mathcal{R}(U)$. (The third column of $U$ is equal to four times the second column.) The dimension of $\mathcal{R}(U)$ is 1 (the same as the rank of $U$).
The echelon matrix $U$ can be derived from $A$ via Gaussian elimination (i.e., by subtracting two times the first row of $A$ from the second row of $A$). Again per the discussion on page 93, since the second column of $U$ is a basis for the column space of $U$, the second column of $A$, the vector

\[
\begin{bmatrix} 1 \\ 2 \end{bmatrix}
\]

is a basis for the column space $\mathcal{R}(A)$. (As with $U$, the third column of $A$ is equal to four times the second column.) The dimension of $\mathcal{R}(A)$ is 1 (the same as the rank of $A$, which is the same as the rank of $U$).
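As a quick sanity check (my addition, not part of the book's solution), the following SymPy sketch recovers the pivot columns and ranks found above; the matrices $A$ and $U$ are the ones from the problem statement, and columnspace() and rank() are standard SymPy Matrix methods.

```python
# Check of the column-space results (matrices from the problem statement).
from sympy import Matrix

A = Matrix([[0, 1, 4, 0], [0, 2, 8, 0]])
U = Matrix([[0, 1, 4, 0], [0, 0, 0, 0]])

print(U.columnspace())     # expect [Matrix([[1], [0]])], the second column of U
print(A.columnspace())     # expect [Matrix([[1], [2]])], the second column of A
print(U.rank(), A.rank())  # expect 1 1
```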
Turning to the row spaces, the only nonzero row of the echelon matrix $U$ is the first row, so per the discussion on page 91 the vector

\[
\begin{bmatrix} 0 & 1 & 4 & 0 \end{bmatrix}
\]

is a basis for the row space $\mathcal{R}(U^T)$. The dimension of $\mathcal{R}(U^T)$ is 1 (again, the same as the rank of $U$). Since the matrix $U$ can be derived from $A$ using Gaussian elimination the row spaces of the two matrices are identical, so that the vector

\[
\begin{bmatrix} 0 & 1 & 4 & 0 \end{bmatrix}
\]

is also a basis for the row space $\mathcal{R}(A^T)$. (This basis vector happens to be the first row of $A$, and the second row of $A$ is equal to two times the first row.) The dimension of $\mathcal{R}(A^T)$ is 1 (the same as the rank of $A$ and $U$).
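Again as an optional check (not part of the original exercise), SymPy's rowspace() method returns the nonzero rows of an echelon form of the matrix, which should match the row space basis above:

```python
# Check of the row-space results (matrices from the problem statement).
from sympy import Matrix

A = Matrix([[0, 1, 4, 0], [0, 2, 8, 0]])
U = Matrix([[0, 1, 4, 0], [0, 0, 0, 0]])

print(U.rowspace())  # expect [Matrix([[0, 1, 4, 0]])]
print(A.rowspace())  # expect [Matrix([[0, 1, 4, 0]])], the same basis
```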
We now turn to the nullspaces $\mathcal{N}(U)$ and $\mathcal{N}(A)$, i.e., the solutions to the equations $Ux = 0$ and $Ax = 0$. In particular for $U$ we must find $x$ such that

\[
\begin{bmatrix} 0 & 1 & 4 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}
\begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{bmatrix}
=
\begin{bmatrix} 0 \\ 0 \end{bmatrix}
\]

As noted above, if we do Gaussian elimination on $A$ (i.e., by multiplying the first row by 2 and subtracting it from the second row) then we obtain the matrix $U$. Both matrices thus have rank 1, with $x_2$ being a basic variable (since the pivot is in the second column) and $x_1$, $x_3$, and $x_4$ being free variables.

From the equation above we see that we must have $x_2 + 4x_3 = 0$ or $x_2 = -4x_3$. Setting each of the free variables $x_1$, $x_3$, and $x_4$ to 1 in turn (with the other free variables set to zero) we have the following set of vectors as solutions to the homogeneous equation $Ux = 0$ and a basis for the nullspace of $U$:

\[
\begin{bmatrix} 1 \\ 0 \\ 0 \\ 0 \end{bmatrix}
\qquad
\begin{bmatrix} 0 \\ -4 \\ 1 \\ 0 \end{bmatrix}
\qquad
\begin{bmatrix} 0 \\ 0 \\ 0 \\ 1 \end{bmatrix}
\]
Since $U$ can be obtained from $A$ by Gaussian elimination, any solution to $Ux = 0$ is also a solution to $Ax = 0$ and vice versa, so the above vectors also form a basis for the nullspace of $A$. The dimensions of the two nullspaces $\mathcal{N}(U)$ and $\mathcal{N}(A)$ are both 3 (equal to the number of vectors in the basis), and in fact the nullspaces are identical (since they have the exact same basis).
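A short NumPy check (my addition, not part of the book's solution) confirms that each of the three basis vectors is annihilated by both $U$ and $A$:

```python
# Check that the special solutions lie in the nullspaces of both matrices.
import numpy as np

A = np.array([[0, 1, 4, 0], [0, 2, 8, 0]])
U = np.array([[0, 1, 4, 0], [0, 0, 0, 0]])

# Columns are the three basis vectors found above.
basis = np.array([[1, 0, 0, 0],
                  [0, -4, 1, 0],
                  [0, 0, 0, 1]]).T

print(U @ basis)  # expect a 2x3 matrix of zeros
print(A @ basis)  # expect a 2x3 matrix of zeros as well
```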
Finally we turn to finding a basis for each of the left nullspaces $\mathcal{N}(A^T)$ and $\mathcal{N}(U^T)$. As discussed on page 95 there are two possible approaches to doing this. One way to find the left nullspace of $A$ is to look at the operations on the rows of $A$ needed to produce zero rows in the resulting echelon matrix $U$ in the process of Gaussian elimination; the coefficients used to carry out those operations make up the basis vectors of the left nullspace $\mathcal{N}(A^T)$.
In particular, the one and only zero row in $U$ is produced by multiplying the first row of $A$ by two and subtracting it from the second row of $A$; the coefficients for this operation are -2 (for the first row) and 1 (for the second). The vector

\[
\begin{bmatrix} -2 \\ 1 \end{bmatrix}
\]

is therefore a basis for the left nullspace $\mathcal{N}(A^T)$ (which has dimension 1). We can test this by multiplying $A$ on the left by the transpose of this vector:

\[
\begin{bmatrix} -2 & 1 \end{bmatrix}
\begin{bmatrix} 0 & 1 & 4 & 0 \\ 0 & 2 & 8 & 0 \end{bmatrix}
=
\begin{bmatrix} 0 & 0 & 0 & 0 \end{bmatrix}
\]
The left nullspace of $U$ can be found in a similar manner: since $U$ is already in echelon form the first step of Gaussian elimination would be equivalent to adding nothing to the second row, in other words, multiplying the first row by zero and then adding it to the second (zero) row; the coefficients for this operation are 0 (for the first row) and 1 (for the second). The vector

\[
\begin{bmatrix} 0 \\ 1 \end{bmatrix}
\]

is therefore a basis for the left nullspace $\mathcal{N}(U^T)$ (which also has dimension 1). As with $A$ we can test this by multiplying $U$ on the left by the transpose of this vector:

\[
\begin{bmatrix} 0 & 1 \end{bmatrix}
\begin{bmatrix} 0 & 1 & 4 & 0 \\ 0 & 0 & 0 & 0 \end{bmatrix}
=
\begin{bmatrix} 0 & 0 & 0 & 0 \end{bmatrix}
\]
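The same tests can be carried out numerically; the NumPy sketch below (again my addition) multiplies $A$ and $U$ on the left by the proposed left nullspace basis vectors:

```python
# Check that the left nullspace basis vectors annihilate A and U from the left.
import numpy as np

A = np.array([[0, 1, 4, 0], [0, 2, 8, 0]])
U = np.array([[0, 1, 4, 0], [0, 0, 0, 0]])

yA = np.array([-2, 1])  # basis vector for the left nullspace of A
yU = np.array([0, 1])   # basis vector for the left nullspace of U

print(yA @ A)  # expect [0 0 0 0]
print(yU @ U)  # expect [0 0 0 0]
```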
An alternate approach to find the left nullspace of $A$ is to explicitly solve $A^Ty = 0$, or

\[
\begin{bmatrix} 0 & 0 \\ 1 & 2 \\ 4 & 8 \\ 0 & 0 \end{bmatrix}
\begin{bmatrix} y_1 \\ y_2 \end{bmatrix}
=
\begin{bmatrix} 0 \\ 0 \\ 0 \\ 0 \end{bmatrix}
\]

Gaussian elimination on $A^T$ proceeds as follows, first by exchanging the first row and second row and then by subtracting 4 times the first row from the third:

\[
\begin{bmatrix} 0 & 0 \\ 1 & 2 \\ 4 & 8 \\ 0 & 0 \end{bmatrix}
\rightarrow
\begin{bmatrix} 1 & 2 \\ 0 & 0 \\ 4 & 8 \\ 0 & 0 \end{bmatrix}
\rightarrow
\begin{bmatrix} 1 & 2 \\ 0 & 0 \\ 0 & 0 \\ 0 & 0 \end{bmatrix}
\]

We thus have $y_1$ as a basic variable (since the pivot is in the first column) and $y_2$ as a free variable. From the first row of the final matrix we have $y_1 + 2y_2 = 0$ or $y_1 = -2y_2$ in the homogeneous case. Setting the free variable $y_2 = 1$ then gives us the vector

\[
\begin{bmatrix} -2 \\ 1 \end{bmatrix}
\]

as a basis for the left nullspace of $A$. Since there is only one vector in the basis the left nullspace of $A$ has dimension 1.
Similarly we can also find the left nullspace of $U$ by solving the homogeneous system $U^Ty = 0$, or

\[
\begin{bmatrix} 0 & 0 \\ 1 & 0 \\ 4 & 0 \\ 0 & 0 \end{bmatrix}
\begin{bmatrix} y_1 \\ y_2 \end{bmatrix}
=
\begin{bmatrix} 0 \\ 0 \\ 0 \\ 0 \end{bmatrix}
\]

Gaussian elimination on $U^T$ proceeds as follows, first by exchanging the first row and second row and then by subtracting 4 times the first row from the third:

\[
\begin{bmatrix} 0 & 0 \\ 1 & 0 \\ 4 & 0 \\ 0 & 0 \end{bmatrix}
\rightarrow
\begin{bmatrix} 1 & 0 \\ 0 & 0 \\ 4 & 0 \\ 0 & 0 \end{bmatrix}
\rightarrow
\begin{bmatrix} 1 & 0 \\ 0 & 0 \\ 0 & 0 \\ 0 & 0 \end{bmatrix}
\]

As with $A^T$ we have $y_1$ as a basic variable and $y_2$ as a free variable. From the first row of the final matrix we have $y_1 + 0 \cdot y_2 = 0$ or $y_1 = 0$ in the homogeneous case. Setting the free variable $y_2 = 1$ then gives us the vector

\[
\begin{bmatrix} 0 \\ 1 \end{bmatrix}
\]

as a basis for the left nullspace of $U$. Since there is only one vector in the basis the left nullspace of $U$ has dimension 1.
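For an independent check of this second approach (again my addition, not part of the book's solution), SymPy can compute the nullspaces of the transposes directly:

```python
# Check of the left nullspaces via the nullspaces of the transposes.
from sympy import Matrix

A = Matrix([[0, 1, 4, 0], [0, 2, 8, 0]])
U = Matrix([[0, 1, 4, 0], [0, 0, 0, 0]])

print(A.T.nullspace())  # expect [Matrix([[-2], [1]])]
print(U.T.nullspace())  # expect [Matrix([[0], [1]])]
```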
Note that the dimension of the column space of $A$ is the rank $r$ of $A$, namely 1, while the dimension of the nullspace of $A$ is equal to the number of columns of $A$ minus the rank, or $n - r = 4 - 1 = 3$. The dimension of the row space of $A$ is also $r = 1$, while the dimension of the left nullspace of $A$ is equal to the number of rows of $A$ minus the rank, or $m - r = 2 - 1 = 1$. These results are in accordance with the Fundamental Theorem of Linear Algebra, Part I, on page 95. Similar results hold for $U$.
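These dimension counts can also be verified mechanically; in the sketch below $m$, $n$, and $r$ follow the notation of the Fundamental Theorem, and matrix_rank is NumPy's rank function:

```python
# Check of the dimension counts against the rank of A.
import numpy as np

A = np.array([[0, 1, 4, 0], [0, 2, 8, 0]])
m, n = A.shape                   # m = 2 rows, n = 4 columns
r = np.linalg.matrix_rank(A)

print(r)      # expect 1 = dim of column space = dim of row space
print(n - r)  # expect 3 = dim of nullspace
print(m - r)  # expect 1 = dim of left nullspace
```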
Also note that the row space of $A$ is equal to the row space of $U$; this is because the rows of $U$ are linear combinations of the rows of $A$ and vice versa. Similarly the nullspace of $A$ is equal to the nullspace of $U$ for the same reason: if $Ax = 0$ then $x$ is orthogonal to every row of $A$, and hence to every linear combination of those rows, including the rows of $U$, and vice versa.
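One way to see this computationally (not part of the exercise) is to note that $A$ and $U$ have the same reduced row echelon form, which determines both the row space and the nullspace:

```python
# Check that A and U share a reduced row echelon form.
from sympy import Matrix

A = Matrix([[0, 1, 4, 0], [0, 2, 8, 0]])
U = Matrix([[0, 1, 4, 0], [0, 0, 0, 0]])

# rref() returns (rref matrix, pivot columns); compare the rref matrices.
print(A.rref()[0] == U.rref()[0])  # expect True
```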
NOTE: This continues a series of posts containing worked out exercises from the (out of print) book Linear Algebra and Its Applications, Third Edition by Gilbert Strang.
If you find these posts useful I encourage you to also check out the more current Linear Algebra and Its Applications, Fourth Edition, Dr Strang's introductory textbook Introduction to Linear Algebra, Fourth Edition and the accompanying free online course, and Dr Strang's other books.