## Linear Algebra and Its Applications, Review Exercise 2.26

Review exercise 2.26. State whether the following statements are true or false:

a) For every subspace $S$ of $\mathbb{R}^4$ there exists a matrix $A$ for which the nullspace of $A$ is $S$.

b) For any matrix $A$, if both $A$ and its transpose $A^T$ have the same nullspace then $A$ is a square matrix.

c) The transformation from $\mathbb{R}^1$ to $\mathbb{R}^1$ that transforms $x$ into $mx+b$ (for some scalars $m$ and $b$) is a linear transformation.

Answer: a) For the statement to be true, for any subspace $S$ of $\mathbb{R}^4$ we must be able to find a matrix $A$ such that $\mathcal{N}(A) = S$. (In other words, for any vector $x$ in $S$ we have $Ax = 0$ and for any $x$ such that $Ax = 0$ the vector $x$ is an element of $S$.)

What would such a matrix $A$ look like? First, since $S$ is a subspace of $\mathbb{R}^4$ any vector $x$ in $S$ has four entries: $x = (x_1, x_2, x_3, x_4)$. If we have $Ax = 0$ then the matrix $A$ must have four columns; otherwise $A$ would not be able to multiply $x$ from the left side.

Second, note that the nullspace of $A$ and the column space of $A$ are related: If the column space has dimension $r$ then the nullspace of $A$ has dimension $n-r$ where $n$ is the number of columns of $A$. Since $A$ has four columns (from the previous paragraph) the dimension of $\mathcal{N}(A)$ is $n-r = 4-r$.

Put another way, if $s$ is the dimension of $S$ and $S$ is the nullspace of $A$ then we must have $s = n-r = 4-r$ or $r = 4-s$. So if the dimension of $S$ is $s=0$ then we have $r = 4-s = 4$, if the dimension of $S$ is $s=1$ then we have $r = 3$, and so on.

Finally, if the matrix $A$ has rank $r$ then $A$ has both $r$ linearly independent columns and $r$ linearly independent rows. As noted above, the number of columns of $A$ (linearly independent or otherwise) is fixed by the requirement that the nullspace of $A$ should be a subspace of $\mathbb{R}^4$. So $A$ must always have four columns, and must have at least $r = 4-s$ rows.

We now have five possible cases to consider, and can approach them as follows:

• $S$ has dimension 0 or 4. In both these cases there is only one possible subspace $S$ and we can easily find a matrix $A$ meeting the specified criteria.
• $S$ has dimension 1, 2, or 3. In each of these cases there are many possible subspaces $S$ (an infinite number, in fact). For a given $S$ we do the following:
1. Start with a basis for $S$. (Any basis will do.)
2. Consider the effect on the basis vectors if they were to be in the nullspace of some matrix $A$ meeting the criteria above.
3. Take the corresponding system of linear equations, re-express it as a system involving the entries of $A$ as unknowns and the entries of the basis vectors as coefficients, and show that we can solve the system to find the unknowns.
4. Show that all other vectors in $S$ are also in the nullspace of $A$.
5. Show that any vector in the nullspace of $A$ must also be in $S$.

We now proceed to the individual cases:

$S$ has dimension $s=4$. We then have $S = \mathbb{R}^4$. (If $S$ has dimension 4 then its basis has four linearly independent vectors. If $S \ne \mathbb{R}^4$ then there must be some vector $v$ in $\mathbb{R}^4$ but not in $S$, and that vector must be linearly independent of the vectors in the basis of $S$. But it is impossible to have five linearly independent vectors in a 4-dimensional vector space, so we conclude that $S = \mathbb{R}^4$.)

We then must have $Ax = 0$ for every vector $x$ in $S = \mathbb{R}^4$. This is true when $A$ is the zero matrix. As noted above $A$ must have exactly four columns, and since the required rank is $r = 4-s = 4-4 = 0$ there is no constraint on the rows beyond having at least one. So one possible value for $A$ is

$A = \begin{bmatrix} 0&0&0&0 \end{bmatrix}$

(The matrix $A$ could have additional rows as well, as long as they are all zeros.)

We have thus found a matrix $A$ such that any vector $x$ in $S = \mathbb{R}^4$ is also in $\mathcal{N}(A)$. Going the other way, any vector $x$ in $\mathcal{N}(A)$ must have four entries (in order for $A$ to be able to multiply it) so that any such vector $x$ is also in $\mathbb{R}^4 = S$.

So if $S$ is a 4-dimensional subspace (namely $\mathbb{R}^4$) then a matrix $A$ exists such that $S$ is the nullspace of $A$.
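As a quick numerical sanity check (a sketch, not part of the original solution), the 1-by-4 zero matrix does send every vector of $\mathbb{R}^4$ to zero, and has rank 0 as required:

```python
import numpy as np

# Sketch: the 1-by-4 zero matrix sends every vector in R^4 to zero,
# so its nullspace is all of R^4. The test vectors are arbitrary.
A = np.zeros((1, 4))

rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.standard_normal(4)    # an arbitrary vector in R^4
    assert np.allclose(A @ x, 0)  # every x satisfies Ax = 0

# rank 0, so dim N(A) = n - r = 4 - 0 = 4
assert np.linalg.matrix_rank(A) == 0
```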

$S$ has dimension $s=0$. The only 0-dimensional subspace of $\mathbb{R}^4$ is the one consisting solely of the zero vector $x = (0, 0, 0, 0)$. (If $S$ contained only a single nonzero vector then it would not be closed under scalar multiplication, since multiplying that vector by the scalar 0 would produce the zero vector, which would not be in $S$. If $S$ were not closed under scalar multiplication then it would not be a subspace.)

In this case the matrix $A$ would have to have rank $r = 4-s = 4-0 = 4$. If $r = 4$ then all four columns of $A$ would have to be linearly independent and $A$ would have to have at least four linearly independent rows. Suppose we choose the four elementary vectors $e_1$ through $e_4$ as the columns, so that

$A = \begin{bmatrix} 1&0&0&0 \\ 0&1&0&0 \\ 0&0&1&0 \\ 0&0&0&1 \end{bmatrix}$

If $Ax = 0$ we then have

$Ax = \begin{bmatrix} 1&0&0&0 \\ 0&1&0&0 \\ 0&0&1&0 \\ 0&0&0&1 \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{bmatrix} = \begin{bmatrix} x_1 \\ x_2 \\ x_3 \\ x_4 \end{bmatrix} = 0$

so that the only solution is $x = (0, 0, 0, 0)$. We have thus again found a matrix $A$ for which $\mathcal{N}(A) = S$.

(Note that any other matrix of rank $r = 4$ would have worked as well: The product $Ax$ is a linear combination of the columns of $A$, with the coefficients being $x_1$ through $x_4$. If the columns of $A$ are linearly independent then that linear combination can be zero only if all the coefficients $x_1$ through $x_4$ are zero.)
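A minimal numerical sketch of this case (again not part of the original solution): the 4-by-4 identity has full rank, so $Ax = 0$ forces $x = 0$.

```python
import numpy as np

# Sketch: the 4-by-4 identity has rank 4, so dim N(A) = 4 - 4 = 0 and
# the only solution of Ax = 0 is the zero vector.
A = np.eye(4)
assert np.linalg.matrix_rank(A) == 4

# Since A is invertible, Ax = 0 has the unique solution x = 0.
x = np.linalg.solve(A, np.zeros(4))
assert np.allclose(x, np.zeros(4))
```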

Having disposed of the easy cases, we now proceed to the harder ones.

$S$ has dimension $s=3$. In this case we are looking for a matrix $A$ with rank $4-3 = 1$ such that $\mathcal{N}(A) = S$. The matrix $A$ thus must have only one linearly independent column and (more important for our purposes) only one linearly independent row. We need only find a matrix that is 1 by 4. (If desired we can construct suitable matrices that are 2 by 4, 3 by 4, etc., by adding additional rows that are multiples of the first row.)

Since the dimension of $S$ is 3, any three linearly independent vectors in $S$ form a basis for $S$; we pick an arbitrary set of such vectors $u$, $v$, and $w$. For $S$ to be equal to $\mathcal{N}(A)$ we must have $Au = 0$, $Av = 0$, and $Aw = 0$. We are looking for a matrix $A$ that is 1 by 4, so these equations correspond to the following:

$\begin{bmatrix} a_{11}&a_{12}&a_{13}&a_{14} \end{bmatrix} \begin{bmatrix} u_1 \\ u_2 \\ u_3 \\ u_4 \end{bmatrix} = 0$

$\begin{bmatrix} a_{11}&a_{12}&a_{13}&a_{14} \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \\ v_3 \\ v_4 \end{bmatrix} = 0$

$\begin{bmatrix} a_{11}&a_{12}&a_{13}&a_{14} \end{bmatrix} \begin{bmatrix} w_1 \\ w_2 \\ w_3 \\ w_4 \end{bmatrix} = 0$

These can in turn be rewritten as the following system of equations:

$\begin{array}{rcrcrcrcl} u_1a_{11}&+&u_2a_{12}&+&u_3a_{13}&+&u_4a_{14}&=&0 \\ v_1a_{11}&+&v_2a_{12}&+&v_3a_{13}&+&v_4a_{14}&=&0 \\ w_1a_{11}&+&w_2a_{12}&+&w_3a_{13}&+&w_4a_{14}&=&0 \end{array}$

This is a system of three equations in four unknowns, equivalent to the matrix equation $By = 0$ where

$By = \begin{bmatrix} u_1&u_2&u_3&u_4 \\ v_1&v_2&v_3&v_4 \\ w_1&w_2&w_3&w_4 \end{bmatrix} \begin{bmatrix} a_{11} \\ a_{12} \\ a_{13} \\ a_{14} \end{bmatrix}$

Since the vectors $u$, $v$, and $w$ are linearly independent (because they form a basis for the 3-dimensional subspace $S$) and those vectors form the rows of the matrix $B$, the rank of $B$ is $r = 3$. We also have $r = m$, the number of rows of $B$, so per 20Q on page 96 the system $By = 0$ is guaranteed to have a solution $y$; moreover, since $r = 3$ is less than the number of unknowns $n = 4$, the nullspace of $B$ has dimension $4 - 3 = 1$ and we can choose $y$ to be nonzero. But $y$ is simply the first and only row of the matrix we were looking for, so we have found a nonzero (hence rank 1) matrix $A$ for which $u$, $v$, and $w$ are in the nullspace of $A$.

If $x$ is a vector in $S$ then $x$ can be expressed as a linear combination of the basis vectors $u$, $v$, and $w$ for some set of coefficients $c_1$, $c_2$, and $c_3$. We then have

$Ax = A(c_1u+c_2v+c_3w)$

$= c_1Au + c_2Av + c_3Aw$

$= c_1 \cdot 0 + c_2 \cdot 0 + c_3 \cdot 0 = 0$

So any vector $x$ in $S$ is also an element in the nullspace of $A$.

Suppose that $y$ is a vector in the nullspace of $A$ and $y$ is not in $S$. Since $y$ is not in $S$ it cannot be expressed as a linear combination solely of the basis vectors $u$, $v$, and $w$; rather we must have

$y = c_1u+c_2v+c_3w + c_4z$

where $z$ is some vector that is linearly independent of $u$, $v$, and $w$.

If $y$ is in the nullspace of $A$ then we have $Ay = 0$ so

$0 = Ay = A(c_1u+c_2v+c_3w+c_4z)$

$= c_1Au + c_2Av + c_3Aw + c_4Az$

$= c_1 \cdot 0 + c_2 \cdot 0 + c_3 \cdot 0 + c_4Az = c_4Az$

If $c_4Az = 0$ then either $c_4 = 0$ or $Az = 0$. If $c_4 = 0$ then we have

$y = c_1u+c_2v+c_3w$

so that $y$ is actually an element of $S$, contrary to our supposition. If $Az = 0$ then $z$ is an element of the nullspace of $A$. But $\mathcal{N}(A)$ has dimension 3 and already contains the three linearly independent vectors $u$, $v$, and $w$. The fourth vector $z$ cannot be both an element of $\mathcal{N}(A)$ and also linearly independent of $u$, $v$, and $w$.

Our assumption that $y$ is in the nullspace of $A$ but is not in $S$ has thus led to a contradiction. We conclude that any element of $\mathcal{N}(A)$ is also in $S$. We previously showed that any element of $S$ is also in $\mathcal{N}(A)$, so we conclude that $S = \mathcal{N}(A)$.

For any 3-dimensional subspace $S$ of $\mathbb{R}^4$ we can therefore find a matrix $A$ such that $S$ is the nullspace of $A$.
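Here is a numerical sketch of the construction for one concrete 3-dimensional $S$ (the basis $u$, $v$, $w$ below is an arbitrary choice, not from the text). It solves $By = 0$ by taking the last right singular vector of $B$, which spans the 1-dimensional nullspace of $B$:

```python
import numpy as np

# An arbitrary basis u, v, w for a 3-dimensional subspace S of R^4.
u = np.array([1.0, 0.0, 0.0, 1.0])
v = np.array([0.0, 1.0, 0.0, 1.0])
w = np.array([0.0, 0.0, 1.0, 1.0])

# B has the basis vectors as rows; a nonzero solution y of By = 0
# supplies the single row of A. Since rank(B) = 3, the last right
# singular vector of B spans the 1-dimensional nullspace N(B).
B = np.vstack([u, v, w])
y = np.linalg.svd(B)[2][-1]
A = y.reshape(1, 4)

# The basis vectors of S are in N(A), and A has rank 1 as required.
for x in (u, v, w):
    assert np.allclose(A @ x, 0)
assert np.linalg.matrix_rank(A) == 1
```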

$S$ has dimension $s=2$. In this case we are looking for a matrix $A$ with rank $4-2 = 2$ such that $\mathcal{N}(A) = S$. The matrix $A$ thus must have only two linearly independent columns and only two linearly independent rows. We thus look for a matrix that is 2 by 4.

Since the dimension of $S$ is 2, any two linearly independent vectors in $S$ form a basis for $S$; we pick an arbitrary set of such vectors $u$ and $v$. For $S$ to be equal to $\mathcal{N}(A)$ we must have $Au = 0$ and $Av = 0$. We are looking for a matrix $A$ that is 2 by 4, so these equations correspond to the following:

$\begin{bmatrix} a_{11}&a_{12}&a_{13}&a_{14} \\ a_{21}&a_{22}&a_{23}&a_{24} \end{bmatrix} \begin{bmatrix} u_1 \\ u_2 \\ u_3 \\ u_4 \end{bmatrix} = 0$

$\begin{bmatrix} a_{11}&a_{12}&a_{13}&a_{14} \\ a_{21}&a_{22}&a_{23}&a_{24} \end{bmatrix} \begin{bmatrix} v_1 \\ v_2 \\ v_3 \\ v_4 \end{bmatrix} = 0$

These can in turn be rewritten as the following system of four equations in eight unknowns:

$\begin{array}{rcl} u_1a_{11}+u_2a_{12}+u_3a_{13}+u_4a_{14}&=&0 \\ u_1a_{21}+u_2a_{22}+u_3a_{23}+u_4a_{24}&=&0 \\ v_1a_{11}+v_2a_{12}+v_3a_{13}+v_4a_{14}&=&0 \\ v_1a_{21}+v_2a_{22}+v_3a_{23}+v_4a_{24}&=&0 \end{array}$

or $By = 0$ where

$B = \begin{bmatrix} u_1&u_2&u_3&u_4&0&0&0&0 \\ 0&0&0&0&u_1&u_2&u_3&u_4 \\ v_1&v_2&v_3&v_4&0&0&0&0 \\ 0&0&0&0&v_1&v_2&v_3&v_4 \end{bmatrix}$

Since the four rows are linearly independent (this follows from the linear independence of $u$ and $v$) we have $r = m = 4$, so the system is guaranteed to have a solution $y$; and since $r = 4$ is less than the number of unknowns $n = 8$, nonzero solutions exist. The entries in $y$ are just the entries of $A$, so we have found a matrix $A$ for which the basis vectors $u$ and $v$ are members of the nullspace. (To guarantee that $A$ has rank 2 we choose its two rows to be linearly independent; this is possible because each row need only be orthogonal to $u$ and $v$, and the vectors orthogonal to both $u$ and $v$ form a subspace of dimension $4 - 2 = 2$.)

Since the basis vectors of $S$ are in $\mathcal{N}(A)$ all other elements of $S$ are in $\mathcal{N}(A)$ also. By the same argument as in the 3-dimensional case, any vector $y$ in $\mathcal{N}(A)$ must be in $S$ also; otherwise a contradiction occurs. Thus we conclude that $S = \mathcal{N}(A)$.

For any 2-dimensional subspace $S$ of $\mathbb{R}^4$ we can therefore find a matrix $A$ such that $S$ is the nullspace of $A$.
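A numerical sketch for a concrete 2-dimensional $S$ (the basis $u$, $v$ is an arbitrary choice): rather than writing out the 4-equation, 8-unknown system literally, this reads off two orthonormal vectors spanning the orthogonal complement of $u$ and $v$ via the SVD, which yields the same kind of matrix $A$:

```python
import numpy as np

# An arbitrary basis u, v for a 2-dimensional subspace S of R^4.
u = np.array([1.0, 1.0, 0.0, 0.0])
v = np.array([0.0, 0.0, 1.0, 1.0])

# The last two right singular vectors of B are orthogonal to both u
# and v; they supply two linearly independent rows for the 2-by-4 A.
B = np.vstack([u, v])        # 2-by-4, rank 2
A = np.linalg.svd(B)[2][2:]  # 2-by-4

for x in (u, v):
    assert np.allclose(A @ x, 0)
assert np.linalg.matrix_rank(A) == 2  # so dim N(A) = 4 - 2 = 2
```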

$S$ has dimension $s=1$. In this case we are looking for a matrix $A$ with rank $4-1 = 3$ such that $\mathcal{N}(A) = S$. The matrix $A$ thus must have only three linearly independent columns and only three linearly independent rows. We thus look for a matrix that is 3 by 4.

Since the dimension of $S$ is 1, any nonzero vector $u$ in $S$ forms a basis for $S$. For $S$ to be equal to $\mathcal{N}(A)$ we must have $Au = 0$. We are looking for a matrix $A$ that is 3 by 4, so this equation corresponds to the following:

$\begin{bmatrix} a_{11}&a_{12}&a_{13}&a_{14} \\ a_{21}&a_{22}&a_{23}&a_{24} \\ a_{31}&a_{32}&a_{33}&a_{34} \end{bmatrix} \begin{bmatrix} u_1 \\ u_2 \\ u_3 \\ u_4 \end{bmatrix} = 0$

This can be rewritten as a system $By = 0$ of three equations in twelve unknowns (the entries $a_{ij}$). The three rows of $B$ are linearly independent, so $r = m = 3$ and the system is guaranteed to have a solution; and since $r$ is less than the number of unknowns, nonzero solutions exist. Choosing the three rows of $A$ to be linearly independent solutions (each row need only be orthogonal to $u$, and the vectors orthogonal to $u$ form a 3-dimensional subspace) gives $A$ rank 3, with $Au = 0$. Since $u$ is a basis for $S$ any other vector in $S$ is also in the nullspace of $A$.

By the same argument as in the 3-dimensional case, any vector $y$ in $\mathcal{N}(A)$ must be in $S$ also; otherwise a contradiction occurs. Thus we conclude that $S = \mathcal{N}(A)$.

For any 1-dimensional subspace $S$ of $\mathbb{R}^4$ we can therefore find a matrix $A$ such that $S$ is the nullspace of $A$.
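A numerical sketch for a concrete 1-dimensional $S$ (the vector $u$ is an arbitrary choice): the three right singular vectors of $B = [\,u\,]$ associated with zero singular values are orthogonal to $u$ and supply the three rows of $A$:

```python
import numpy as np

# An arbitrary nonzero u spanning a 1-dimensional subspace S of R^4.
u = np.array([1.0, 2.0, 3.0, 4.0])

# B is 1-by-4 with rank 1; the last three right singular vectors are
# orthogonal to u and give three linearly independent rows for A.
B = u.reshape(1, 4)
A = np.linalg.svd(B)[2][1:]  # 3-by-4

assert np.allclose(A @ u, 0)
assert np.linalg.matrix_rank(A) == 3  # so dim N(A) = 4 - 3 = 1
```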

Any subspace of $\mathbb{R}^4$ must have dimension from 0 through 4. We have thus shown that for any subspace $S$ of $\mathbb{R}^4$ we can find a matrix $A$ such that $\mathcal{N}(A) = S$. The statement is true.

b) Suppose that for some $m$ by $n$ matrix $A$ both $A$ and its transpose $A^T$ have the same nullspace.

The rank $r$ of $A$ is also the rank of $A^T$. The dimension of $\mathcal{N}(A)$ is then $n-r$, and the dimension of $\mathcal{N}(A^T)$ is $m-r$. Since $\mathcal{N}(A) = \mathcal{N}(A^T)$ we then have $n - r = m - r$ so that $m = n$.

The number of rows $m$ of $A$ is the same as the number of columns $n$ of $A$ so that $A$ (and thus $A^T$) is a square matrix. The statement is true.
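The dimension count can be checked numerically; this minimal sketch (the 2-by-3 matrix is an arbitrary example) shows that for a non-square matrix the two nullspaces have different dimensions and so cannot be equal:

```python
import numpy as np

# An arbitrary 2-by-3 matrix of rank 2: the nullspace dimensions
# n - r and m - r differ, so N(A) and N(A^T) cannot be the same.
A = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])
m, n = A.shape
r = np.linalg.matrix_rank(A)

assert (n - r) == 1  # dim N(A):   vectors in R^3 with Ax = 0
assert (m - r) == 0  # dim N(A^T): only the zero vector in R^2
```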

c) If $f(x)$ represents the transformation in question, with $f(x) = mx+b$, we have

$f(x+y) = m(x+y) + b = mx + my + b$

$f(x) + f(y) = (mx+b) + (my+b) = mx + my + 2b$

These two quantities are not equal unless $b=0$, so for $b \ne 0$ the transformation is not linear. (Alternatively, any linear transformation must satisfy $f(0) = 0$, but here $f(0) = b \ne 0$.) The statement is false.
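A quick numerical sketch of this argument ($m = 2$ and $b = 3$ are arbitrary nonzero choices):

```python
# Additivity fails for f(x) = m*x + b whenever b != 0.
m, b = 2.0, 3.0
f = lambda x: m * x + b

x, y = 1.0, 4.0
assert f(x + y) == m * (x + y) + b          # 13.0
assert f(x) + f(y) == m * (x + y) + 2 * b   # 16.0
assert f(x + y) != f(x) + f(y)              # additivity fails
assert f(0.0) != 0.0                        # and f(0) = b, not 0
```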

NOTE: This continues a series of posts containing worked out exercises from the (out of print) book Linear Algebra and Its Applications, Third Edition by Gilbert Strang.

If you find these posts useful I encourage you to also check out the more current Linear Algebra and Its Applications, Fourth Edition, Dr Strang’s introductory textbook Introduction to Linear Algebra, Fourth Edition and the accompanying free online course, and Dr Strang’s other books.
