## Linear Algebra and Its Applications, Review Exercise 2.3

Review exercise 2.3. State whether each of the following is true or false. If false, provide a counterexample.

i) If a subspace $S$ is spanned by a set of $m$ vectors $x_1$ through $x_m$, then the dimension of $S$ is $m$.

ii) If $S_1$ and $S_2$ are two subspaces of a vector space $V$, then the intersection of $S_1$ and $S_2$ is nonempty.

iii) For any matrix $A$, if $Ax = Ay$ then $x = y$.

iv) For any matrix $A$, if $A$ is reduced to echelon form, then the rows of the resulting matrix $U$ form a unique basis for the row space of $A$.

v) If $A$ is a square matrix and the columns of $A$ are linearly independent, then the columns of $A^2$ are also linearly independent.

Answer: i) The statement is false. The dimension of $S$ is the number of vectors in a basis for $S$, and basis vectors must be linearly independent. However, it is possible that some of the vectors in the spanning set are linear combinations of others in the set; in that case $m$ would be larger than the number of basis vectors, and thus larger than the dimension of $S$. For example, the vectors $(1, 0)$, $(0, 1)$, and $(1, 1)$ span $\mathbb{R}^2$, but the dimension of $\mathbb{R}^2$ is 2, not 3.
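As a quick numerical sanity check of this counterexample (a sketch using NumPy, not part of the original solution), we can stack the three spanning vectors as rows and compute the rank, which equals the dimension of their span:

```python
import numpy as np

# The three spanning vectors from the counterexample, stacked as rows.
vectors = np.array([
    [1, 0],
    [0, 1],
    [1, 1],
])

# The dimension of the span is the rank of this matrix: 2, not 3,
# because (1, 1) is a linear combination of the other two vectors.
rank = np.linalg.matrix_rank(vectors)
print(rank)  # 2
```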

ii) The statement is true. Every vector space $V$ contains the zero vector. Since $S_1$ and $S_2$ are subspaces they are also vector spaces in their own right, and therefore both contain the zero vector also. So the intersection of $S_1$ and $S_2$ is guaranteed to contain (at least) the zero vector and thus will always be nonempty.

iii) The statement is false. The matrix $A$ could be the zero matrix, in which case we would have $Ax = Ay = 0$ no matter what values $x$ and $y$ had.
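This counterexample is easy to check numerically (a NumPy sketch, with $x$ and $y$ chosen arbitrarily, so long as they differ):

```python
import numpy as np

# Counterexample: A is the 2x2 zero matrix, and x != y.
A = np.zeros((2, 2))
x = np.array([1.0, 0.0])
y = np.array([0.0, 1.0])

# Ax = Ay = 0 even though x and y differ.
print(np.array_equal(A @ x, A @ y))  # True
print(np.array_equal(x, y))          # False
```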

iv) The statement is false. The row space of $A$ is the same as the row space of $U$ (since the rows of $U$ are linear combinations of the rows of $A$), and the (nonzero) rows of $U$ do form a basis for the row space of $A$. However, this basis is not unique.

For example, suppose that

$A = \begin{bmatrix} 1&0 \\ 1&2 \end{bmatrix}$

Then $A$ can be reduced to echelon form as

$U = \begin{bmatrix} 1&0 \\ 0&2 \end{bmatrix}$

The vectors $(1, 0)$ and $(0, 2)$ form a basis for the row space of $A$ but this basis is not unique. For example, the original rows $(1, 0)$ and $(1, 2)$ are also linearly independent and form a basis for the row space of $A$.
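We can confirm numerically that both pairs of rows are bases for the same space (a NumPy sketch): each pair is linearly independent, and stacking all four rows together does not increase the rank, so they span the same two-dimensional row space.

```python
import numpy as np

A = np.array([[1, 0],
              [1, 2]])
U = np.array([[1, 0],
              [0, 2]])

# Each candidate basis consists of two linearly independent rows...
print(np.linalg.matrix_rank(A))  # 2
print(np.linalg.matrix_rank(U))  # 2

# ...and stacking all four rows still gives rank 2, so the rows of A
# and the rows of U span the same two-dimensional row space.
print(np.linalg.matrix_rank(np.vstack([A, U])))  # 2
```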

v) The statement is true. If the columns of $A$ are linearly independent then $A$ is nonsingular and has an inverse $A^{-1}$. (See the discussion on page 98.) We then have

$(A^{-1})^2A^2 = A^{-1}(A^{-1}A)A = A^{-1}IA = A^{-1}A = I$

and also

$A^2(A^{-1})^2 = A(AA^{-1})A^{-1} = AIA^{-1} = AA^{-1} = I$

So $(A^{-1})^2$ is both a left and right inverse for $A^2$ and we see that $A^2$ is invertible with $(A^2)^{-1} = (A^{-1})^2$. But if $A^2$ is invertible then it is nonsingular and its columns are linearly independent.
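The identities above can be verified numerically for a particular nonsingular $A$ (a NumPy sketch, reusing the matrix from part (iv) as an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 2.0]])  # columns are linearly independent

A_inv = np.linalg.inv(A)

# (A^{-1})^2 is both a left and a right inverse of A^2.
left = (A_inv @ A_inv) @ (A @ A)
right = (A @ A) @ (A_inv @ A_inv)
print(np.allclose(left, np.eye(2)))   # True
print(np.allclose(right, np.eye(2)))  # True

# A^2 is therefore invertible, so its columns are linearly independent
# (it has full rank).
print(np.linalg.matrix_rank(A @ A))  # 2
```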

UPDATE: Corrected a typo in the answer to (iv) (a reference to $A$ should have been a reference to $U$).

NOTE: This continues a series of posts containing worked out exercises from the (out of print) book Linear Algebra and Its Applications, Third Edition by Gilbert Strang.

If you find these posts useful I encourage you to also check out the more current Linear Algebra and Its Applications, Fourth Edition, Dr Strang’s introductory textbook Introduction to Linear Algebra, Fourth Edition and the accompanying free online course, and Dr Strang’s other books.
