## Linear Algebra and Its Applications, Exercise 3.3.12

Exercise 3.3.12. Given the subspace $V$ spanned by the two vectors $(1, 1, 0, 1)$ and $(0, 0, 1, 0)$, find the following:

a) a set of basis vectors for $V^\perp$

b) the matrix $P$ that projects onto $V$

c) the vector in $V$ that has the minimum distance to the vector $b = (0, 1, 0, -1)$ in $V^\perp$

Answer: a) The subspace $V$ is the column space $\mathcal{R}(A)$ for the matrix

$A = \begin{bmatrix} 1&0 \\ 1&0 \\ 0&1 \\ 1&0 \end{bmatrix}$

for which the vectors $(1, 1, 0, 1)$ and $(0, 0, 1, 0)$ are the columns. The orthogonal complement $V^\perp$ corresponds to the left nullspace $\mathcal{N}(A^T)$, so we can find a basis for $V^\perp$ by solving the system of equations corresponding to $A^Ty = 0$:

$\begin{bmatrix} 1&1&0&1 \\ 0&0&1&0 \end{bmatrix} \begin{bmatrix} y_1 \\ y_2 \\ y_3 \\ y_4 \end{bmatrix} = 0$

in matrix form or

$\setlength\arraycolsep{0.2em}\begin{array}{rcrcrcrcl}y_{1}&+&y_{2}&+&&&y_{4}&=&0 \\ &&&&y_{3}&&&=&0 \end{array}$

expressed as a system of equations.

The matrix $A^T$ is already in echelon form, with $y_1$ and $y_3$ as basic variables and $y_2$ and $y_4$ as free variables. Setting $y_2 = 1$ and $y_4 = 0$ we have $y_1 = -1$ and $y_3 = 0$. Setting $y_2 = 0$ and $y_4 = 1$ we again have $y_1 = -1$ and $y_3 = 0$. So two solutions to $A^Ty = 0$ are $(-1, 1, 0, 0)$ and $(-1, 0, 0, 1)$ and these form a basis for $V^\perp = \mathcal{N}(A^T)$.
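As a quick numerical sanity check (a sketch using NumPy; the variable names are my own), we can confirm that both basis vectors found above are orthogonal to the columns of $A$, i.e. that they satisfy $A^Ty = 0$:

```python
import numpy as np

# Columns of A are the vectors spanning V
A = np.array([[1, 0],
              [1, 0],
              [0, 1],
              [1, 0]], dtype=float)

# Basis vectors for V-perp found by back-substitution above
y1 = np.array([-1, 1, 0, 0], dtype=float)
y2 = np.array([-1, 0, 0, 1], dtype=float)

# Both should give the zero vector when multiplied by A^T
print(A.T @ y1)  # [0. 0.]
print(A.T @ y2)  # [0. 0.]
```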

b) Since $V$ is the column space for the matrix $A$ above, the projection matrix onto $V$ is $P = A(A^TA)^{-1}A^T$. We have

$A^TA = \begin{bmatrix} 1&1&0&1 \\ 0&0&1&0 \end{bmatrix} \begin{bmatrix} 1&0 \\ 1&0 \\ 0&1 \\ 1&0 \end{bmatrix} = \begin{bmatrix} 3&0 \\ 0&1 \end{bmatrix}$

Since $A^TA$ is a diagonal matrix, its inverse is simply

$(A^TA)^{-1} = \begin{bmatrix} \frac{1}{3}&0 \\ 0&1 \end{bmatrix}$

We then have

$P = A(A^TA)^{-1}A^T$

$= \begin{bmatrix} 1&0 \\ 1&0 \\ 0&1 \\ 1&0 \end{bmatrix} \begin{bmatrix} \frac{1}{3}&0 \\ 0&1 \end{bmatrix} \begin{bmatrix} 1&1&0&1 \\ 0&0&1&0 \end{bmatrix}$

$= \begin{bmatrix} \frac{1}{3}&0 \\ \frac{1}{3}&0 \\ 0&1 \\ \frac{1}{3}&0 \end{bmatrix} \begin{bmatrix} 1&1&0&1 \\ 0&0&1&0 \end{bmatrix}$

$= \begin{bmatrix} \frac{1}{3}&\frac{1}{3}&0&\frac{1}{3} \\ \frac{1}{3}&\frac{1}{3}&0&\frac{1}{3} \\ 0&0&1&0 \\ \frac{1}{3}&\frac{1}{3}&0&\frac{1}{3} \end{bmatrix}$
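The same computation can be sketched numerically (using NumPy; not part of the original exercise). Besides reproducing the matrix above, we can verify the two defining properties of a projection matrix, symmetry ($P = P^T$) and idempotence ($P^2 = P$):

```python
import numpy as np

A = np.array([[1, 0],
              [1, 0],
              [0, 1],
              [1, 0]], dtype=float)

# Projection onto the column space of A: P = A (A^T A)^{-1} A^T
P = A @ np.linalg.inv(A.T @ A) @ A.T
print(P)

# A projection matrix is symmetric and idempotent
print(np.allclose(P, P.T))   # True
print(np.allclose(P @ P, P)) # True
```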

c) The vector in $V$ closest to the vector $b = (0, 1, 0, -1)$ is simply the projection $Pb$ of $b$ onto $V$. But since $b$ is in $V^\perp$ it is orthogonal to every vector in $V$, so its projection $Pb$ onto $V$ is the zero vector.

This can also be seen by explicitly doing the matrix multiplication:

$Pb = \begin{bmatrix} \frac{1}{3}&\frac{1}{3}&0&\frac{1}{3} \\ \frac{1}{3}&\frac{1}{3}&0&\frac{1}{3} \\ 0&0&1&0 \\ \frac{1}{3}&\frac{1}{3}&0&\frac{1}{3} \end{bmatrix} \begin{bmatrix} 0 \\ 1 \\ 0 \\ -1 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 0 \\ 0 \end{bmatrix} = 0$
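The same multiplication can be checked numerically (a NumPy sketch, with $P$ entered from the result of part (b)):

```python
import numpy as np

# Projection matrix from part (b)
P = np.array([[1/3, 1/3, 0, 1/3],
              [1/3, 1/3, 0, 1/3],
              [0,   0,   1, 0  ],
              [1/3, 1/3, 0, 1/3]])

# b lies in V-perp, so its projection onto V should vanish
b = np.array([0, 1, 0, -1], dtype=float)
print(P @ b)  # [0. 0. 0. 0.]
```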

UPDATE: Corrected a typo in the statement of question (b).

NOTE: This continues a series of posts containing worked out exercises from the (out of print) book Linear Algebra and Its Applications, Third Edition by Gilbert Strang.

If you find these posts useful I encourage you to also check out the more current Linear Algebra and Its Applications, Fourth Edition, Dr Strang’s introductory textbook Introduction to Linear Algebra, Fifth Edition and the accompanying free online course, and Dr Strang’s other books.
