## Linear Algebra and Its Applications, Exercise 3.4.16

Exercise 3.4.16. Given the matrix $A$ whose columns are the following two vectors $a_1$ and $a_3$ [sic]:

$a_1 = \begin{bmatrix} 1 \\ 2 \\ 2 \end{bmatrix} \quad a_3 = \begin{bmatrix} 1 \\ 3 \\ 1 \end{bmatrix}$

factor $A$ as $A = QR$. If there are $n$ vectors $a_j$ with $m$ elements each, what are the dimensions of $A$, $Q$, and $R$?

Answer: With $a_1$ and $a_3$ as the two columns of $A$, we first choose $a_1' = a_1 = (1, 2, 2)$. We then have

$a_3' = a_3 - \frac{(a_1')^Ta_3}{(a_1')^Ta_1'}a_1' = a_3 - \frac{1 \cdot 1 + 2 \cdot 3 + 2 \cdot 1}{1^2 + 2^2 + 2^2}a_1' = a_3 - \frac{9}{9}a_1' = a_3 - a_1'$

$= \begin{bmatrix} 1 \\ 3 \\ 1 \end{bmatrix} - \begin{bmatrix} 1 \\ 2 \\ 2 \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \\ -1 \end{bmatrix}$

Now that we have calculated the orthogonal vectors $a_1'$ and $a_3'$ we can normalize them to create the orthonormal vectors $q_1$ and $q_3$. We have

$\|a_1'\| = \sqrt{1^2 + 2^2 + 2^2} = \sqrt{9} = 3$

$\|a_3'\| = \sqrt{0^2 + 1^2 + (-1)^2} = \sqrt{2}$

so that

$q_1 = a_1' / \|a_1'\| = \frac{1}{3} \begin{bmatrix} 1 \\ 2 \\ 2 \end{bmatrix} = \begin{bmatrix} \frac{1}{3} \\ \frac{2}{3} \\ \frac{2}{3} \end{bmatrix}$

$q_3 = a_3' / \|a_3'\| = \frac{1}{\sqrt{2}} \begin{bmatrix} 0 \\ 1 \\ -1 \end{bmatrix} = \begin{bmatrix} 0 \\ \frac{1}{\sqrt{2}} \\ -\frac{1}{\sqrt{2}} \end{bmatrix}$

The matrix $Q$ is the 3 by 2 matrix with columns $q_1$ and $q_3$:

$Q = \begin{bmatrix} \frac{1}{3}&0 \\ \frac{2}{3}&\frac{1}{\sqrt{2}} \\ \frac{2}{3}&-\frac{1}{\sqrt{2}} \end{bmatrix}$

The matrix $R$ is the 2 by 2 matrix calculated as follows:

$R = \begin{bmatrix} q_1^Ta_1&q_1^Ta_3 \\ 0&q_3^Ta_3 \end{bmatrix} = \begin{bmatrix} \frac{1}{3} \cdot 1 + \frac{2}{3} \cdot 2 + \frac{2}{3} \cdot 2&\frac{1}{3} \cdot 1 + \frac{2}{3} \cdot 3 + \frac{2}{3} \cdot 1 \\ 0&0 \cdot 1 + \frac{1}{\sqrt{2}} \cdot 3 + (-\frac{1}{\sqrt{2}}) \cdot 1 \end{bmatrix}$

$= \begin{bmatrix} \frac{9}{3}&\frac{9}{3} \\ 0&\frac{2}{\sqrt{2}} \end{bmatrix} = \begin{bmatrix} 3&3 \\ 0&\sqrt{2} \end{bmatrix}$

If there are $n$ vectors $a_j$ with $m$ elements each, since they form the columns of $A$ the shape of $A$ will be $m$ by $n$. The matrix $Q$ contains one orthonormal column for each column in $A$, and its orthonormal columns have the same number of elements as the columns of $A$, so $Q$ is also $m$ by $n$. Finally, $R$ is a square matrix with one column for each column of $Q$, so it is $n$ by $n$.
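The Gram-Schmidt construction used above translates directly into code. The sketch below (a minimal illustration assuming NumPy; the function name `gram_schmidt_qr` is mine, not from the book) builds $Q$ and $R$ column by column and reproduces the factors computed by hand:

```python
# Classical Gram-Schmidt QR factorization, built column by column.
import numpy as np

def gram_schmidt_qr(A):
    """Factor A (m x n with independent columns) as A = QR."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]  # component of a_j along q_i
            v -= R[i, j] * Q[:, i]       # subtract that projection
        R[j, j] = np.linalg.norm(v)      # length of the orthogonalized vector
        Q[:, j] = v / R[j, j]            # normalize to get q_j
    return Q, R

# the two columns a_1 and a_3 from the exercise
A = np.array([[1.0, 1.0], [2.0, 3.0], [2.0, 1.0]])
Q, R = gram_schmidt_qr(A)
```

(In floating point, classical Gram-Schmidt can lose orthogonality; modified Gram-Schmidt or Householder reflections are preferred in practice.)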

NOTE: This continues a series of posts containing worked out exercises from the (out of print) book Linear Algebra and Its Applications, Third Edition by Gilbert Strang.

If you find these posts useful I encourage you to also check out the more current Linear Algebra and Its Applications, Fourth Edition, Dr Strang’s introductory textbook Introduction to Linear Algebra, Fifth Edition and the accompanying free online course, and Dr Strang’s other books.

## Linear Algebra and Its Applications, Exercise 3.4.15

Exercise 3.4.15. Given the matrix

$A = \begin{bmatrix} 1&1 \\ 2&-1 \\ -2&4 \end{bmatrix}$

find the orthonormal vectors $q_1$ and $q_2$ that span the column space of $A$. Next find the vector $q_3$ that completes the orthonormal set, and describe the subspace of $A$ of which $q_3$ is an element. Finally, for $b = (1, 2, 7)$ find the least squares solution $\bar{x}$ to $Ax = b$.

Answer: With $a$ and $b$ as the columns of $A$, we first choose $a' = a = (1, 2, -2)$. We then have

$b' = b - \frac{(a')^Tb}{(a')^Ta'}a' = b - \frac{1 \cdot 1 + 2 \cdot (-1) + (-2) \cdot 4}{1^2 + 2^2 + (-2)^2}a' = b - \frac{-9}{9}a' = b + a'$

$= \begin{bmatrix} 1 \\ -1 \\ 4 \end{bmatrix} + \begin{bmatrix} 1 \\ 2 \\ -2 \end{bmatrix} = \begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix}$

Now that we have calculated the orthogonal vectors $a'$ and $b'$ we can normalize them to create the orthonormal vectors $q_1$ and $q_2$. We have

$\|a'\| = \sqrt{1^2 + 2^2 + (-2)^2} = \sqrt{9} = 3$

$\|b'\| = \sqrt{2^2 + 1^2 + 2^2} = \sqrt{9} = 3$

so that

$q_1 = a' / \|a'\| = \frac{1}{3} \begin{bmatrix} 1 \\ 2 \\ -2 \end{bmatrix} = \begin{bmatrix} \frac{1}{3} \\ \frac{2}{3} \\ -\frac{2}{3} \end{bmatrix}$

$q_2 = b' / \|b'\| = \frac{1}{3} \begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix} = \begin{bmatrix} \frac{2}{3} \\ \frac{1}{3} \\ \frac{2}{3} \end{bmatrix}$

Since $a$ and $b$ span the column space of $A$, and the orthonormal vectors $q_1$ and $q_2$ are linear combinations of $a$ and $b$, $q_1$ and $q_2$ also span the column space of $A$.

Next, we calculate $q_3$. We can do this by orthogonalizing any vector $c$ that is linearly independent of $a$ and $b$. For ease of calculation we start with $c = (1, 0, 0)$. We then have

$c' = c - \frac{(a')^Tc}{(a')^Ta'}a' - \frac{(b')^Tc}{(b')^Tb'}b' = c - \frac{1 \cdot 1 + 2 \cdot 0 + (-2) \cdot 0}{1^2 + 2^2 + (-2)^2}a' - \frac{2 \cdot 1 + 1 \cdot 0 + 2 \cdot 0}{1^2 + 2^2 + (-2)^2}b' = c - \frac{1}{9}a' - \frac{2}{9}b'$

$= \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix} - \frac{1}{9} \begin{bmatrix} 1 \\ 2 \\ -2 \end{bmatrix} - \frac{2}{9} \begin{bmatrix} 2 \\ 1 \\ 2 \end{bmatrix} = \begin{bmatrix} \frac{4}{9} \\ -\frac{4}{9} \\ -\frac{2}{9} \end{bmatrix}$

To normalize $c'$ we divide by

$\|c'\| = \sqrt{(\frac{4}{9})^2 + (-\frac{4}{9})^2 + (-\frac{2}{9})^2} = \sqrt{\frac{36}{81}} = \frac{6}{9} = \frac{2}{3}$

so that

$q_3 = c' / \|c'\| = \frac{3}{2} \begin{bmatrix} \frac{4}{9} \\ -\frac{4}{9} \\ -\frac{2}{9} \end{bmatrix} = \begin{bmatrix} \frac{2}{3} \\ -\frac{2}{3} \\ -\frac{1}{3} \end{bmatrix}$

Of the four fundamental subspaces of $A$, the left nullspace $\mathcal{N}(A^T)$ is orthogonal to the column space $\mathcal{R}(A)$. Since $q_1$ and $q_2$ span the column space $\mathcal{R}(A)$ and $q_3$ is orthogonal to $q_1$ and $q_2$, $q_3$ must be an element of the left nullspace $\mathcal{N}(A^T)$.

Finally, to find the least squares solution $\bar{x}$ to $Ax = b$ where $b = (1, 2, 7)$, we factor $A = QR$ and take advantage of the fact that $R\bar{x} = Q^Tb$.

The matrix $Q$ is simply the 3 by 2 matrix with columns $q_1$ and $q_2$:

$Q = \begin{bmatrix} \frac{1}{3}&\frac{2}{3} \\ \frac{2}{3}&\frac{1}{3} \\ -\frac{2}{3}&\frac{2}{3} \end{bmatrix}$

The upper triangular matrix $R$ is a 2 by 2 matrix with

$R = \begin{bmatrix} q_1^Ta&q_1^Tb \\ 0&q_2^Tb \end{bmatrix} = \begin{bmatrix} \frac{1}{3} \cdot 1 + \frac{2}{3} \cdot 2 + (-\frac{2}{3}) \cdot (-2)&\frac{1}{3} \cdot 1 + \frac{2}{3} \cdot (-1) + (-\frac{2}{3}) \cdot 4 \\ 0&\frac{2}{3} \cdot 1 + \frac{1}{3} \cdot (-1) + \frac{2}{3} \cdot 4 \end{bmatrix}$

$= \begin{bmatrix} 3&-3 \\ 0&3 \end{bmatrix}$

On the right side of the equation $R\bar{x} = Q^Tb$ we have

$Q^Tb = \begin{bmatrix} \frac{1}{3}&\frac{2}{3}&-\frac{2}{3} \\ \frac{2}{3}&\frac{1}{3}&\frac{2}{3} \end{bmatrix} \begin{bmatrix} 1 \\ 2 \\ 7 \end{bmatrix} = \begin{bmatrix} -\frac{9}{3} \\ \frac{18}{3} \end{bmatrix} = \begin{bmatrix} -3 \\ 6 \end{bmatrix}$

so that the entire system is then

$\begin{bmatrix} 3&-3 \\ 0&3 \end{bmatrix} \begin{bmatrix} \bar{x_1} \\ \bar{x_2} \end{bmatrix} = \begin{bmatrix} -3 \\ 6 \end{bmatrix}$

From the second equation we have $3\bar{x_2} = 6$ or $\bar{x_2} = 2$. Substituting into the first equation we have $3\bar{x_1} - 3\bar{x_2} = 3\bar{x_1} - 3 \cdot 2 = -3$ so that $3\bar{x_1} = -3 + 6 = 3$ or $\bar{x_1} = 1$.

The least squares solution to $Ax = b$ with $b = (1, 2, 7)$ is therefore $\bar{x} = (1, 2)$.
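The QR route to the least squares solution can be checked numerically. The sketch below (NumPy assumed; variable names are mine) solves $R\bar{x} = Q^Tb$ by back substitution and compares against NumPy's own least squares routine:

```python
# Least squares via QR: solve R xbar = Q^T b by back substitution.
# Note: np.linalg.qr may flip signs relative to hand Gram-Schmidt,
# but the solution of R xbar = Q^T b is unchanged.
import numpy as np

A = np.array([[1.0, 1.0], [2.0, -1.0], [-2.0, 4.0]])
b = np.array([1.0, 2.0, 7.0])

Q, R = np.linalg.qr(A)
rhs = Q.T @ b

# back substitution on the 2 by 2 upper triangular system R xbar = rhs
x2 = rhs[1] / R[1, 1]
x1 = (rhs[0] - R[0, 1] * x2) / R[0, 0]
xbar = np.array([x1, x2])

# agrees with the normal-equations least squares solution
assert np.allclose(xbar, np.linalg.lstsq(A, b, rcond=None)[0])
```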

## Linear Algebra and Its Applications, Exercise 3.4.14

Exercise 3.4.14. Given the vectors

$a = \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} \quad b = \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} \quad c = \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix}$

find the corresponding orthonormal vectors $q_1$, $q_2$, and $q_3$.

Answer: We first choose $a' = a$. We then have

$b' = b - \frac{(a')^Tb}{(a')^Ta'}a' = b - \frac{1 \cdot 1 + 1 \cdot 0 + 0 \cdot 1}{1^2 + 1^2 + 0^2}a' = b - \frac{1}{2}a'$

$= \begin{bmatrix} 1 \\ 0 \\ 1 \end{bmatrix} - \frac{1}{2} \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} = \begin{bmatrix} \frac{1}{2} \\ -\frac{1}{2} \\ 1 \end{bmatrix}$

We then have

$c' = c - \frac{(a')^Tc}{(a')^Ta'}a' - \frac{(b')^Tc}{(b')^Tb'}b' = c - \frac{1 \cdot 0 + 1 \cdot 1 + 0 \cdot 1}{1^2 + 1^2 + 0^2}a' - \frac{\frac{1}{2} \cdot 0 + (-\frac{1}{2}) \cdot 1 + 1 \cdot 1}{(\frac{1}{2})^2 + (-\frac{1}{2})^2+ 1^2}b'$

$= c - \frac{1}{2}a' - \frac{\frac{1}{2}}{\frac{3}{2}}b' = c - \frac{1}{2}a' - \frac{1}{3}b'$

$= \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} - \frac{1}{2} \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} - \frac{1}{3} \begin{bmatrix} \frac{1}{2} \\ -\frac{1}{2} \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} - \begin{bmatrix} \frac{1}{2} \\ \frac{1}{2} \\ 0 \end{bmatrix} - \begin{bmatrix} \frac{1}{6} \\ -\frac{1}{6} \\ \frac{1}{3} \end{bmatrix}$

$= \begin{bmatrix} -\frac{2}{3} \\ \frac{2}{3} \\ \frac{2}{3} \end{bmatrix}$

Now that we have calculated the orthogonal vectors $a'$, $b'$, and $c'$, we can normalize them to create the orthonormal vectors $q_1$, $q_2$, and $q_3$. We have

$\|a'\| = \sqrt{1^2+1^2 + 0^2} = \sqrt{2}$

$\|b'\| = \sqrt{(\frac{1}{2})^2 + (-\frac{1}{2})^2 + 1^2} = \sqrt{\frac{3}{2}} = \frac{\sqrt{3}}{\sqrt{2}}$

$\|c'\| = \sqrt{(-\frac{2}{3})^2 + (\frac{2}{3})^2 + (\frac{2}{3})^2} = \sqrt{\frac{12}{9}} = \sqrt{\frac{4}{3}} = \frac{2}{\sqrt{3}}$

so that

$q_1 = a' / \|a'\| = \frac{1}{\sqrt{2}} \begin{bmatrix} 1 \\ 1 \\ 0 \end{bmatrix} = \begin{bmatrix} \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} \\ 0 \end{bmatrix}$

$q_2 = b' / \|b'\| = \frac{\sqrt{2}}{\sqrt{3}} \begin{bmatrix} \frac{1}{2} \\ -\frac{1}{2} \\ 1 \end{bmatrix} = \begin{bmatrix} \frac{1}{\sqrt{2}\sqrt{3}} \\ -\frac{1}{\sqrt{2}\sqrt{3}} \\ \frac{2}{\sqrt{2}\sqrt{3}} \end{bmatrix}$

$q_3 = c' / \|c'\| = \frac{\sqrt{3}}{2} \begin{bmatrix} -\frac{2}{3} \\ \frac{2}{3} \\ \frac{2}{3} \end{bmatrix} = \begin{bmatrix} -\frac{1}{\sqrt{3}} \\ \frac{1}{\sqrt{3}} \\ \frac{1}{\sqrt{3}} \end{bmatrix}$
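As a quick sanity check (NumPy assumed), stacking $q_1$, $q_2$, and $q_3$ as the columns of a matrix $Q$ should give $Q^TQ = I$:

```python
# Verify that the three computed vectors are orthonormal.
import numpy as np

q1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)    # a' normalized
q2 = np.array([1.0, -1.0, 2.0]) / np.sqrt(6)   # b' normalized
q3 = np.array([-1.0, 1.0, 1.0]) / np.sqrt(3)   # c' normalized

Q = np.column_stack([q1, q2, q3])
assert np.allclose(Q.T @ Q, np.eye(3))  # columns are orthonormal
```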

## Linear Algebra and Its Applications, Exercise 3.4.13

Exercise 3.4.13. Given the vectors

$a = \begin{bmatrix} 0 \\ 0 \\ 1\end{bmatrix} \quad b = \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} \quad c = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}$

and the matrix $A$ whose columns are $a$, $b$, and $c$, use Gram-Schmidt orthogonalization to factor $A = QR$.

Answer: We first choose $a' = a$. We then have

$b' = b - \frac{(a')^Tb}{(a')^Ta'}a' = b - \frac{0 \cdot 0 + 0 \cdot 1 + 1 \cdot 1}{0 \cdot 0 + 0 \cdot 0 + 1 \cdot 1}a' = b - \frac{1}{1}a' = b - a'$

$= \begin{bmatrix} 0 \\ 1 \\ 1 \end{bmatrix} - \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} = \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}$

We then have

$c' = c - \frac{(a')^Tc}{(a')^Ta'}a' - \frac{(b')^Tc}{(b')^Tb'}b' = c - \frac{0 \cdot 1 + 0 \cdot 1 + 1 \cdot 1}{0 \cdot 0 + 0 \cdot 0 + 1 \cdot 1}a' - \frac{0 \cdot 1 + 1 \cdot 1 + 0 \cdot 1}{0 \cdot 0 + 1 \cdot 1 + 0 \cdot 0}b'$

$= c - \frac{1}{1}a' - \frac{1}{1}b' = c - a' - b'$

$= \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix} - \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} - \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix}$

We have $\|a'\| = \|b'\| = \|c'\| = 1$, so $q_1 = a'$, $q_2 = b'$, and $q_3 = c'$. The matrix $Q$ is then

$Q = \begin{bmatrix} 0&0&1 \\ 0&1&0 \\ 1&0&0 \end{bmatrix}$

The matrix $R$ is then

$R = \begin{bmatrix} q_1^Ta&q_1^Tb&q_1^Tc \\ 0&q_2^Tb&q_2^Tc \\ 0&0&q_3^Tc \end{bmatrix}$

$= \begin{bmatrix} (0 \cdot 0 + 0 \cdot 0 + 1 \cdot 1)&(0 \cdot 0 + 0 \cdot 1 + 1 \cdot 1)&(0 \cdot 1 + 0 \cdot 1 + 1 \cdot 1) \\ 0&(0 \cdot 0 + 1 \cdot 1 + 0 \cdot 1)&(0 \cdot 1 + 1 \cdot 1 + 0 \cdot 1) \\ 0&0&(0 \cdot 1 + 0 \cdot 1 + 1 \cdot 1) \end{bmatrix}$

$= \begin{bmatrix} 1&1&1 \\ 0&1&1 \\ 0&0&1 \end{bmatrix}$

The product of the two matrices is then

$QR = \begin{bmatrix} 0&0&1 \\ 0&1&0 \\ 1&0&0 \end{bmatrix} \begin{bmatrix} 1&1&1 \\ 0&1&1 \\ 0&0&1 \end{bmatrix} = \begin{bmatrix} 0&0&1 \\ 0&1&1 \\ 1&1&1 \end{bmatrix} = A$
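Because the orthogonalized columns here come out as standard basis vectors, $Q$ is a permutation matrix and $R$ is the all-ones upper triangular matrix. A short numerical check (NumPy assumed):

```python
# Verify A = QR for the permutation Q and all-ones upper triangular R.
import numpy as np

Q = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])
R = np.triu(np.ones((3, 3)))  # upper triangular matrix of ones
A = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 1.0], [1.0, 1.0, 1.0]])

assert np.allclose(Q @ R, A)            # the factorization reproduces A
assert np.allclose(Q.T @ Q, np.eye(3))  # Q is orthogonal (a permutation)
```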

## Linear Algebra and Its Applications, Exercise 3.4.12

Exercise 3.4.12. Given the vectors $a_1 = (1, 1)$ and $a_2 = (4, 0)$, find a scalar $c$ such that $a_2 - ca_1$ is orthogonal to $a_1$. Given the matrix $A = \begin{bmatrix} 1&4 \\ 1&0 \end{bmatrix}$ whose columns are $a_1$ and $a_2$ respectively, find matrices $Q$ and $R$ such that $Q$ is orthogonal and $A = QR$.

Answer: We must have $a_1^T(a_2 - ca_1) = 0$. This implies that $a_1^Ta_2 = c(a_1^Ta_1)$ or

$c = \frac{a_1^Ta_2}{a_1^Ta_1} = \frac{1 \cdot 4 + 1 \cdot 0}{1 \cdot 1 + 1 \cdot 1} = \frac{4}{2} = 2$

So the scalar multiplying $a_1$ is 2.

As a check, we then have

$a_2 - ca_1 = \begin{bmatrix} 4 \\ 0 \end{bmatrix} - 2 \cdot \begin{bmatrix} 1 \\ 1 \end{bmatrix} = \begin{bmatrix} 2 \\ -2 \end{bmatrix}$

with $a_1^T(a_2 - ca_1) = 1 \cdot 2 + 1 \cdot (-2) = 2 - 2 = 0$. So the new vector is orthogonal to $a_1$.

We now attempt to factor $A = \begin{bmatrix} 1&4 \\ 1&0 \end{bmatrix}$ into $QR$. If we perform Gram-Schmidt orthogonalization on $A$, the first column of $Q$ is $q_1 = a_1 / \|a_1\|$. We have $\|a_1\| = \sqrt{a_1^Ta_1} = \sqrt{2}$, so $q_1 = (\frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}})$. The length $\|a_1\| = \sqrt{2}$ would then be the 1, 1 entry of the matrix $R$. We thus have

$Q = \begin{bmatrix} \frac{1}{\sqrt{2}}&? \\ \frac{1}{\sqrt{2}}&? \end{bmatrix} \qquad R = \begin{bmatrix} \sqrt{2}&? \\ ?&? \end{bmatrix}$

The second column $q_2$ of $Q$ is created by first subtracting from $a_2$ the projection of $a_2$ onto $q_1$ and then normalizing the result. The result of subtracting the projection is

$a_2 - (q_1^Ta_2)q_1 = \begin{bmatrix} 4 \\ 0 \end{bmatrix} - (\frac{1}{\sqrt{2}} \cdot 4 + \frac{1}{\sqrt{2}} \cdot 0) \cdot \begin{bmatrix} \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} \end{bmatrix}$

$= \begin{bmatrix} 4 \\ 0 \end{bmatrix} - \frac{4}{\sqrt{2}} \cdot \begin{bmatrix} \frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}} \end{bmatrix} = \begin{bmatrix} 4 \\ 0 \end{bmatrix} - \begin{bmatrix} 2 \\ 2 \end{bmatrix} = \begin{bmatrix} 2 \\ -2 \end{bmatrix}$

The length of this vector is $\sqrt{2^2 + (-2)^2} = \sqrt{8} = 2 \sqrt{2}$, so we have

$q_2 = \frac{1}{2 \sqrt{2}} \begin{bmatrix} 2 \\ -2 \end{bmatrix} = \begin{bmatrix} \frac{1}{\sqrt{2}} \\ -\frac{1}{\sqrt{2}} \end{bmatrix}$

The matrix $Q$ is then

$Q = \begin{bmatrix} \frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}}&-\frac{1}{\sqrt{2}} \end{bmatrix}$

The second diagonal entry of the matrix $R$ is the length $2\sqrt{2}$ used in computing $q_2$. The off-diagonal element of $R$ is the value $q_1^Ta_2 = \frac{4}{\sqrt{2}} = 2 \sqrt{2}$ used in subtracting from $a_2$ the component in the direction of $q_1$. We thus have

$R = \begin{bmatrix} \sqrt{2}&2\sqrt{2} \\ 0&2\sqrt{2} \end{bmatrix}$

The product of the two matrices is then

$QR = \begin{bmatrix} \frac{1}{\sqrt{2}}&\frac{1}{\sqrt{2}} \\ \frac{1}{\sqrt{2}}&-\frac{1}{\sqrt{2}} \end{bmatrix} \begin{bmatrix} \sqrt{2}&2\sqrt{2} \\ 0&2\sqrt{2} \end{bmatrix} = \begin{bmatrix} 1&4 \\ 1&0 \end{bmatrix} = A$
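A library QR routine reproduces this factorization up to signs. `np.linalg.qr` uses Householder reflections, so a column of $Q$ and the corresponding row of $R$ may both be negated relative to hand Gram-Schmidt; normalizing so that the diagonal of $R$ is positive recovers the answer above (NumPy assumed):

```python
# Compare np.linalg.qr with the hand-computed Gram-Schmidt factors.
import numpy as np

A = np.array([[1.0, 4.0], [1.0, 0.0]])
Q, R = np.linalg.qr(A)
assert np.allclose(Q @ R, A)  # always reproduces A, whatever the signs

# flip signs so that diag(R) > 0, matching the Gram-Schmidt convention
s = np.sign(np.diag(R))
Q, R = Q * s, (R.T * s).T
assert np.allclose(R, [[np.sqrt(2), 2 * np.sqrt(2)], [0.0, 2 * np.sqrt(2)]])
```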

## Linear Algebra and Its Applications, Exercise 3.4.11

Exercise 3.4.11. If the matrix $Q$ is both upper triangular and orthogonal, show that $Q$ must be a diagonal matrix.

Answer: Let $Q$ be an $n$ by $n$ matrix. Since $Q$ is upper triangular we have

$Q = \begin{bmatrix} q_{11}&q_{12}&\cdots&q_{1n} \\ &q_{22}&\cdots&q_{2n} \\ &&\ddots&\vdots \\ &&&q_{nn} \end{bmatrix}$

where $q_{ij} = 0$ for $i > j$. Our goal is to prove that $Q$ is also diagonal, with $q_{ij} = 0$ for $i \ne j$.

Since $Q$ is an orthogonal matrix all of its columns are orthonormal, and all of its rows are also orthonormal. (See Remark 2 on page 169.) So, in particular, for column 1 we have $\sum_i q_{i1}^2 = 1$ and for row 1 we have $\sum_j q_{1j}^2 = 1$.

But since $Q$ is upper triangular we also have $q_{i1} = 0$ for $i > 1$, so that for column 1 we have

$1 = \sum_i q_{i1}^2 = q_{11}^2 + \sum_{i > 1} q_{i1}^2 = q_{11}^2 + \sum_{i > 1} 0^2 = q_{11}^2$

Since $q_{11}^2 = 1$ we have $q_{11} = \pm 1$.

Turning to row 1 we have

$1 = \sum_j q_{1j}^2 = q_{11}^2 + \sum_{j > 1} q_{1j}^2 = 1 + \sum_{j > 1} q_{1j}^2$

or $\sum_{j > 1} q_{1j}^2 = 0$. But since the square of any nonzero number is positive this implies that $q_{1j} = 0$ for $j > 1$.

For row 1 we thus have $q_{1j} = 0$ for $j \ne 1$. In other words, for row 1 all off-diagonal elements are zero. Given the previous result that $q_{11} = \pm 1$ the matrix $Q$ must therefore look as follows:

$Q = \begin{bmatrix} \pm 1&&& \\ &q_{22}&\cdots&q_{2n} \\ &&\ddots&\vdots \\ &&&q_{nn} \end{bmatrix}$

The argument then proceeds by induction for the other rows: Suppose that for some $k \ge 1$ we have all off-diagonal elements equal to zero for rows 1 through $k$. More formally, we assume that for $1 \le i \le k$ we have $q_{ij} = 0$ for $i \ne j$.

Consider the situation for row $k+1$. Since column $k + 1$ is orthonormal we have

$1 = \sum_i q_{i,k+1}^2 = \sum_{i < k+1} q_{i,k+1}^2 + q_{k+1,k+1}^2 + \sum_{i > k+1} q_{i,k+1}^2$

Since $Q$ is upper triangular we have $q_{i,k+1} = 0$ for $i > k+1$. All elements in the second sum above are therefore zero:

$1 = \sum_i q_{i,k+1}^2 = \sum_{i < k+1} q_{i,k+1}^2 + q_{k+1,k+1}^2 + \sum_{i > k+1} 0^2$

$= \sum_{i < k+1} q_{i,k+1}^2 + q_{k+1,k+1}^2$

But by our assumption all off-diagonal elements in rows 1 through $k$ are also zero. Therefore $q_{i,k+1} = 0$ for $1 \le i \le k$ (or $i < k+1$). All elements in the remaining sum are therefore zero, and we have

$1 = \sum_{i < k+1} q_{i,k+1}^2 + q_{k+1,k+1}^2 = \sum_{i < k+1} 0^2 + q_{k+1,k+1}^2 = q_{k+1,k+1}^2$

so that $q_{k+1,k+1}^2 = 1$ or $q_{k+1,k+1} = \pm 1$.

Since row $k + 1$ is orthonormal we have

$1 = \sum_j q_{k+1,j}^2 = \sum_{j < k+1} q_{k+1,j}^2 + q_{k+1,k+1}^2 + \sum_{j > k+1} q_{k+1,j}^2$

But since $Q$ is upper triangular we have $q_{k+1,j} = 0$ for $k + 1 > j$ (or $j < k+1$), and from above we have $q_{k+1,k+1}^2 = 1$. We thus have

$1 = \sum_{j < k+1} q_{k+1,j}^2 + q_{k+1,k+1}^2 + \sum_{j > k+1} q_{k+1,j}^2$

$= \sum_{j < k+1} 0^2 + 1 + \sum_{j > k+1} q_{k+1,j}^2 = 1 + \sum_{j > k+1} q_{k+1,j}^2$

so that $\sum_{j > k+1} q_{k+1,j}^2 = 0$ and thus $q_{k+1,j} = 0$ for $j > k+1$.

Since we already had $q_{k+1,j} = 0$ for $j < k+1$ we therefore have $q_{k+1,j} = 0$ for $j \ne k+1$.

In other words, all off-diagonal entries in row $k+1$ are zero given our assumption that off-diagonal entries in rows 1 through $k$ were zero. Since this assumption is true for row 1, we have all off-diagonal entries equal to zero for all rows 1 through $n$. More formally, $q_{ij} = 0$ for $j \ne i$ and $1 \le i \le n$.

Since $q_{ij} = 0$ for $i \ne j$ any orthogonal upper triangular matrix $Q$ is therefore also a diagonal matrix. In addition, each element $q_{ii}$ on the diagonal must be either 1 or -1.

## Linear Algebra and Its Applications, Exercise 3.4.10

Exercise 3.4.10. Given the two orthonormal vectors $q_1$ and $q_2$ and an arbitrary vector $b$, what linear combination of $q_1$ and $q_2$ is the least distance from $b$? Show that the difference between $b$ and that combination (i.e., the error vector) is orthogonal to both $q_1$ and $q_2$.

Answer: This exercise is similar to the previous one. Any linear combination of $q_1$ and $q_2$ is in the plane formed by $q_1$ and $q_2$, and the combination closest to $b$ is the projection $p$ of $b$ onto that plane. Because $q_1$ and $q_2$ are orthonormal that projection is equal to the sum of the separate projections of $b$ onto $q_1$ and $q_2$ respectively:

$p = (q_1^Tb)q_1 + (q_2^Tb)q_2$

So $(q_1^Tb)q_1 + (q_2^Tb)q_2$ is the closest combination to $b$.

The error vector is then

$e = b - p = b - [(q_1^Tb)q_1 + (q_2^Tb)q_2]$

Taking the dot product of the error vector with $q_1$ we have

$q_1^Te = q_1^T [b - ((q_1^Tb)q_1 + (q_2^Tb)q_2)] = q_1^Tb - q_1^T(q_1^Tb)q_1 - q_1^T(q_2^Tb)q_2$

$= q_1^Tb - (q_1^Tb)q_1^Tq_1 - (q_2^Tb)q_1^Tq_2$

Since $q_1$ and $q_2$ are orthonormal we have $q_1^Tq_1 = 1$ and $q_1^Tq_2 = 0$. So we have

$q_1^Te = q_1^Tb - (q_1^Tb) \cdot 1 - (q_2^Tb) \cdot 0 = q_1^Tb - q_1^Tb = 0$

Similarly we have

$q_2^Te = q_2^T [b - ((q_1^Tb)q_1 + (q_2^Tb)q_2)] = q_2^Tb - q_2^T(q_1^Tb)q_1 - q_2^T(q_2^Tb)q_2$

$= q_2^Tb - (q_1^Tb)q_2^Tq_1 - (q_2^Tb)q_2^Tq_2 = q_2^Tb - q_2^Tb = 0$
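The orthogonality of the error vector holds for any orthonormal pair and any $b$; a quick numerical illustration (NumPy assumed, names mine):

```python
# The error e = b - p is orthogonal to both q1 and q2.
import numpy as np

rng = np.random.default_rng(0)
# an arbitrary orthonormal pair: columns of the Q factor of a random 3x2 matrix
Q, _ = np.linalg.qr(rng.standard_normal((3, 2)))
q1, q2 = Q[:, 0], Q[:, 1]

b = rng.standard_normal(3)
p = (q1 @ b) * q1 + (q2 @ b) * q2  # projection onto the span of q1 and q2
e = b - p                          # error vector

assert abs(q1 @ e) < 1e-12 and abs(q2 @ e) < 1e-12
```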

## Linear Algebra and Its Applications, Exercise 3.4.9

Exercise 3.4.9. Given the three orthonormal vectors $q_1$, $q_2$, and $q_3$, what linear combination of $q_1$ and $q_2$ is the least distance from $q_3$?

Answer: Any linear combination of $q_1$ and $q_2$ is in the plane formed by $q_1$ and $q_2$. The combination closest to $q_3$ is simply the projection of $q_3$ onto that plane. But because $q_3$ is orthogonal to both $q_1$ and $q_2$ it is orthogonal to that plane, and its projection onto the plane is the zero vector. So the linear combination of $q_1$ and $q_2$ closest to $q_3$ is $0 \cdot q_1 + 0 \cdot q_2$.

## Linear Algebra and Its Applications, Exercise 3.4.8

Exercise 3.4.8. Project the vector $b = (1, 2)$ onto the two non-orthogonal vectors $a_1 = (1, 0)$ and $a_2 = (1, 1)$ and show that the sum of the two projections does not equal $b$ (as it would if $a_1$ and $a_2$ were orthogonal).

Answer: The projection of $b$ onto $a_1$ is $(a_1^Tb/a_1^Ta_1)a_1$. We have $a_1^Tb = (1 \cdot 1 + 0 \cdot 2) = 1$ and $a_1^Ta_1 = (1 \cdot 1 + 0 \cdot 0) = 1$. So the projection of $b$ onto $a_1$ is $\frac{1}{1} a_1 = (1, 0)$.

Similarly, the projection of $b$ onto $a_2$ is $(a_2^Tb/a_2^Ta_2)a_2$. We have $a_2^Tb = (1 \cdot 1 + 1 \cdot 2) = 3$ and $a_2^Ta_2 = (1 \cdot 1 + 1 \cdot 1) = 2$. So the projection of $b$ onto $a_2$ is $\frac{3}{2} a_2 = (\frac{3}{2}, \frac{3}{2})$.

The sum of the two projections is $(1, 0) + (\frac{3}{2}, \frac{3}{2}) = (\frac{5}{2}, \frac{3}{2})$, which is not equal to $b$.
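The computation takes only a few lines in code (a sketch assuming NumPy):

```python
# Projections of b onto the non-orthogonal a1 and a2 do not sum to b.
import numpy as np

b = np.array([1.0, 2.0])
a1 = np.array([1.0, 0.0])
a2 = np.array([1.0, 1.0])

p1 = (a1 @ b) / (a1 @ a1) * a1  # projection onto a1: (1, 0)
p2 = (a2 @ b) / (a2 @ a2) * a2  # projection onto a2: (3/2, 3/2)

assert np.allclose(p1 + p2, [2.5, 1.5])  # (5/2, 3/2), not b
assert not np.allclose(p1 + p2, b)
```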

## Linear Algebra and Its Applications, Exercise 3.4.7

Exercise 3.4.7. Given $b = x_1q_1 + x_2q_2 + \cdots + x_nq_n$ where $q_1, q_2, \dots, q_n$ are orthonormal vectors, compute $b^Tb$ and show that

$\|b\|^2 = x_1^2 + x_2^2 + \cdots + x_n^2$

Answer: We have $b = \sum_i x_iq_i$ so that

$b^Tb = (\sum_i x_iq_i)^T(\sum_j x_jq_j) = (\sum_i x_iq_i^T)(\sum_j x_jq_j)$

since the transpose of a sum is equal to the sum of the transposes. The product of the sums can then be decomposed into two sums of products as follows:

$(\sum_i x_iq_i^T)(\sum_j x_jq_j) = \sum_{i=j} (x_iq_i^T)(x_jq_j) + \sum_{i \ne j} (x_iq_i^T)(x_jq_j)$

$= \sum_{i=j} x_ix_jq_i^Tq_j + \sum_{i \ne j} x_ix_jq_i^Tq_j$

But since $q_1, q_2, \dots, q_n$ are orthonormal vectors we have $q_i^Tq_j = 1$ when $i = j$ and $q_i^Tq_j = 0$ when $i \ne j$, so that

$\sum_{i=j} x_ix_jq_i^Tq_j + \sum_{i \ne j} x_ix_jq_i^Tq_j = \sum_{i = j} x_ix_j \cdot 1 + \sum_{i \ne j} x_ix_j \cdot 0 = \sum_i x_i^2$

We thus have

$\|b\|^2 = \sum_i x_i^2 = x_1^2 + x_2^2 + \cdots + x_n^2$
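The identity is easy to spot-check numerically: taking the columns of an orthogonal matrix as the $q_i$, we have $b = Qx$, and $\|b\|^2$ should equal $\sum_i x_i^2$ (NumPy assumed):

```python
# ||b||^2 = x_1^2 + ... + x_n^2 when b = x_1 q_1 + ... + x_n q_n.
import numpy as np

rng = np.random.default_rng(1)
n = 5
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # orthonormal columns q_1..q_n
x = rng.standard_normal(n)

b = Q @ x                                # b = sum_i x_i q_i
assert np.isclose(b @ b, np.sum(x**2))   # ||b||^2 equals the sum of x_i^2
```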
