## Linear Algebra and Its Applications, Review Exercise 1.12

Review exercise 1.12. State whether the following are true or false. If a statement is true, explain why; if it is false, provide a counterexample.

(a) If $A$ is invertible and $B$ has the same rows as $A$ but in reverse order, then $B$ is invertible as well.

(b) If $A$ and $B$ are both symmetric matrices then their product $AB$ is also a symmetric matrix.

(c) If $A$ and $B$ are both invertible then their product $BA$ is also invertible.

(d) If $A$ is a nonsingular matrix then it can be factored into the product $A = LU$ of a lower triangular and upper triangular matrix.

Answer: (a) True. If $B$ has the same rows as $A$ but in reverse order then we have $B = PA$ where $P$ is the permutation matrix that reverses the order of rows. For example, for the 3 by 3 case we have $P = \begin{bmatrix} 0&0&1 \\ 0&1&0 \\ 1&0&0 \end{bmatrix}$

If we apply $P$ twice then it restores the order of the rows back to the original order; in other words $P^2 = I$ so that $P^{-1} = P$.

If $A$ is invertible then $A^{-1}$ exists. Consider the product $A^{-1}P$. We have $B(A^{-1}P) = (PA)(A^{-1}P) = P(AA^{-1})P = PIP = P^2 = I$

so that $A^{-1}P$ is a right inverse for $B$. We also have $(A^{-1}P)B = (A^{-1}P)(PA) = A^{-1}P^2A = A^{-1}IA = A^{-1}A = I$

so that $A^{-1}P$ is a left inverse for $B$ as well. Since $A^{-1}P$ is both a left and right inverse for $B$ we have $B^{-1} = A^{-1}P$ so that $B$ is invertible if $A$ is.

Incidentally, note that while multiplying by $P$ on the left reverses the order of the rows, multiplying by $P$ on the right reverses the order of the columns. For example, in the 3 by 3 case we have $\begin{bmatrix} 1&2&3 \\ 4&5&6 \\ 7&8&9 \end{bmatrix} \begin{bmatrix} 0&0&1 \\ 0&1&0 \\ 1&0&0 \end{bmatrix} = \begin{bmatrix} 3&2&1 \\ 6&5&4 \\ 9&8&7 \end{bmatrix}$

Thus if $A^{-1}$ exists and $B = PA$ then $B^{-1} = A^{-1}P$ exists and consists of $A^{-1}$ with its columns reversed.
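This is easy to confirm numerically. The sketch below uses NumPy with a hypothetical invertible matrix $A$ (chosen only for illustration; any invertible $A$ works) and checks that $A^{-1}P$ is a two-sided inverse for $B = PA$:

```python
import numpy as np

# A hypothetical invertible matrix, chosen only for illustration.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# P reverses the row order when applied on the left.
P = np.array([[0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0]])

B = P @ A                     # the rows of A in reverse order
B_inv = np.linalg.inv(A) @ P  # the claimed inverse: A^{-1} with its columns reversed

# Both products give the identity, so A^{-1}P really is B^{-1}.
print(np.allclose(B @ B_inv, np.eye(3)))  # True
print(np.allclose(B_inv @ B, np.eye(3)))  # True
```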

(b) False. The product of two symmetric matrices is not necessarily itself a symmetric matrix, as shown by the following counterexample: $\begin{bmatrix} 2&3 \\ 3&1 \end{bmatrix} \begin{bmatrix} 3&5 \\ 5&1 \end{bmatrix} = \begin{bmatrix} 21&13 \\ 14&16 \end{bmatrix}$
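The counterexample above can be verified directly (a quick NumPy check of the same two matrices):

```python
import numpy as np

A = np.array([[2, 3], [3, 1]])  # symmetric
B = np.array([[3, 5], [5, 1]])  # symmetric
C = A @ B

print(C)                        # [[21 13], [14 16]]
print(np.array_equal(C, C.T))   # False: C is not equal to its transpose
```

Incidentally, since $(AB)^T = B^T A^T = BA$ for symmetric $A$ and $B$, the product $AB$ is symmetric exactly when $A$ and $B$ commute.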

(c) True. Suppose that both $A$ and $B$ are invertible; then both $A^{-1}$ and $B^{-1}$ exist. Consider the product matrices $BA$ and $A^{-1}B^{-1}$. We have $(BA)(A^{-1}B^{-1}) = B(AA^{-1})B^{-1} = BIB^{-1} = BB^{-1} = I$

and also $(A^{-1}B^{-1})(BA) = A^{-1}(B^{-1}B)A = A^{-1}IA = A^{-1}A = I$

So $A^{-1}B^{-1}$ is both a left and right inverse for $BA$, and thus $(BA)^{-1} = A^{-1}B^{-1}$. If both $A$ and $B$ are invertible then their product $BA$ is also invertible.
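As a numerical spot check (a sketch only; the matrices here are random but made invertible by adding a multiple of the identity, which is an assumption of this example rather than anything from the text):

```python
import numpy as np

rng = np.random.default_rng(0)

# Adding 3I makes each matrix strictly diagonally dominant, hence invertible.
A = rng.random((3, 3)) + 3 * np.eye(3)
B = rng.random((3, 3)) + 3 * np.eye(3)

lhs = np.linalg.inv(B @ A)                  # (BA)^{-1}
rhs = np.linalg.inv(A) @ np.linalg.inv(B)   # A^{-1} B^{-1}

print(np.allclose(lhs, rhs))  # True: the inverse of a product reverses the order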

(d) False. Even a nonsingular matrix $A$ cannot necessarily be factored into the form $A = LU$, because row exchanges may be needed in order for elimination to succeed. Consider the following counterexample (note that $A$ is nonsingular, with determinant 2): $A = \begin{bmatrix} 0&1&2 \\ 1&1&1 \\ 1&2&1 \end{bmatrix}$

This matrix requires exchanging the first and second rows before elimination can commence. We can do this by multiplying by an appropriate permutation matrix: $PA = \begin{bmatrix} 0&1&0 \\ 1&0&0 \\ 0&0&1 \end{bmatrix} \begin{bmatrix} 0&1&2 \\ 1&1&1 \\ 1&2&1 \end{bmatrix} = \begin{bmatrix} 1&1&1 \\ 0&1&2 \\ 1&2&1 \end{bmatrix}$

We then multiply the (new) first row by 1 and subtract it from the third row (i.e., the multiplier $l_{31} = 1$): $\begin{bmatrix} 1&1&1 \\ 0&1&2 \\ 1&2&1 \end{bmatrix} \rightarrow \begin{bmatrix} 1&1&1 \\ 0&1&2 \\ 0&1&0 \end{bmatrix}$

and then multiply the second row by 1 and subtract it from the third ( $l_{32} = 1$): $\begin{bmatrix} 1&1&1 \\ 0&1&2 \\ 0&1&0 \end{bmatrix} \rightarrow \begin{bmatrix} 1&1&1 \\ 0&1&2 \\ 0&0&-2 \end{bmatrix}$

We then have $L = \begin{bmatrix} 1&0&0 \\ 0&1&0 \\ 1&1&1 \end{bmatrix} \quad U = \begin{bmatrix} 1&1&1 \\ 0&1&2 \\ 0&0&-2 \end{bmatrix}$

and $LU = \begin{bmatrix} 1&0&0 \\ 0&1&0 \\ 1&1&1 \end{bmatrix} \begin{bmatrix} 1&1&1 \\ 0&1&2 \\ 0&0&-2 \end{bmatrix} = \begin{bmatrix} 1&1&1 \\ 0&1&2 \\ 1&2&1 \end{bmatrix} = PA \ne A$

So a nonsingular matrix $A$ cannot always be factored into the form $A = LU$; in general all we can guarantee is $PA = LU$ for some permutation matrix $P$.
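The factors computed above can be checked numerically. This sketch reuses the $P$, $L$, and $U$ from the worked elimination and confirms that $LU$ reproduces $PA$ but not $A$ itself:

```python
import numpy as np

A = np.array([[0.0, 1.0, 2.0],
              [1.0, 1.0, 1.0],
              [1.0, 2.0, 1.0]])

# Elimination stalls on A itself because the (1,1) pivot is zero.
# After exchanging rows 1 and 2 with P, elimination succeeds:
P = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])
L = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [1.0, 1.0, 1.0]])
U = np.array([[1.0, 1.0, 1.0],
              [0.0, 1.0, 2.0],
              [0.0, 0.0, -2.0]])

print(np.allclose(L @ U, P @ A))   # True: PA = LU
print(np.array_equal(L @ U, A))    # False: LU is not A itself
```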

NOTE: This continues a series of posts containing worked out exercises from the (out of print) book Linear Algebra and Its Applications, Third Edition by Gilbert Strang.

If you find these posts useful I encourage you to also check out the more current Linear Algebra and Its Applications, Fourth Edition, Dr Strang’s introductory textbook Introduction to Linear Algebra, Fourth Edition and the accompanying free online course, and Dr Strang’s other books.
