## Partitioning a matrix into block diagonal form, part 5

In previous posts in this series I informally explored how to partition a matrix into block diagonal form (part 1), described a method for constructing a partitioning vector $v$ for an $n$ by $n$ square matrix $A$ (part 2), showed that the vector thus constructed partitions $A$ into block diagonal form (part 3), and proved a preliminary result about the relationship between that vector and any other vector partitioning $A$ into block diagonal form (part 4). In this post I complete the task of showing that the vector $v$ is maximal and unique: $A$ cannot also be partitioned as a block diagonal matrix with $s$ partitions, where $s > r$, and if we partition $A$ into block diagonal form with $r$ partitions using a vector $w$ then we have $w = v$.

In the previous posts I showed that if we have a vector $w$ that also partitions $A$ into block diagonal form, using $s$ partitions:

$A = \begin{bmatrix} A'_{1}&0&\cdots&0 \\ 0&A'_{2}&\cdots&0 \\ \vdots&\vdots&\ddots&\vdots \\ 0&0&\cdots&A'_{s} \end{bmatrix}$

then for all $\alpha$ where $1 \le \alpha \le s+1$ we have $w_{\alpha} = v_{\beta}$ for some $\beta$ where $1 \le \beta \le r+1$. In other words, any partition boundary specified by $w$ is also a partition boundary specified by $v$.

Note in addition that if $w_{\alpha} = v_{\beta}$ for some $\alpha$ and $\beta$, and also $w_{\gamma} = v_{\delta}$ for some $\gamma \ne \alpha$, then $\delta \ne \beta$. This follows from the fact that $w$ and $v$ are partition vectors: Assume that $\alpha < \gamma$; then we have $w_{\alpha} < w_{\gamma}$ and thus $v_{\beta} < v_{\delta}$, which in turn implies that $\beta < \delta$. (The proof of this last statement is by contradiction: If we instead had $\beta \ge \delta$ then because $v$ is a partition vector we would have $v_{\beta} \ge v_{\delta}$, contradicting $v_{\beta} < v_{\delta}$.) Similarly $\alpha > \gamma$ implies that $\beta > \delta$. So each entry $w_{\alpha}$ of $w$ corresponds to a unique entry $v_{\beta}$ of $v$.
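This order-preserving correspondence between the entries of $w$ and $v$ can be sketched in a few lines of Python. The vectors below are hypothetical examples (represented as 0-based lists of block boundaries), not drawn from any particular matrix:

```python
def boundary_indices(w, v):
    """For each entry w[alpha], return the index beta with w[alpha] == v[beta].

    Both w and v are partition vectors, i.e. strictly increasing lists of
    block boundaries. Raises KeyError if some boundary of w is not also a
    boundary of v.
    """
    position = {value: beta for beta, value in enumerate(v)}
    return [position[entry] for entry in w]

# Hypothetical example: v splits a 6x6 matrix into r = 3 blocks,
# w splits it into s = 2 blocks using a subset of the same boundaries.
v = [0, 2, 4, 6]
w = [0, 2, 6]
betas = boundary_indices(w, v)  # [0, 1, 3]
# Distinct entries of w land on distinct, strictly increasing indices into v.
assert betas == sorted(set(betas))
```

Since the indices produced are distinct, $w$ can never have more entries than $v$, which is the pigeonhole step used below.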

I’m now ready to prove that $v$ is a maximal partitioning of $A$ into block diagonal form with $r$ partitions. Assume that $w$ also partitions $A$ into block diagonal form with $s$ partitions. We know from above that each entry $w_{\alpha}$ of $w$ corresponds to a unique entry $v_{\beta}$ of $v$. But $v$ has only $r+1$ entries, so the number of entries $s+1$ in $w$ must be less than or equal to $r+1$. (Otherwise two or more entries of $w$ would be equal to the same entry of $v$.) We thus have $s \le r$. The vector $v$ is therefore a maximal partitioning of $A$ into block diagonal form.

My final task is to prove that $v$ is a unique partitioning of $A$ into block diagonal form with $r$ partitions. Suppose that $w$ also partitions $A$ into block diagonal form with $r$ partitions:

$A = \begin{bmatrix} A'_{1}&0&\cdots&0 \\ 0&A'_{2}&\cdots&0 \\ \vdots&\vdots&\ddots&\vdots \\ 0&0&\cdots&A'_{r} \end{bmatrix}$

Since $w$ has $r+1$ entries, each equal to a unique entry of $v$, and $v$ also has $r+1$ entries, each of the entries of $v$ is in turn equal to an entry of $w$. In other words, for all $\delta$, where $1 \le \delta \le r+1$, there exists a $\gamma$ such that $v_{\delta} = w_{\gamma}$. (The proof of this is by contradiction: Suppose that for some $\delta$ there were no $\gamma$ such that $v_{\delta} = w_{\gamma}$. Then since $w$ has $r+1$ entries, the same number as $v$, at least two of the entries of $w$ would have to be equal to the same entry of $v$, and hence equal to each other. But this is impossible since $w$ is a partition vector.)

Moreover, if $v_{\delta} = w_{\gamma}$ for some $\delta$ and $\gamma$, and $v_{\beta} = w_{\alpha}$ for some $\beta \ne \delta$, then $\gamma \ne \alpha$. (The proof of this again follows directly from the fact that $v$ and $w$ are partition vectors: Assume without loss of generality that $\delta < \beta$; then we have $v_{\delta} < v_{\beta}$ and thus $w_{\gamma} < w_{\alpha}$, which in turn implies that $\gamma < \alpha$.) So each entry $v_{\delta}$ of $v$ corresponds to a unique entry $w_{\gamma}$ of $w$.

I now claim that $w_{\alpha} = v_{\alpha}$ for $1 \le \alpha \le r+1$. We already have $w_1 = v_1$ by the definition of a partition vector. Now suppose $w_{\alpha} = v_{\alpha}$ for some $\alpha$, and consider $w_{\alpha +1}$ and $v_{\alpha +1}$ (assuming they exist, that is, where $\alpha \le r$).

We know that $w_{\alpha +1} = v_{\beta}$ for some $\beta$ where $1 \le \beta \le r+1$. Suppose $\beta < \alpha+1$. Then we would have $\beta \le \alpha$ and thus $w_{\alpha +1} = v_{\beta} \le v_{\alpha} = w_{\alpha}$. But this cannot be, since $w$ is a partition vector and we are guaranteed that $w_{\alpha +1} > w_{\alpha}$. So we must have $\beta \ge \alpha+1$. We also know that $v_{\alpha+1} = w_{\gamma}$ for some $\gamma$ where $1 \le \gamma \le r+1$. By a similar argument we must also have $\gamma \ge \alpha+1$.

Now since $\gamma \ge \alpha+1$ we have $w_{\gamma} \ge w_{\alpha+1}$, and since $\beta \ge \alpha+1$ we have $v_{\beta} \ge v_{\alpha+1}$. But we also have $v_{\beta} = w_{\alpha+1} \le w_{\gamma} = v_{\alpha+1}$. Since we have both $v_{\beta} \ge v_{\alpha+1}$ and $v_{\beta} \le v_{\alpha+1}$ we must have $v_{\beta} = v_{\alpha+1}$. Because the entries of $v$ are distinct, this in turn implies that $\beta = \alpha+1$ and thus $w_{\alpha+1} = v_{\beta} = v_{\alpha+1}$.

We thus have $w_1 = v_1$, and given $w_{\alpha} = v_{\alpha}$ for $1 \le \alpha \le r$ we have $w_{\alpha+1} = v_{\alpha+1}$. By induction we have $w_{\alpha} = v_{\alpha}$ for $1 \le \alpha \le r+1$. Since both $w$ and $v$ have only $r+1$ entries we therefore have $w = v$.
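The uniqueness argument just completed can be mirrored as a small Python check: two partition vectors of the same length that start at the same value, and whose entries are drawn from one another, must agree entry by entry. This is a minimal sketch with hypothetical example vectors (again 0-based lists):

```python
def is_partition_vector(u):
    """A partition vector is strictly increasing."""
    return all(a < b for a, b in zip(u, u[1:]))

def must_coincide(w, v):
    """Mirror of the uniqueness argument: given the hypotheses proved in the
    text (w and v are partition vectors of equal length, w[0] == v[0], and
    each vector's entries all appear in the other), w and v coincide."""
    assert is_partition_vector(w) and is_partition_vector(v)
    assert len(w) == len(v) and w[0] == v[0]
    assert set(w) == set(v)
    # Strictly increasing sequences over the same set of values are equal.
    return w == v

assert must_coincide([0, 2, 4, 6], [0, 2, 4, 6])
```

The final `return` always yields `True` when the preconditions hold, which is exactly what the induction establishes.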

Since $w$ partitions $A$ into block diagonal form with $r$ partitions, and since $w = v$ as just shown, for any diagonal submatrix $A'_{\alpha}$ where $1 \le \alpha \le r$ we have

$A'_{\alpha} = \begin{bmatrix} a_{w_\alpha+1, w_\alpha+1}&a_{w_\alpha+1,w_\alpha + 2}&\cdots&a_{w_\alpha+1,w_{\alpha+1}} \\ a_{w_\alpha+2, w_\alpha+1}&a_{w_\alpha+2,w_\alpha+2}&\cdots&a_{w_\alpha+2,w_{\alpha+1}} \\ \vdots&\vdots&\ddots&\vdots \\ a_{w_{\alpha+1}, w_\alpha+1}&a_{w_{\alpha+1},w_\alpha+2}&\cdots&a_{w_{\alpha+1},w_{\alpha+1}} \end{bmatrix}$

$= \begin{bmatrix} a_{v_\alpha+1, v_\alpha+1}&a_{v_\alpha+1,v_\alpha + 2}&\cdots&a_{v_\alpha+1,v_{\alpha+1}} \\ a_{v_\alpha+2, v_\alpha+1}&a_{v_\alpha+2,v_\alpha+2}&\cdots&a_{v_\alpha+2,v_{\alpha+1}} \\ \vdots&\vdots&\ddots&\vdots \\ a_{v_{\alpha+1}, v_\alpha+1}&a_{v_{\alpha+1},v_\alpha+2}&\cdots&a_{v_{\alpha+1},v_{\alpha+1}} \end{bmatrix} = A_{\alpha}$

Since $A'_{\alpha} = A_{\alpha}$ for $1 \le \alpha \le r$, the vector $v$ uniquely partitions $A$ into block diagonal form with $r$ partitions. Combined with the result above this concludes the proof: For any $n$ by $n$ square matrix $A$ we can uniquely and maximally partition $A$ into block diagonal form with one or more diagonal submatrices, using a vector $v$ constructed according to the method outlined in part 2 of this series.
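To close the series out concretely, here is one way the construction can be exercised in Python. This is my own sketch rather than a verbatim transcription of the method from part 2: it takes index $k$ to be a block boundary exactly when every entry coupling the first $k$ rows and columns to the remaining $n-k$ rows and columns is zero, which produces the maximal (and, by the argument above, unique) partition vector.

```python
def maximal_partition_vector(A):
    """Return the maximal partition vector of the square matrix A (a list of
    lists), represented 0-based: the first entry is 0, the last is n, and an
    interior index k appears whenever A splits cleanly into a leading k x k
    block and a trailing (n-k) x (n-k) block.
    A sketch of the series' construction, not necessarily its exact method.
    """
    n = len(A)
    v = [0]
    for k in range(1, n):
        if all(A[i][j] == 0 and A[j][i] == 0
               for i in range(k) for j in range(k, n)):
            v.append(k)
    v.append(n)
    return v

A = [[1, 2, 0, 0],
     [3, 4, 0, 0],
     [0, 0, 5, 0],
     [0, 0, 0, 6]]
assert maximal_partition_vector(A) == [0, 2, 3, 4]  # blocks of sizes 2, 1, 1
```

No finer partitioning of this example exists (the nonzero entry $a_{12} = 2$ rules out a boundary at index 1), and by the uniqueness result any other vector producing three blocks must equal $[0, 2, 3, 4]$.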
