## Partitioning a matrix into block diagonal form, part 4

In my previous two posts I described a method for constructing a partitioning vector $v$ for an $n$ by $n$ square matrix $A$ (part 2 of this series) and showed that the vector thus constructed partitions $A$ into block diagonal form (part 3). In this post I begin the task of showing that the vector $v$ is maximal and unique: if $v$ partitions $A$ into $r$ blocks then $A$ cannot also be partitioned as a block diagonal matrix with $s$ partitions, where $s > r$, and if we partition $A$ into block diagonal form with $r$ partitions using a vector $w$ then we have $w = v$.

In the previous posts I showed that we can construct a vector $v$ that will partition $A$ into block diagonal form with $r$ partitions (where $r$ is one less than the length of $v$), such that

$A = \begin{bmatrix} A_{1}&0&\cdots&0 \\ 0&A_{2}&\cdots&0 \\ \vdots&\vdots&\ddots&\vdots \\ 0&0&\cdots&A_{r} \end{bmatrix}$
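As a concrete check on what "partitions $A$ into block diagonal form" means, here is a small Python sketch. The helper name `is_block_diagonal` and the list-of-lists matrix representation are my own, not from the series; the list `v` is 0-indexed, so the post's $v_k$ is `v[k-1]`, and block $t$ covers the post's rows and columns $v_t + 1$ through $v_{t+1}$.

```python
def is_block_diagonal(A, v):
    """Check that A is zero outside the diagonal blocks delimited by v.

    v is a partition vector: v[0] = 0 < v[1] < ... < v[-1] = n.
    (A hypothetical helper for illustration, not code from the series.)
    """
    n = len(A)
    assert v[0] == 0 and v[-1] == n
    for t in range(len(v) - 1):
        lo, hi = v[t], v[t + 1]          # block spans 0-based rows lo..hi-1
        for i in range(lo, hi):
            # entries to the right of the block must vanish
            if any(A[i][j] != 0 for j in range(hi, n)):
                return False
            # entries below the block must vanish
            if any(A[j][i] != 0 for j in range(hi, n)):
                return False
    return True

# A 4x4 matrix with diagonal blocks of sizes 2, 1, 1: v = [0, 2, 3, 4]
A = [[1, 2, 0, 0],
     [3, 4, 0, 0],
     [0, 0, 5, 0],
     [0, 0, 0, 6]]
print(is_block_diagonal(A, [0, 2, 3, 4]))     # True
print(is_block_diagonal(A, [0, 1, 2, 3, 4]))  # False (A[0][1] = 2)
```

Checking only to the right of and below each block suffices, since every off-block entry lies to the right of or below some diagonal block.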

Now suppose that we have a vector $w$ that also partitions $A$ into block diagonal form, using $s$ partitions:

$A = \begin{bmatrix} A'_{1}&0&\cdots&0 \\ 0&A'_{2}&\cdots&0 \\ \vdots&\vdots&\ddots&\vdots \\ 0&0&\cdots&A'_{s} \end{bmatrix}$

My approach in this post is to try to find some sort of connection between $w$ and $v$. In particular, I claim that for all $\alpha$ where $1 \le \alpha \le s+1$ we have $w_{\alpha} = v_{\beta}$ for some $\beta$ where $1 \le \beta \le r+1$. In other words, any partition boundary specified by $w$ is also a partition boundary specified by $v$.

My proof is by induction. Since $w$ and $v$ are both partition vectors we already have $w_1 = v_1 = 0$ by definition. The claim therefore holds true for $\alpha = 1$ (with $\beta = 1$ in this case). Suppose that for some $\alpha$ we have $w_{\alpha} = v_{\beta}$ for some $\beta$ (with $1 \le \beta \le r+1$), and consider $w_{\alpha+1}$. We want to show that $w_{\alpha+1} = v_{\gamma}$ for some $\gamma$ where $1 \le \gamma \le r+1$.

We can select $\gamma > \beta$ such that $1 \le \gamma \le r+1$ and $w_{\alpha} \le v_{\gamma-1} < w_{\alpha+1} \le v_{\gamma}$. This is done as follows: since $w_{\alpha+1} \le n = v_{r+1}$ we know that $w_{\alpha+1} \le v_k$ for at least one $k$ (namely $k = r+1$), so we may select $\gamma$ to be the smallest $k$ less than or equal to $r+1$ for which $w_{\alpha+1} \le v_k$. Since $w$ is a partition vector we have $w_{\alpha+1} > w_{\alpha} = v_{\beta}$, and since $v$ is strictly increasing, any $k \le \beta$ has $v_k \le v_{\beta} < w_{\alpha+1}$; we therefore must have $\gamma > \beta$, so that $\gamma - 1 \ge \beta \ge 1$ and thus $2 \le \gamma \le r+1$. The minimality of $\gamma$ then gives $w_{\alpha+1} > v_{\gamma-1}$: if we had $w_{\alpha+1} \le v_{\gamma-1}$ then $\gamma$ would not be the smallest $k$ for which $w_{\alpha+1} \le v_k$. Finally, $\gamma - 1 \ge \beta$ implies $v_{\gamma-1} \ge v_\beta = w_{\alpha}$. The result is that $\gamma$ has the properties noted above.
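The selection of $\gamma$ is just a minimality search, which can be sketched as follows (a hypothetical helper of my own; the post's 1-based $v_k$ is `v[k-1]` in the 0-indexed Python list):

```python
def select_gamma(v, w_next):
    """Return the smallest k (1-based, as in the post) with w_next <= v_k.

    v is the partition vector as a Python list with v[0] == 0 and
    v[-1] == n, so v_{gamma-1} < w_next <= v_gamma by minimality.
    """
    for k, vk in enumerate(v, start=1):
        if w_next <= vk:
            return k
    raise ValueError("w_next exceeds n = v[-1], not a valid boundary")

v = [0, 2, 3, 4]             # r = 3 partitions of a 4x4 matrix
print(select_gamma(v, 3))    # 3, since v_2 = 2 < 3 <= v_3 = 3
```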

From the above we see that $w_{\alpha+1} \le v_{\gamma}$. I now claim that $w_{\alpha+1} = v_{\gamma}$. The proof is by contradiction. Suppose instead that $w_{\alpha+1} < v_{\gamma}$. The submatrix $A'_{\alpha}$ includes entries from rows $w_\alpha + 1$ through $w_{\alpha+1}$ of $A$ and columns $w_\alpha + 1$ through $w_{\alpha+1}$ of $A$. Since $w$ partitions $A$ into block diagonal form we know that any submatrices to the right of and below $A'_{\alpha}$ are zero (if such submatrices exist).

The requirement that all submatrices to the right of $A'_{\alpha}$ be zero then means that we must have $a_{ij} = 0$ if $w_{\alpha} + 1 \le i \le w_{\alpha+1}$ and $j > w_{\alpha+1}$, and the requirement that all submatrices below $A'_{\alpha}$ be zero means that we must also have $a_{ij} = 0$ if $i > w_{\alpha+1}$ and $w_{\alpha} + 1 \le j \le w_{\alpha+1}$.

From the way we selected $\gamma$ we have $w_{\alpha} \le v_{\gamma-1} < w_{\alpha+1}$, which means that we also have $a_{ij} = 0$ if $v_{\gamma-1} + 1 \le i \le w_{\alpha+1}$ and $j > w_{\alpha+1}$, and $a_{ij} = 0$ if $i > w_{\alpha+1}$ and $v_{\gamma-1} + 1 \le j \le w_{\alpha+1}$.

But recall from the definition of $v$ that $v_{\gamma}$ was chosen to be the smallest $k$ such that $a_{ij} = 0$ if either $v_{\gamma-1} + 1 \le i \le k$ and $j > k$, or if $i > k$ and $v_{\gamma-1} + 1 \le j \le k$; alternatively $v_{\gamma}$ was set to $n$ if no such $k$ existed. In the former case $k = w_{\alpha+1}$ satisfies the criteria above, so by minimality we must have $v_{\gamma} \le w_{\alpha+1}$, which contradicts our assumption that $v_{\gamma} > w_{\alpha+1}$. In the latter case we have $v_{\gamma} = n > w_{\alpha+1}$, but then there is indeed a $k$ fulfilling the criteria above, namely $k = w_{\alpha+1}$, which contradicts the assumption that no such $k$ existed.
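The construction of $v$ recalled here can be sketched as a greedy search. This is my own reconstruction from the description above, not the code from part 2; indexing is 0-based, so the post's row and column ranges $v_{\gamma-1}+1 \le i \le k$ become `range(prev, k)`.

```python
def partition_vector(A):
    """Greedily build the partition vector v described in the post.

    Starting from v_1 = 0, each subsequent entry is the smallest k
    greater than the previous entry such that A is zero to the right
    of and below rows/columns v_prev+1 .. k, or n if no such k < n
    exists.  (A sketch reconstructed from the text.)
    """
    n = len(A)
    v = [0]
    while v[-1] < n:
        prev = v[-1]
        nxt = n  # fall back to n if no smaller boundary works
        for k in range(prev + 1, n):
            right_zero = all(A[i][j] == 0
                             for i in range(prev, k) for j in range(k, n))
            below_zero = all(A[i][j] == 0
                             for i in range(k, n) for j in range(prev, k))
            if right_zero and below_zero:
                nxt = k
                break
        v.append(nxt)
    return v

A = [[1, 2, 0, 0],
     [3, 4, 0, 0],
     [0, 0, 5, 0],
     [0, 0, 0, 6]]
print(partition_vector(A))  # [0, 2, 3, 4]
```

Because each entry is chosen minimally, any $k$ satisfying the zero criteria bounds the next entry from above, which is exactly the minimality used in the contradiction argument.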

We therefore conclude that given $w_{\alpha} = v_{\beta}$ for some $\beta$ (where $1 \le \beta \le r+1$), there exists $\gamma$ such that $w_{\alpha+1} = v_{\gamma}$ (where $1 \le \gamma \le r+1$). Combined with the fact that this is true for $\alpha = 1$, we conclude that for any $\alpha$ (where $1 \le \alpha \le s+1$) there exists $\beta$ such that $1 \le \beta \le r+1$ and $w_{\alpha} = v_{\beta}$, so that any partition boundary specified by $w$ is also a partition boundary specified by $v$.
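The conclusion that every partition boundary specified by $w$ is also a boundary specified by $v$ amounts to a subset check on the vectors' entries. A quick sketch (the helper name is mine):

```python
def boundaries_refine(w, v):
    """True if every partition boundary in w is also a boundary in v,
    i.e. the entries of w form a subset of the entries of v."""
    return set(w) <= set(v)

v = [0, 2, 3, 4]        # the boundaries produced by the construction
w = [0, 3, 4]           # a coarser block diagonal partitioning
print(boundaries_refine(w, v))  # True
```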

In part 5 of this series I use this result to complete the task of showing that the vector $v$ is maximal and unique.

This entry was posted in linear algebra.