## Partitioning a matrix into block diagonal form, part 2

In part 1 of this series I outlined an informal method to partition a square matrix into block diagonal form. In this post I provide a formal description of such a method, applicable to any $n$ by $n$ square matrix. In doing this I use the concept of a partitioning vector introduced in my original post on multiplying block matrices.

(Note that it’s possible that a better approach to partitioning exists, not to mention better notation; however, this is the best I’ve been able to come up with.)

Let $A$ be an $n$ by $n$ matrix. Following the discussion in my original post, we can divide the $n$ rows and $n$ columns of $A$ into $q$ partitions, where $1 \le q \le n$, using a vector $u$ of nonnegative integer values defined as follows:

$u = \begin{bmatrix} u_1&u_2&\cdots&u_{q+1} \end{bmatrix}$

$u_1 = 0, u_{q+1} = n$

$u_i < u_{i+1} \text{ for } 1 \le i \le q$

where $u_i + 1$ is the first row and column in the $i^{th}$ partition, $u_{i+1}$ is the last row and column in the $i^{th}$ partition, and $u_{i+1} - u_i \ge 1$ is the number of rows and columns in the $i^{th}$ partition.

We then define $q^2$ submatrices of $A$, with a given submatrix defined in terms of the entries of $A$ as follows:

$A_{\alpha \beta} = \begin{bmatrix} a_{u_\alpha+1, u_\beta+1}&a_{u_\alpha+1,u_\beta + 2}&\cdots&a_{u_\alpha+1,u_{\beta+1}} \\ a_{u_\alpha+2, u_\beta+1}&a_{u_\alpha+2,u_\beta+2}&\cdots&a_{u_\alpha+2,u_{\beta+1}} \\ \vdots&\vdots&\ddots&\vdots \\ a_{u_{\alpha+1}, u_\beta+1}&a_{u_{\alpha+1},u_\beta+2}&\cdots&a_{u_{\alpha+1},u_{\beta+1}} \end{bmatrix}$

so that the $(i, j)$ entry of $A_{\alpha \beta}$ has the value $(A_{\alpha \beta})_{ij} = a_{u_\alpha+i,u_\beta+j}$.

When $q = 1$ we have $u = \begin{bmatrix} 0&n \end{bmatrix}$ and $A = A_{11}$. When $q = n$ we have $u = \begin{bmatrix} 0&1&\cdots&n \end{bmatrix}$ so that $u_i = i-1, u_j = j-1$ and $A_{\alpha \beta} = \begin{bmatrix} a_{\alpha \beta} \end{bmatrix}$. (For a more detailed discussion of why this is so please see the previous post on multiplying block matrices.)
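To make the indexing concrete, here is a small sketch in Python (my own illustration; the post itself has no code, and the function name `submatrix` is mine) of extracting $A_{\alpha \beta}$ from $A$ given a partitioning vector $u$:

```python
def submatrix(A, u, alpha, beta):
    # A is an n x n matrix stored as a list of rows, with the math entry
    # a_{ij} at A[i-1][j-1]; u is the partitioning vector [u_1, ..., u_{q+1}].
    # The (i, j) entry of the result is a_{u_alpha + i, u_beta + j}.
    rows = range(u[alpha - 1], u[alpha])  # 0-based rows u_alpha .. u_{alpha+1} - 1
    cols = range(u[beta - 1], u[beta])    # 0-based cols u_beta  .. u_{beta+1} - 1
    return [[A[i][j] for j in cols] for i in rows]

# A 4 by 4 matrix partitioned into q = 2 blocks by u = [0, 2, 4]:
A = [[1, 2, 0, 0],
     [3, 4, 0, 0],
     [0, 0, 5, 6],
     [0, 0, 7, 8]]
u = [0, 2, 4]
print(submatrix(A, u, 1, 1))  # first diagonal block: [[1, 2], [3, 4]]
```

With $q = n$ (that is, $u = \begin{bmatrix} 0&1&\cdots&n \end{bmatrix}$) each call returns a single-entry block $\begin{bmatrix} a_{\alpha \beta} \end{bmatrix}$, matching the special case above.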

We now attempt to partition $A$ into block diagonal form by constructing a partitioning vector $v$ as follows:

We first set $v_1 = 0$, so that the first submatrix on the diagonal, $A_1$, will start with row 1 and column 1. Then for $\alpha = 1, 2, 3, \dotsc$ in turn, we calculate $v_{\alpha + 1}$ by looking for the smallest $k > v_{\alpha}$ such that

$a_{ij} = 0 \text{ if } v_{\alpha} + 1 \le i \le k \text{ and } j > k$

and

$a_{ij} = 0 \text{ if } i > k \text{ and } v_{\alpha} + 1 \le j \le k$

Note that the first property corresponds to having all zeroes to the right of the submatrix $A_{\alpha}$ on the diagonal, while the second property corresponds to having all zeroes below $A_{\alpha}$.

If no such $k$ exists then we set $v_{\alpha+1} = n$; the partitioning of the matrix $A$ is then complete, with the number of partitions produced by the partitioning vector $v$ being $r = \alpha$.

Otherwise (i.e., if such a $k$ does exist) we set $v_{\alpha+1} = k$, and repeat the process with $\alpha + 1$ in place of $\alpha$ (to calculate $v_{\alpha+2}$), and so on until the process terminates.
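The construction just described is straightforward to render in code. The following is my own Python sketch (the post has no implementation, and the name `block_diagonal_partition` is mine), using 0-based indices internally while returning $v$ in the convention above ($v_1 = 0$, last entry $n$):

```python
def block_diagonal_partition(A):
    # A is an n x n matrix as a list of rows. Starting from v_1 = 0, each
    # step finds the smallest k > v_alpha such that every entry to the right
    # of rows v_alpha+1..k (columns > k) and below columns v_alpha+1..k
    # (rows > k) is zero; if no such k < n exists, the block extends to n.
    n = len(A)
    v = [0]
    while v[-1] < n:
        start = v[-1]  # v_alpha: the 0-based row/column just before the block
        for k in range(start + 1, n):
            right_zero = all(A[i][j] == 0
                             for i in range(start, k) for j in range(k, n))
            below_zero = all(A[i][j] == 0
                             for i in range(k, n) for j in range(start, k))
            if right_zero and below_zero:
                v.append(k)  # found the end of the current diagonal block
                break
        else:
            v.append(n)      # no suitable k: the final block runs to n
    return v

# A 3 by 3 matrix with two diagonal blocks ([[1, 2], [3, 4]] and [[5]]):
B = [[1, 2, 0],
     [3, 4, 0],
     [0, 0, 5]]
print(block_diagonal_partition(B))  # [0, 2, 3], i.e. r = 2 partitions
```

A diagonal matrix yields the finest partition ($v = [0, 1, \dotsc, n]$), while a matrix with no off-diagonal zero blocks yields the trivial one ($v = [0, n]$).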

Note that we are guaranteed that the process described above terminates in no more than $n$ steps. As described above, we start by setting $v_1 = 0$, and then for our first step we set $\alpha = 1$ and compute $v_2$. If in computing $v_2$ we cannot find a suitable $k$ then the process terminates in one step as we set $v_2 = n$, with the resulting vector $v = \begin{bmatrix} 0&n \end{bmatrix}$.

If, on the other hand, we are able to find a suitable $k$ then by the definition of $k$ we have $v_2 = k > v_1$ and therefore $v_2 \ge v_1 + 1 = 1$. We then repeat the process with $\alpha = 2$ to find $v_3$ and by similar reasoning have $v_3 \ge v_2 + 1$ and thus $v_3 \ge 2$. In general we see that $v_{\alpha} \ge \alpha - 1$.

Suppose that the process has not terminated when we come to $\alpha = n$ (i.e., the $n^{th}$ step in the process). From the previous paragraph we know that $v_{n} \ge n-1$. Substituting into the above conditions relating to $k$, in assigning $v_{n+1}$ we are looking for $k > v_n \ge n-1$ where

$a_{ij} = 0 \text{ if } v_n + 1 \le i \le k \text{ and } j > k$

and

$a_{ij} = 0 \text{ if } i > k \text{ and } v_n + 1 \le j \le k$

Since $k > v_n \ge n-1$ we conclude that $k \ge n$. But $k$ marks the last row and column of a partition of $A$, and $A$ is an $n$ by $n$ matrix with no entries $a_{ij}$ for $i > n$ or $j > n$, so the only possible value is $k = n$: there is nothing to the right of or below a block ending at row and column $n$, and both conditions then hold vacuously. We therefore set $v_{n+1} = n$ and terminate the process. We are thus guaranteed that the process of constructing the vector $v$ will terminate in at most $n$ steps.

Also note that the vector $v$ thus defined is a valid partition of $A$ according to our definition above:

• The first element $v_1 = 0$.
• All elements of $v$ are nonnegative integers.  ($v_1$ is zero as noted, and all the other elements $v_i$ are chosen to be row and column indices of $A$, and hence are positive integers.)
• We have $v_i < v_{i+1}$ for all $i$. (This follows from the way $v_{\alpha+1}$ was defined relative to $v_{\alpha}$.)
• Since there are at most $n$ steps after the initial assignment of $v_1 = 0$, the maximum length of $v$ is $n+1$, and for the number of partitions $r$ (which is one less than the length of $v$) we thus have $1 \le r \le n$.
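These properties can also be checked mechanically. A minimal sketch (again my own; `is_valid_partition` is a hypothetical helper name), taking a candidate vector $v$ as a Python list and the matrix size $n$:

```python
def is_valid_partition(v, n):
    # v should satisfy: first entry 0, last entry n, all entries nonnegative
    # integers, strictly increasing, and length between 2 and n + 1
    # (so that the number of partitions r = len(v) - 1 satisfies 1 <= r <= n).
    return (v[0] == 0
            and v[-1] == n
            and all(isinstance(x, int) and x >= 0 for x in v)
            and all(v[i] < v[i + 1] for i in range(len(v) - 1))
            and 2 <= len(v) <= n + 1)

print(is_valid_partition([0, 2, 3], 3))  # True: two partitions of a 3 x 3 matrix
```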

Having constructed the partitioning vector $v$, in my next post I show that $v$ partitions $A$ into block diagonal form.

This entry was posted in linear algebra.

### 2 Responses to Partitioning a matrix into block diagonal form, part 2

1. I want something like a “fuzzy” block diagonal form of a (positive definite, symmetric) matrix that is a weighted adjacency matrix, such that “weak” elements/links, defined by some criterion (graph-topological or in some norm), lie outside the band/blocks on the diagonal, which will then contain the “strong” edges/elements (defined as the inverse of “weak”).

This is kind of like the problem of minimizing matrix bandwidth solved by e.g.
http://en.wikipedia.org/wiki/Cuthill%E2%80%93McKee_algorithm

Do you have an implementation of your method described above?
Do you have some pointers for me in the literature or other thoughts?

• hecker says:

Unfortunately I’m not a professional mathematician, have no familiarity with the relevant literature, and thus cannot provide pointers for your particular problem. (I also haven’t implemented this as a computer program.)

I just have a B.S. in (applied) mathematics, and that was many many years ago. I did this series of posts because I was really (really) bothered by the fact that I couldn’t find online in the usual places (Wikipedia, Wolfram MathWorld, ProofWiki, etc.) any rigorous description of how to partition a matrix into block diagonal form such that the partitioning was maximal and unique. So I decided to try my hand at writing one.

I have no idea whether the resulting proof would pass muster with “real” mathematicians. I also have to believe that I’m just (imperfectly) replicating work that someone somewhere has already done in better form. Or maybe people think this is an obvious fact, and therefore trivial and uninteresting, but it didn’t seem obvious to me at least.