Monday, March 3, 2014

Cyclic Tridiagonal Matrices

In a recent post, I talked about how the Thomas algorithm can be used to solve a tridiagonal system of equations very efficiently. Recall that a tridiagonal matrix looks like \[A = \begin{bmatrix} b_1 & c_1 & 0 & \cdots & 0\\ a_2 & b_2 & c_2 & \cdots & 0 \\ 0 & \ddots & \ddots & \ddots & 0 \\ 0 & \cdots & a_{n-1} & b_{n-1} & c_{n-1} \\ 0 & \cdots & 0 & a_n & b_n\\ \end{bmatrix}.\] In some applications with periodic boundaries, a variant called the cyclic tridiagonal matrix arises, \[A' = \begin{bmatrix} b_1 & c_1 & 0 & \cdots & \beta\\ a_2 & b_2 & c_2 & \cdots & 0 \\ 0 & \ddots & \ddots & \ddots & 0 \\ 0 & \cdots & a_{n-1} & b_{n-1} & c_{n-1} \\ \alpha & \cdots & 0 & a_n & b_n\\ \end{bmatrix}.\] Because of the two corner entries \(\alpha\) and \(\beta\), the efficient Thomas algorithm, and any optimized program we may have written to implement it, cannot be applied directly.
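As a concrete example of where such a matrix shows up, the standard central-difference approximation of a second derivative on a periodic grid with spacing \(h\) gives \[\frac{1}{h^2}\begin{bmatrix} -2 & 1 & 0 & \cdots & 1 \\ 1 & -2 & 1 & \cdots & 0 \\ 0 & \ddots & \ddots & \ddots & 0 \\ 0 & \cdots & 1 & -2 & 1 \\ 1 & \cdots & 0 & 1 & -2 \end{bmatrix},\] which is exactly this cyclic tridiagonal form with \(\alpha = \beta = 1/h^2\).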

However, thanks to the Sherman-Morrison formula, we are still in business!
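As a reminder, the Sherman-Morrison formula says that for an invertible matrix \(A\) and vectors \(u\) and \(v\), \[(A + uv^T)^{-1} = A^{-1} - \frac{A^{-1}uv^T A^{-1}}{1 + v^T A^{-1}u},\] so a rank-one update of \(A\) can be handled using only solves with \(A\) itself.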

Following the lead from Numerical Recipes, we can set \[u = \begin{bmatrix} \gamma \\ 0 \\ \vdots \\ 0 \\ \alpha \end{bmatrix}, \quad \quad v = \begin{bmatrix}1 \\ 0 \\ \vdots \\ 0 \\ \beta/\gamma \end{bmatrix},\] and modify the first and last diagonal entries of the tridiagonal part \(A\) of the cyclic tridiagonal matrix \(A'\): \(b_1^\prime = b_1 - \gamma\) and \(b_n^{\prime} = b_n - \alpha \beta/\gamma\).

Here \(\gamma\) is an arbitrary nonzero constant; a common default is \(\gamma = -b_1\).

Thus \(A' = A + uv^T\), where \(A\) now denotes the tridiagonal matrix carrying the modified entries \(b_1^\prime\) and \(b_n^\prime\).
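To see why, note that \(uv^T\) has only four nonzero entries, all sitting in the corners: \[uv^T = \begin{bmatrix} \gamma & 0 & \cdots & 0 & \beta \\ 0 & 0 & \cdots & 0 & 0 \\ \vdots & & \ddots & & \vdots \\ 0 & 0 & \cdots & 0 & 0 \\ \alpha & 0 & \cdots & 0 & \alpha\beta/\gamma \end{bmatrix},\] so adding it to the modified tridiagonal matrix restores \(b_1\) and \(b_n\) on the diagonal and puts \(\beta\) and \(\alpha\) back in the corners.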

To solve \(A' x = r\) (writing \(r\) for the right-hand side to avoid clashing with the diagonal entries \(b_i\)), we make two calls to the tridiagonal solver with the modified matrix, \(A y = r\) and \(A z = u\), and then combine the results via the Sherman-Morrison formula, \[x = y - \frac{v^T y}{1 + v^T z}\, z.\] Thus the asymptotic cost remains \(\mathcal{O}(n)\), but is roughly twice that of a standard tridiagonal solve.
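Below is a minimal Python sketch of the whole procedure. The function names (thomas_solve, cyclic_thomas_solve) are my own, and the Thomas solver is just the textbook forward sweep and back substitution, not necessarily the optimized version from the earlier post.

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b,
    super-diagonal c, and right-hand side d.  a[0] and c[-1] are unused."""
    n = len(d)
    cp = np.zeros(n)
    dp = np.zeros(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                      # forward sweep
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = np.zeros(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def cyclic_thomas_solve(a, b, c, alpha, beta, r):
    """Solve the cyclic tridiagonal system A' x = r, where alpha sits in the
    lower-left corner and beta in the upper-right, via Sherman-Morrison."""
    n = len(r)
    gamma = -b[0]                              # arbitrary nonzero (assumes b[0] != 0)
    bb = np.array(b, dtype=float)
    bb[0] -= gamma                             # b_1' = b_1 - gamma
    bb[-1] -= alpha * beta / gamma             # b_n' = b_n - alpha*beta/gamma
    u = np.zeros(n)
    u[0], u[-1] = gamma, alpha
    # two tridiagonal solves with the modified matrix
    y = thomas_solve(a, bb, c, r)
    z = thomas_solve(a, bb, c, u)
    # v = (1, 0, ..., 0, beta/gamma), so each dot product needs only two entries
    vy = y[0] + (beta / gamma) * y[-1]
    vz = z[0] + (beta / gamma) * z[-1]
    return y - (vy / (1.0 + vz)) * z
```

Note that since \(v\) has only two nonzero entries, the dot products \(v^T y\) and \(v^T z\) touch only the first and last components of each solution, so the Sherman-Morrison correction adds essentially no cost beyond the second tridiagonal solve.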

1 comment:

Unknown said...

How do you finish solving A'x = r? Is the inverse of A tridiagonal and easy to find, and do you use the Sherman-Morrison formula to find two matrices whose sum equals (A + uv^T) and then run the tridiagonal algorithm on them?