In a recent post, I mentioned how the upfront cost of an LU decomposition pays for itself when solving a family of linear systems that share the same matrix, \[Ax_i = b_i, \quad i = 1, 2, \ldots, N.\]
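As a rough sketch of that amortization (a toy example using SciPy's `lu_factor`/`lu_solve` with arbitrary random data), the \(\mathcal{O}(m^3)\) factorization is done once, and each right-hand side then costs only a pair of \(\mathcal{O}(m^2)\) triangular solves:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(0)
m, N = 500, 20
A = rng.standard_normal((m, m))
B = rng.standard_normal((m, N))        # columns are the right-hand sides b_i

lu, piv = lu_factor(A)                 # O(m^3) factorization, paid once
X = np.column_stack([lu_solve((lu, piv), B[:, i]) for i in range(N)])
# each lu_solve is only O(m^2): forward/back substitution against L and U

assert np.allclose(A @ X, B)
```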
Now suppose the (presumably large) matrix \(A\) is modified slightly to form a new matrix \(A'\). Assume we already have an LU factorization of the original, unmodified matrix, which lets us solve \(Ax=b\) very efficiently.
If we now want to solve \(A'x=b\), can we somehow avoid performing a brand-new LU factorization of the modified matrix \(A'\)?
Recall that the factorization itself doesn't come cheap.
The Sherman-Morrison formula (SMF) says "sure can!", as long as the modification can be expressed as an outer product of two vectors. That is, \[A' = A + uv^T.\] As an example, suppose the element \(a_{jk}\) in the \((j,k)\) position was changed to \(a'_{jk}\). This modification can be expressed as \[A' = A + (a'_{jk} - a_{jk}) \mathbf{e}_{j} \mathbf{e}_{k}^T,\] where \(\mathbf{e}_{j}\) is the vector with a "1" in row \(j\) and zeros everywhere else.
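In NumPy terms, the rank-one character of this single-entry change is easy to see (a small sketch; the size, indices, and new value below are arbitrary):

```python
import numpy as np

A = np.arange(16, dtype=float).reshape(4, 4)
j, k, new_value = 1, 3, 99.0

e_j = np.zeros(4); e_j[j] = 1.0        # "1" in row j, zeros elsewhere
e_k = np.zeros(4); e_k[k] = 1.0

A_prime = A + (new_value - A[j, k]) * np.outer(e_j, e_k)

changed = (A_prime != A)
assert A_prime[j, k] == new_value and changed.sum() == 1   # only (j,k) differs
```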
The Sherman-Morrison formula says \[(A')^{-1} = (A+uv^T)^{-1} = A^{-1} - {A^{-1}uv^T A^{-1} \over 1 + v^T A^{-1}u}.\] The formula can be readily verified, as shown
on Wikipedia.
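The check amounts to multiplying the claimed inverse by \(A + uv^T\) and noting that \(v^T A^{-1} u\) is a scalar: \[(A+uv^T)\left(A^{-1} - {A^{-1}uv^T A^{-1} \over 1 + v^T A^{-1}u}\right) = I + uv^T A^{-1} - {uv^T A^{-1} + u\,(v^T A^{-1}u)\,v^T A^{-1} \over 1 + v^T A^{-1}u} = I + uv^T A^{-1} - uv^T A^{-1} = I.\]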
Note: The scalar analog of the question the SMF seeks to address is: given \(1/a\), how can you find \(1/(a+b)\)?
The SMF answer would be: \[1/a \times \left(1- {b/a \over 1 + b/a} \right).\] Returning to the original problem, \(A'x = b \implies x = (A')^{-1} b\). Thus \[x = A^{-1}b - {A^{-1}uv^T A^{-1}b \over 1 + v^T A^{-1}u}.\] We can set \(y = A^{-1}b\) and \(z=A^{-1}u\). Since an LU factorization is available for \(A\), we can solve for \(y\) and \(z\) in \(\mathcal{O}(m^2)\) operations each, where \(m\) is the size of the square matrix \(A\).
This gives us \[x = y - \left({v^T y \over 1 + v^T z}\right) z.\] In general, this can be performed in \(\mathcal{O}(m^2)\) operations (roughly \(3m^2\)), as opposed to \(\mathcal{O}(m^3)\) for a fresh factorization.
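Putting the pieces together, here is a minimal sketch of the whole update (again leaning on SciPy's LU routines, with random \(A\), \(u\), \(v\), and \(b\) standing in for a real problem):

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

rng = np.random.default_rng(1)
m = 400
A = rng.standard_normal((m, m))
u, v, b = (rng.standard_normal(m) for _ in range(3))

lu, piv = lu_factor(A)                 # assume this is already in hand

y = lu_solve((lu, piv), b)             # y = A^{-1} b,  O(m^2)
z = lu_solve((lu, piv), u)             # z = A^{-1} u,  O(m^2)
x = y - (v @ y / (1.0 + v @ z)) * z    # Sherman-Morrison update
# (the denominator 1 + v^T z must not be near zero, i.e. A' must be nonsingular)

A_prime = A + np.outer(u, v)
assert np.allclose(A_prime @ x, b)     # A'x = b solved without refactoring
```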