Binomial inverse theorem

In mathematics, the binomial inverse theorem expresses the inverse of the sum A + UBV in terms of A−1 and is useful for rewriting matrix inverses in different ways.

If A, U, B, V are matrices of sizes p×p, p×q, q×q, q×p, respectively, then


\left(\mathbf{A}+\mathbf{UBV}\right)^{-1}=
\mathbf{A}^{-1} - \mathbf{A}^{-1}\mathbf{UB}\left(\mathbf{B}+\mathbf{BVA}^{-1}\mathbf{UB}\right)^{-1}\mathbf{BVA}^{-1}

provided A and B + BVA−1UB are nonsingular. Nonsingularity of the latter requires that B−1 exist, since it equals B(I + VA−1UB) and the rank of that product cannot exceed the rank of B.[1]

Since B is invertible, the two B factors flanking the inverted quantity on the right-hand side can be written as (B−1)−1 and absorbed into the inverse, which results in


\left(\mathbf{A}+\mathbf{UBV}\right)^{-1}=
\mathbf{A}^{-1} - \mathbf{A}^{-1}\mathbf{U}\left(\mathbf{B}^{-1}+\mathbf{VA}^{-1}\mathbf{U}\right)^{-1}\mathbf{VA}^{-1}.

This is the Woodbury matrix identity, which can also be derived using matrix blockwise inversion.
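Both forms can be spot-checked numerically. The following NumPy sketch is illustrative only: the sizes p = 4, q = 2, the random matrices, and the diagonal shifts used to keep A and B comfortably nonsingular are assumptions made for the example, not part of the theorem.

import numpy as np

rng = np.random.default_rng(0)
p, q = 4, 2                                       # illustrative sizes
A = rng.standard_normal((p, p)) + p * np.eye(p)   # shifted to stay well away from singular
U = rng.standard_normal((p, q))
B = rng.standard_normal((q, q)) + q * np.eye(q)   # nonsingular q x q
V = rng.standard_normal((q, p))

Ainv = np.linalg.inv(A)
lhs = np.linalg.inv(A + U @ B @ V)

# Form with B + B V A^{-1} U B
inner = B + B @ V @ Ainv @ U @ B
rhs1 = Ainv - Ainv @ U @ B @ np.linalg.inv(inner) @ B @ V @ Ainv

# Woodbury form with B^{-1} + V A^{-1} U
rhs2 = Ainv - Ainv @ U @ np.linalg.inv(np.linalg.inv(B) + V @ Ainv @ U) @ V @ Ainv

print(np.allclose(lhs, rhs1), np.allclose(lhs, rhs2))   # True True, up to round-off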

A more general formula exists when B is singular and possibly even non-square:[1]

(\mathbf{A+UBV})^{-1}=\mathbf{A}^{-1}-\mathbf{A}^{-1}\mathbf{U}(\mathbf{I+BVA}^{-1}\mathbf{U})^{-1}\mathbf{BVA}^{-1}.
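A similar sketch illustrates this more general form with a non-square B (here a hypothetical 2×3 matrix, so B−1 does not exist and the earlier forms do not apply); the sizes and random data are again assumptions for the example only.

import numpy as np

rng = np.random.default_rng(1)
p, q, r = 4, 2, 3                                 # U is p x q, B is q x r, V is r x p
A = rng.standard_normal((p, p)) + p * np.eye(p)
U = rng.standard_normal((p, q))
B = rng.standard_normal((q, r))                   # non-square, so B has no inverse
V = rng.standard_normal((r, p))

Ainv = np.linalg.inv(A)
lhs = np.linalg.inv(A + U @ B @ V)
rhs = Ainv - Ainv @ U @ np.linalg.inv(np.eye(q) + B @ V @ Ainv @ U) @ B @ V @ Ainv
print(np.allclose(lhs, rhs))                      # True, up to round-off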

Formulas also exist for certain cases in which A is singular.[2]

Verification

First notice that

\left(\mathbf{A} + \mathbf{UBV}\right) \mathbf{A}^{-1}\mathbf{UB} = \mathbf{UB} + \mathbf{UBVA}^{-1}\mathbf{UB} = \mathbf{U} \left(\mathbf{B} + \mathbf{BVA}^{-1}\mathbf{UB}\right).

Now multiply the matrix we wish to invert by its alleged inverse:

\left(\mathbf{A} + \mathbf{UBV}\right) \left( \mathbf{A}^{-1} - \mathbf{A}^{-1}\mathbf{UB}\left(\mathbf{B} + \mathbf{BVA}^{-1}\mathbf{UB}\right)^{-1}\mathbf{BVA}^{-1} \right)
= \mathbf{I}_p + \mathbf{UBVA}^{-1} - \mathbf{U} \left(\mathbf{B} + \mathbf{BVA}^{-1}\mathbf{UB}\right) \left(\mathbf{B} + \mathbf{BVA}^{-1}\mathbf{UB}\right)^{-1}\mathbf{BVA}^{-1}
= \mathbf{I}_p + \mathbf{UBVA}^{-1} - \mathbf{UBVA}^{-1} = \mathbf{I}_p

which verifies that it is the inverse.

Thus, if A−1 and (B + BVA−1UB)−1 exist, then (A + UBV)−1 exists and is given by the theorem above.[3]
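The two steps of this verification can also be replayed numerically. In the sketch below, the sizes p = 5, q = 3 and the random, diagonally shifted matrices are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(2)
p, q = 5, 3
A = rng.standard_normal((p, p)) + p * np.eye(p)
U = rng.standard_normal((p, q))
B = rng.standard_normal((q, q)) + q * np.eye(q)
V = rng.standard_normal((q, p))

Ainv = np.linalg.inv(A)
inner = B + B @ V @ Ainv @ U @ B

# Step 1: (A + UBV) A^{-1} U B equals U (B + B V A^{-1} U B)
print(np.allclose((A + U @ B @ V) @ Ainv @ U @ B, U @ inner))

# Step 2: multiplying A + UBV by the claimed inverse yields the p x p identity
claimed = Ainv - Ainv @ U @ B @ np.linalg.inv(inner) @ B @ V @ Ainv
print(np.allclose((A + U @ B @ V) @ claimed, np.eye(p)))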

Special cases

First

If p = q and U = V = Ip (the p×p identity matrix), then


\left(\mathbf{A}+\mathbf{B}\right)^{-1} = \mathbf{A}^{-1} - \mathbf{A}^{-1}\mathbf{B}\left(\mathbf{B}+\mathbf{BA}^{-1}\mathbf{B}\right)^{-1}\mathbf{BA}^{-1}.

Remembering the identity


\left(\mathbf{C} \mathbf{D}\right)^{-1} = \mathbf{D}^{-1} \mathbf{C}^{-1} ,

we can also express the previous equation more simply as


\left(\mathbf{A}+\mathbf{B}\right)^{-1} = \mathbf{A}^{-1} - \mathbf{A}^{-1}\left(\mathbf{I}+\mathbf{B}\mathbf{A}^{-1}\right)^{-1}\mathbf{B}\mathbf{A}^{-1}.

Merging the terms on the far right-hand side of the above equation further results in Hua's identity


\left(\mathbf{A}+\mathbf{B}\right)^{-1} = \mathbf{A}^{-1} - \left(\mathbf{A}+\mathbf{A}\mathbf{B}^{-1}\mathbf{A}\right)^{-1}.
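The chain of equalities above can be illustrated numerically. In the following sketch, the size n = 3 and the random, diagonally shifted matrices (chosen so that A, B and A + B are safely invertible) are assumptions for the example.

import numpy as np

rng = np.random.default_rng(3)
n = 3
A = rng.standard_normal((n, n)) + n * np.eye(n)
B = rng.standard_normal((n, n)) + n * np.eye(n)
Ainv, Binv = np.linalg.inv(A), np.linalg.inv(B)

lhs = np.linalg.inv(A + B)
form1 = Ainv - Ainv @ B @ np.linalg.inv(B + B @ Ainv @ B) @ B @ Ainv   # first special case
form2 = Ainv - Ainv @ np.linalg.inv(np.eye(n) + B @ Ainv) @ B @ Ainv   # simplified form
hua = Ainv - np.linalg.inv(A + A @ Binv @ A)                           # Hua's identity
print(np.allclose(lhs, form1), np.allclose(lhs, form2), np.allclose(lhs, hua))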


Second

If B = Iq is the identity matrix and q = 1, then U is a column vector, written u, and V is a row vector, written vT. In this case the theorem implies the Sherman-Morrison formula:


\left(\mathbf{A}+\mathbf{uv}^\mathrm{T}\right)^{-1} = \mathbf{A}^{-1}- \frac{\mathbf{A}^{-1}\mathbf{uv}^\mathrm{T}\mathbf{A}^{-1}}{1+\mathbf{v}^\mathrm{T}\mathbf{A}^{-1}\mathbf{u}}.

This is useful if one has a matrix A with a known inverse A−1 and one needs to invert matrices of the form A+uvT quickly for various u and v.
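A minimal sketch of that use case follows: A−1 is computed once, and each rank-one update A + uvT is then inverted with only matrix-vector products and an outer product (roughly O(n^2) work instead of O(n^3)). The helper name sherman_morrison, the size n = 5 and the random data are illustrative assumptions, not a fixed API.

import numpy as np

rng = np.random.default_rng(4)
n = 5
A = rng.standard_normal((n, n)) + n * np.eye(n)
Ainv = np.linalg.inv(A)                           # computed once, then reused

def sherman_morrison(Ainv, u, v):
    """Inverse of A + u v^T given Ainv, via the rank-one update formula."""
    Au = Ainv @ u                                 # A^{-1} u
    vA = v @ Ainv                                 # v^T A^{-1}
    return Ainv - np.outer(Au, vA) / (1.0 + v @ Au)

u = rng.standard_normal(n)
v = rng.standard_normal(n)
print(np.allclose(sherman_morrison(Ainv, u, v), np.linalg.inv(A + np.outer(u, v))))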

Third

If we set A = Ip and B = Iq, we get

\left(\mathbf{I}_p + \mathbf{UV}\right)^{-1} = \mathbf{I}_p - \mathbf{U}\left(\mathbf{I}_q + \mathbf{VU}\right)^{-1}\mathbf{V}.

In particular, if q = 1, then

\left(\mathbf{I}+\mathbf{uv}^\mathrm{T}\right)^{-1} = \mathbf{I} - \frac{\mathbf{uv}^\mathrm{T}}{1+\mathbf{v}^\mathrm{T}\mathbf{u}},

which is a particular case of the Sherman-Morrison formula given above.
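This form is attractive when q is much smaller than p, since only a q×q matrix has to be inverted. A brief numerical check is sketched below; the sizes p = 6, q = 2 and the random matrices are assumptions for the example.

import numpy as np

rng = np.random.default_rng(5)
p, q = 6, 2
U = rng.standard_normal((p, q))
V = rng.standard_normal((q, p))

lhs = np.linalg.inv(np.eye(p) + U @ V)                        # p x p inverse
rhs = np.eye(p) - U @ np.linalg.inv(np.eye(q) + V @ U) @ V    # only a q x q inverse needed
print(np.allclose(lhs, rhs))                                  # True, up to round-off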

References

  1. Henderson, H. V., and Searle, S. R. (1981), "On deriving the inverse of a sum of matrices", SIAM Review 23, pp. 53–60.
  2. Riedel, Kurt S. (1992), "A Sherman–Morrison–Woodbury Identity for Rank Augmenting Matrices with Application to Centering", SIAM Journal on Matrix Analysis and Applications 13, pp. 659–662, doi:10.1137/0613040, MR 1152773.
  3. Strang, Gilbert (2003), Introduction to Linear Algebra (3rd ed.), Wellesley-Cambridge Press, Wellesley, MA. ISBN 0-9614088-9-8.