Hilbert–Schmidt theorem

In mathematical analysis, the Hilbert–Schmidt theorem, also known as the eigenfunction expansion theorem, is a fundamental result concerning compact, self-adjoint operators on Hilbert spaces. In the theory of partial differential equations, it is very useful in solving elliptic boundary value problems.

Statement of the theorem

Let (H, ⟨·, ·⟩) be a real or complex Hilbert space and let A : H → H be a bounded, compact, self-adjoint operator. Then there is a sequence of non-zero real eigenvalues λi, i = 1, ..., N, with N equal to the rank of A, such that |λi| is monotonically non-increasing and, if N = +∞,

\lim_{i \to + \infty} \lambda_{i} = 0.
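For example, on the sequence space ℓ2 the diagonal operator defined by A ei = ei/i, where (ei) is the standard orthonormal basis, is compact and self-adjoint; its eigenvalues λi = 1/i are non-increasing in absolute value and tend to 0, as the theorem requires.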

Furthermore, if each eigenvalue of A is repeated in the sequence according to its multiplicity, then there exists an orthonormal set φi, i = 1, ..., N, of corresponding eigenfunctions, i.e.

A \varphi_{i} = \lambda_{i} \varphi_{i} \mbox{ for } i = 1, \dots, N.

Moreover, the functions φi form an orthonormal basis for the closure of the range of A, and A can be written as

A u = \sum_{i = 1}^{N} \lambda_{i} \langle \varphi_{i}, u \rangle \varphi_{i} \mbox{ for all } u \in H.
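In finite dimensions the theorem reduces to the spectral theorem for symmetric (or Hermitian) matrices, since every matrix is a compact operator of finite rank. The sketch below is a minimal numerical illustration using NumPy; the matrix and test vector are arbitrary choices, not taken from the text. It sorts the eigenvalues by decreasing |λi| and checks the expansion A u = Σ λi ⟨φi, u⟩ φi:

    import numpy as np

    # A real symmetric matrix is a compact, self-adjoint operator on R^n,
    # so the Hilbert-Schmidt theorem applies with N = rank(A).
    # (Illustrative random choice of A; any symmetric matrix works.)
    rng = np.random.default_rng(0)
    B = rng.standard_normal((5, 5))
    A = (B + B.T) / 2                    # symmetrize so that A = A^T

    # eigh returns real eigenvalues and orthonormal eigenvector columns.
    eigvals, eigvecs = np.linalg.eigh(A)

    # Reorder so that |lambda_1| >= |lambda_2| >= ..., as in the theorem.
    order = np.argsort(-np.abs(eigvals))
    lam = eigvals[order]
    phi = eigvecs[:, order]              # phi[:, i] is the i-th eigenvector

    # Check A phi_i = lambda_i phi_i for every i.
    assert np.allclose(A @ phi, phi * lam)

    # Check the expansion A u = sum_i lambda_i <phi_i, u> phi_i.
    u = rng.standard_normal(5)
    expansion = sum(lam[i] * (phi[:, i] @ u) * phi[:, i] for i in range(5))
    assert np.allclose(A @ u, expansion)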
