Method of moments (probability theory)

This article is about the method of moments in probability theory. See method of moments (disambiguation) for other techniques bearing the same name.

In probability theory, the method of moments is a way of proving convergence in distribution by proving that the moments converge to those of the limiting distribution.[1] Suppose X is a random variable and that all of the moments

\operatorname{E}\left(X^{k}\right)

exist. Further suppose that the probability distribution of X is completely determined by its moments, i.e., that no other probability distribution has the same sequence of moments (cf. the problem of moments). If X_1, X_2, X_3, … is a sequence of random variables such that

\lim_{n\to\infty}\operatorname{E}\left(X_{n}^{k}\right)=\operatorname{E}\left(X^{k}\right)

for all values of k, then the sequence {X_n} converges to X in distribution.
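
As an illustration (not part of the original article, with illustrative function names and parameter values), the following Python sketch applies the criterion in the de Moivre–Laplace setting: X_n is a standardized Binomial(n, p) variable, its k-th moment is computed exactly from the binomial probability mass function, and the result is compared with the k-th moment of the standard normal distribution, which equals 0 for odd k and (k − 1)!! for even k. Because the standard normal distribution is determined by its moments, convergence of E(X_n^k) for every k implies convergence of X_n to the standard normal in distribution.

import math

def standardized_binomial_moment(n, p, k):
    # Exact k-th moment of (S_n - n*p) / sqrt(n*p*(1-p)) for S_n ~ Binomial(n, p),
    # computed by summing over the binomial probability mass function.
    mu = n * p
    sigma = math.sqrt(n * p * (1 - p))
    return sum(
        math.comb(n, s) * p ** s * (1 - p) ** (n - s) * ((s - mu) / sigma) ** k
        for s in range(n + 1)
    )

def standard_normal_moment(k):
    # k-th moment of N(0, 1): 0 for odd k, (k - 1)!! for even k.
    return 0.0 if k % 2 else float(math.prod(range(1, k, 2)))

p = 0.3  # illustrative choice of success probability
for k in range(1, 7):
    approximations = [round(standardized_binomial_moment(n, p, k), 3) for n in (10, 100, 1000)]
    print(f"k={k}: E(X_n^k) for n=10,100,1000 ->", approximations,
          "; normal moment =", standard_normal_moment(k))

For n = 1000 the printed moments are already close to 0, 1, 0, 3, 0, 15, the first six moments of the standard normal distribution.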

The method of moments was introduced by Pafnuty Chebyshev for proving the central limit theorem; Chebyshev cited earlier contributions by Irénée-Jules Bienaymé.[2] More recently, it has been applied by Eugene Wigner to prove Wigner's semicircle law, and has since found numerous applications in the theory of random matrices.[3]

Notes

  1. Prokhorov, A.V. "Moments, method of (in probability theory)". In Hazewinkel, M. (ed.). Encyclopaedia of Mathematics (online). ISBN 1-4020-0609-8. MR 1375697.
  2. Fischer, H. (2011). "4. Chebyshev's and Markov's Contributions". A History of the Central Limit Theorem: From Classical to Modern Probability Theory. Sources and Studies in the History of Mathematics and Physical Sciences. New York: Springer. ISBN 978-0-387-87856-0. MR 2743162.
  3. Anderson, G.W.; Guionnet, A.; Zeitouni, O. (2010). "2.1". An Introduction to Random Matrices. Cambridge: Cambridge University Press. ISBN 978-0-521-19452-5.