Precision (statistics)

In statistics, the dual term variability is often used in preference to precision: variability is the amount of imprecision, so greater precision corresponds to lower variability.

There can be differences in usage of the term for particular statistical models but, in common statistical usage, the precision is defined to be the reciprocal of the variance, while the precision matrix is the matrix inverse of the covariance matrix.[1]
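
As an illustrative sketch (not taken from the cited sources), the following Python snippet computes the two quantities just defined using NumPy; the simulated data and the variable names are assumptions made here purely for concreteness.

    import numpy as np

    # Simulated data: 500 observations of 3 correlated variables (assumed example).
    rng = np.random.default_rng(0)
    data = rng.multivariate_normal(
        mean=[0.0, 0.0, 0.0],
        cov=[[2.0, 0.6, 0.3],
             [0.6, 1.0, 0.2],
             [0.3, 0.2, 1.5]],
        size=500,
    )

    # Precision of one variable: the reciprocal of its (sample) variance.
    variance = data[:, 0].var(ddof=1)
    precision = 1.0 / variance

    # Precision matrix: the matrix inverse of the (sample) covariance matrix.
    cov_matrix = np.cov(data, rowvar=False)
    precision_matrix = np.linalg.inv(cov_matrix)

    print(precision)
    print(precision_matrix)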

One particular use of the precision matrix is in the context of Bayesian analysis of the multivariate normal distribution: for example, Bernardo & Smith[2] prefer to parameterise the multivariate normal distribution in terms of the precision matrix rather than the covariance matrix because of certain simplifications that then arise.
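
Concretely, writing Λ = Σ⁻¹ for the precision matrix (notation chosen here for illustration; it is not necessarily that of Bernardo & Smith), the density of a k-dimensional multivariate normal distribution with mean μ becomes

    p(x \mid \mu, \Lambda) = (2\pi)^{-k/2}\, |\Lambda|^{1/2} \exp\!\left( -\tfrac{1}{2}(x-\mu)^{\mathsf{T}} \Lambda (x-\mu) \right) ,

so the normalising constant and the quadratic form involve Λ directly, without inverting Σ; this is one of the simplifications referred to above.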

History

The term precision in this sense (“mensura praecisionis observationum”) first appeared in the works of Gauss (1809) “Theoria motus corporum coelestium in sectionibus conicis solem ambientium” (page 212). Gauss’s definition differs from the modern one by a factor of √2. He writes, for the density function of a normal random variable with precision h,


    \varphi(\Delta) = \frac{h}{\sqrt{\pi}}\, e^{-h^2 \Delta^2} .
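
Equating this with the modern form of the normal density (a comparison added here for clarity; it does not appear in Gauss) shows where the factor of √2 comes from:

    \frac{1}{\sigma\sqrt{2\pi}}\, e^{-\Delta^2/(2\sigma^2)} = \frac{h}{\sqrt{\pi}}\, e^{-h^2 \Delta^2} \quad\Longrightarrow\quad h = \frac{1}{\sigma\sqrt{2}} ,

so Gauss's precision h corresponds to 1/(σ√2) when expressed in terms of the standard deviation σ.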

Later, Whittaker & Robinson (1924), in “Calculus of observations”, called this quantity the modulus, but the term has since dropped out of use.[3]

References

  1. Dodge, Y. (2003) The Oxford Dictionary of Statistical Terms, OUP. ISBN 0-19-920613-9
  2. Bernardo, J. M. & Smith, A.F.M. (2000) Bayesian Theory, Wiley ISBN 0-471-49464-X
  3. "Earliest known uses of some of the words in mathematics".