Gauss's inequality

In probability theory, Gauss's inequality (or the Gauss inequality) gives an upper bound on the probability that a unimodal random variable lies more than any given distance from its mode.

Let X be a unimodal random variable with mode m, and let τ² be the expected value of (X − m)². (τ² can also be expressed as (μ − m)² + σ², where μ and σ are the mean and standard deviation of X.) Then for any positive value of k,


\[
\Pr(|X - m| > k) \leq \begin{cases}
\left( \dfrac{2\tau}{3k} \right)^2 & \text{if } k \geq \dfrac{2\tau}{\sqrt{3}} \\[6pt]
1 - \dfrac{k}{\tau\sqrt{3}}        & \text{if } 0 \leq k \leq \dfrac{2\tau}{\sqrt{3}}.
\end{cases}
\]
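
The bound can be illustrated numerically. The sketch below (not part of the original article) assumes Python with NumPy and uses an Exponential(1) variable, which is unimodal with mode m = 0 and τ² = E[(X − m)²] = 2; it compares the empirical tail probability against the two-case bound above.

```python
# Minimal numerical check of Gauss's inequality (illustrative sketch).
# Assumption: X ~ Exponential(1), which is unimodal with mode m = 0.
import numpy as np

rng = np.random.default_rng(0)
x = rng.exponential(scale=1.0, size=1_000_000)

m = 0.0                                   # mode of Exponential(1)
tau = np.sqrt(np.mean((x - m) ** 2))      # estimate of tau from the sample

def gauss_bound(k, tau):
    """Upper bound on Pr(|X - m| > k) from Gauss's inequality."""
    if k >= 2 * tau / np.sqrt(3):
        return (2 * tau / (3 * k)) ** 2
    return 1 - k / (tau * np.sqrt(3))

for k in [0.5, 1.0, 1.7, 2.0, 3.0]:
    tail = np.mean(np.abs(x - m) > k)     # empirical tail probability
    print(f"k = {k:4.2f}  Pr(|X - m| > k) ~ {tail:.4f}  bound = {gauss_bound(k, tau):.4f}")
```

In every row the empirical tail probability should fall below the bound, with the first branch of the bound applying once k exceeds 2τ/√3 ≈ 1.63 for this example.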

The theorem was first proved by Carl Friedrich Gauss in 1823.

