Multidimensional Chebyshev's inequality

In probability theory, the multidimensional Chebyshev's inequality is a generalization of Chebyshev's inequality, which bounds the probability that a random variable differs from its expected value by more than a specified amount. In the multidimensional version, the deviation of a random vector from its mean is measured with the Mahalanobis distance associated with the covariance matrix.

Let X be an N-dimensional random vector with expected value \mu=\mathbb{E} \left[ X \right] and covariance matrix

V=\mathbb{E} \left[ \left(X - \mu \right) \left( X - \mu \right)^T \right].

If V is a positive-definite matrix, then for any real number t > 0:

\mathrm{Pr}\left( \sqrt{\left( X-\mu\right)^T \, V^{-1} \, \left( X-\mu\right) } > t \right) \le \frac{N}{t^2} .
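The bound is easy to check numerically. The following sketch estimates the left-hand side by Monte Carlo and compares it with N/t^2; it assumes NumPy, and the Gaussian sampling distribution and the particular covariance matrix are arbitrary illustrative choices (the bound holds for any distribution with the given mean and covariance).

import numpy as np

rng = np.random.default_rng(0)

N = 3                                   # dimension of the random vector
mu = np.zeros(N)
V = np.array([[2.0, 0.5, 0.0],
              [0.5, 1.0, 0.3],
              [0.0, 0.3, 1.5]])         # an arbitrary positive-definite covariance
V_inv = np.linalg.inv(V)

# Draw samples; the Gaussian is an arbitrary choice -- only the mean and
# covariance matter for the bound.
X = rng.multivariate_normal(mu, V, size=100_000)
d = X - mu

# Mahalanobis distance sqrt((X - mu)^T V^{-1} (X - mu)) of each sample.
maha = np.sqrt(np.einsum('ij,jk,ik->i', d, V_inv, d))

for t in (1.5, 2.0, 3.0):
    print(f"t = {t}: empirical Pr = {np.mean(maha > t):.4f}, "
          f"bound N/t^2 = {N / t**2:.4f}")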

Proof

Since V is positive-definite, so is V^{-1}. Define the random variable

y = \left( X-\mu\right)^T \, V^{-1} \, \left( X-\mu\right) .

Since y is nonnegative, Markov's inequality applies:

\begin{array}{lll}\mathrm{Pr}\left( \sqrt{\left( X-\mu\right)^T \, V^{-1} \, \left( X-\mu\right) } > t\right) &= \mathrm{Pr}\left( \sqrt{y} > t\right)\\
&=\mathrm{Pr}\left( y > t^2 \right) \\
&\le \frac{\mathbb{E}[y]}{t^2} .\end{array}
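For reference, the last step is Markov's inequality for a nonnegative random variable: for any a > 0,

\mathrm{Pr}\left( y \ge a \right) \le \frac{\mathbb{E}[y]}{a} ,

applied here with a = t^2 (the strict inequality y > t^2 describes a smaller event, so the bound still holds).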

Finally,

\begin{array}{lll}\mathbb{E}[y] &= \mathbb{E}[\left( X-\mu\right)^T \, V^{-1} \, \left( X-\mu\right)]\\
&=\mathbb{E}[ \mathrm{trace} (  V^{-1} \, \left( X-\mu\right) \, \left( X-\mu\right)^T )]\\
&= \mathrm{trace} (  V^{-1} \, \mathbb{E}[ \left( X-\mu\right) \, \left( X-\mu\right)^T ] )\\
&= \mathrm{trace} (  V^{-1} V ) = \mathrm{trace} ( I_N ) = N .\end{array}

Here the second equality holds because y is a scalar (hence equal to its own trace) and the trace is invariant under cyclic permutations; the third exchanges the expectation with the linear trace operator. Substituting \mathbb{E}[y] = N into the Markov bound gives the stated inequality.
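For N = 1 this recovers the classical Chebyshev inequality: V reduces to the variance \sigma^2, the quadratic form becomes \left( X-\mu \right)^2 / \sigma^2, and the statement reads

\mathrm{Pr}\left( \left| X-\mu \right| > t \sigma \right) \le \frac{1}{t^2} .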