Rayleigh distribution

Not to be confused with Rayleigh mixture distribution.
Rayleigh
[Plot: probability density function]
[Plot: cumulative distribution function]
Parameters scale: \sigma>0\,
Support x\in [0,+\infty)
PDF \frac{x}{\sigma^2} e^{-x^2/2\sigma^2}
CDF 1 - e^{-x^2/2\sigma^2}
Quantile Q(F;\sigma)=\sigma \sqrt{-\ln[(1 - F)^2]}
Mean \sigma \sqrt{\frac{\pi}{2}}
Median \sigma\sqrt{2\ln(2)}\,
Mode \sigma\,
Variance \frac{4 - \pi}{2} \sigma^2
Skewness \frac{2\sqrt{\pi}(\pi - 3)}{(4-\pi)^{3/2}}
Ex. kurtosis -\frac{6\pi^2 - 24\pi +16}{(4-\pi)^2}
Entropy 1+\ln\left(\frac{\sigma}{\sqrt{2}}\right)+\frac{\gamma}{2}
MGF 1+\sigma t\,e^{\sigma^2t^2/2}\sqrt{\frac{\pi}{2}}\left(\textrm{erf}\left(\frac{\sigma t}{\sqrt{2}}\right)+1\right)
CF 1\!-\!\sigma te^{-\sigma^2t^2/2}\sqrt{\frac{\pi}{2}}\!\left(\textrm{erfi}\!\left(\frac{\sigma t}{\sqrt{2}}\right)\!-\!i\right)

In probability theory and statistics, the Rayleigh distribution /ˈreɪli/ is a continuous probability distribution for positive-valued random variables.

A Rayleigh distribution is often observed when the overall magnitude of a vector is related to its directional components. One example where the Rayleigh distribution naturally arises is when wind velocity is resolved into its two orthogonal vector components. If the components are uncorrelated, normally distributed with equal variance, and have zero mean, then the overall wind speed (vector magnitude) follows a Rayleigh distribution. A second example arises with random complex numbers whose real and imaginary components are independently and identically Gaussian-distributed with equal variance and zero mean; in that case, the absolute value of the complex number is Rayleigh-distributed.

The distribution is named after Lord Rayleigh.[1]

Definition

The probability density function of the Rayleigh distribution is[2]

f(x;\sigma) = \frac{x}{\sigma^2} e^{-x^2/(2\sigma^2)}, \quad x \geq 0,

where \sigma is the scale parameter of the distribution. The cumulative distribution function is[2]

F(x;\sigma) = 1 - e^{-x^2/(2\sigma^2)}

for x \in [0,\infty).
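
For a quick numerical check of these formulas, the density and distribution function can be evaluated directly. The following is a minimal sketch using NumPy; the parameter value and evaluation points are arbitrary choices for illustration.

    import numpy as np

    def rayleigh_pdf(x, sigma):
        # f(x; sigma) = (x / sigma^2) * exp(-x^2 / (2 sigma^2)) for x >= 0
        x = np.asarray(x, dtype=float)
        return np.where(x >= 0, (x / sigma**2) * np.exp(-x**2 / (2 * sigma**2)), 0.0)

    def rayleigh_cdf(x, sigma):
        # F(x; sigma) = 1 - exp(-x^2 / (2 sigma^2)) for x >= 0
        x = np.asarray(x, dtype=float)
        return np.where(x >= 0, 1.0 - np.exp(-x**2 / (2 * sigma**2)), 0.0)

    sigma = 2.0                         # arbitrary example value
    x = np.linspace(0.0, 10.0, 5)
    print(rayleigh_pdf(x, sigma))
    print(rayleigh_cdf(x, sigma))

These values agree with scipy.stats.rayleigh(scale=sigma).pdf(x) and .cdf(x), since SciPy's scale parameter corresponds to \sigma here.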

Relation to random vector lengths

Consider the two-dimensional vector Y = (U,V), whose components are Gaussian-distributed, centered at zero, and independent. Then f_U(u; \sigma) = \frac{e^{-u^2/2\sigma^2}}{\sqrt{2\pi\sigma^2}}, and similarly for f_V(v; \sigma).

Let  x be the length of  Y . It is distributed as

f(x; \sigma) =  \frac{1}{2\pi\sigma^2} \int_{-\infty}^\infty du \, \int_{-\infty}^\infty dv \, e^{-u^2/2\sigma^2} e^{-v^2/2\sigma^2} \delta(x-\sqrt{u^2+v^2}).

By transforming to the polar coordinate system one has

 f(x; \sigma) = \frac{1}{2\pi\sigma^2} \int_0^{2\pi} \, d\phi \int_0^\infty dr \, \delta(r-x) r e^{-r^2/2\sigma^2}= \frac{x}{\sigma^2} e^{-x^2/2\sigma^2},

which is the Rayleigh distribution. It is straightforward to generalize to vectors of dimension other than 2. There are also generalizations when the components have unequal variance or correlations.
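
The relation can also be checked by simulation: draw independent zero-mean Gaussian components and compare the empirical distribution of the vector magnitude with the Rayleigh CDF. A minimal sketch using NumPy (the value of sigma, the sample size, and the test points are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    sigma, n = 1.5, 200_000             # arbitrary example values

    # Independent zero-mean Gaussian components with equal variance sigma^2.
    u = rng.normal(0.0, sigma, n)
    v = rng.normal(0.0, sigma, n)
    r = np.hypot(u, v)                  # vector magnitude sqrt(u^2 + v^2)

    # Compare the empirical CDF with 1 - exp(-x^2 / (2 sigma^2)) at a few points.
    for x in (0.5, 1.5, 3.0):
        empirical = np.mean(r <= x)
        exact = 1.0 - np.exp(-x**2 / (2 * sigma**2))
        print(f"x={x}: empirical {empirical:.4f} vs Rayleigh CDF {exact:.4f}")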

Properties

The raw moments are given by:

\mu_k = \sigma^k2^\frac{k}{2}\,\Gamma\left(1 + \frac{k}{2}\right)

where \Gamma(z) is the Gamma function.

The mean and variance of a Rayleigh random variable may be expressed as:

\mu(X) = \sigma \sqrt{\frac{\pi}{2}}\ \approx 1.253 \sigma

and

\textrm{var}(X) = \frac{4 - \pi}{2} \sigma^2 \approx 0.429 \sigma^2

The mode is \sigma, and the maximum value of the pdf is

 f_\text{max} = f(\sigma;\sigma) = \frac{1}{\sigma} e^{-\frac{1}{2}} \approx \frac{0.606}{\sigma}

The skewness is given by:

\gamma_1 = \frac{2\sqrt{\pi}(\pi - 3)}{(4 - \pi)^\frac{3}{2}} \approx 0.631

The excess kurtosis is given by:

\gamma_2 = -\frac{6\pi^2 - 24\pi + 16}{(4 - \pi)^2} \approx 0.245

The characteristic function is given by:

\varphi(t) = 1 - \sigma te^{-\frac{1}{2}\sigma^2t^2}\sqrt{\frac{\pi}{2}} \left[\textrm{erfi} \left(\frac{\sigma t}{\sqrt{2}}\right) - i\right]

where \operatorname{erfi}(z) is the imaginary error function. The moment generating function is given by


  M(t) = 1 + \sigma t\,e^{\frac{1}{2}\sigma^2t^2}\sqrt{\frac{\pi}{2}}
           \left[\textrm{erf}\left(\frac{\sigma t}{\sqrt{2}}\right) + 1\right]

where \operatorname{erf}(z) is the error function.
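
The closed-form moments above can be cross-checked numerically. A minimal sketch using SciPy (the value of sigma is arbitrary; scipy.stats.rayleigh uses scale = \sigma, and the kurtosis it reports is the excess kurtosis):

    import numpy as np
    from scipy import stats

    sigma = 2.0                                   # arbitrary example value
    mean, var, skew, kurt = stats.rayleigh.stats(scale=sigma, moments='mvsk')

    print(mean, sigma * np.sqrt(np.pi / 2))                              # ~2.5066
    print(var, (4 - np.pi) / 2 * sigma**2)                               # ~1.7168
    print(skew, 2 * np.sqrt(np.pi) * (np.pi - 3) / (4 - np.pi)**1.5)     # ~0.6311
    print(kurt, -(6 * np.pi**2 - 24 * np.pi + 16) / (4 - np.pi)**2)      # ~0.2451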

Differential entropy

The differential entropy is given by

H = 1 + \ln\left(\frac{\sigma}{\sqrt{2}}\right) + \frac{\gamma}{2}

where \gamma is the Euler–Mascheroni constant.
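
The closed form can be verified by integrating -f \ln f numerically. A minimal sketch using SciPy (the value of sigma is arbitrary; the lower limit 1e-12 simply avoids evaluating \ln x at zero):

    import numpy as np
    from scipy.integrate import quad

    sigma = 1.3                                   # arbitrary example value

    def neg_f_log_f(x):
        # log f(x) written out analytically so it stays finite in the tails
        log_f = np.log(x) - 2 * np.log(sigma) - x**2 / (2 * sigma**2)
        return -np.exp(log_f) * log_f

    h_numeric, _ = quad(neg_f_log_f, 1e-12, np.inf)
    h_closed = 1 + np.log(sigma / np.sqrt(2)) + np.euler_gamma / 2

    print(h_numeric, h_closed)                    # both ~1.2044 for sigma = 1.3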

Differential equation

The pdf of the Rayleigh distribution is a solution of the following differential equation:

\left\{\begin{array}{l}
\sigma^2 x f'(x)+f(x) \left(x^2-\sigma^2\right)=0 \\[10pt]
f(1)=\frac{\exp\left(-\frac{1}{2 \sigma^2}\right)}{\sigma^2}
\end{array}\right\}
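
This can be confirmed symbolically. A minimal sketch using SymPy (assuming SymPy is available; both expressions should simplify to zero):

    import sympy as sp

    x, sigma = sp.symbols('x sigma', positive=True)
    f = x / sigma**2 * sp.exp(-x**2 / (2 * sigma**2))   # Rayleigh pdf

    # Left-hand side of the differential equation; should simplify to 0.
    lhs = sigma**2 * x * sp.diff(f, x) + f * (x**2 - sigma**2)
    print(sp.simplify(lhs))

    # Initial condition f(1) = exp(-1/(2 sigma^2)) / sigma^2; difference should be 0.
    print(sp.simplify(f.subs(x, 1) - sp.exp(-1 / (2 * sigma**2)) / sigma**2))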

Parameter estimation

Given a sample of N independent and identically distributed Rayleigh random variables x_i with parameter \sigma,

\widehat{\sigma^2} = \frac{1}{2N}\sum_{i=1}^N x_i^2 is the maximum likelihood estimate of \sigma^2, and it is unbiased.
\hat{\sigma} = \sqrt{\frac{1}{2N}\sum_{i=1}^N x_i^2} is a biased estimator of \sigma that can be corrected via the formula
\sigma = \hat{\sigma} \frac {\Gamma(N)\sqrt{N}} {\Gamma(N + \frac {1} {2})} = \hat{\sigma} \frac {4^N N!(N-1)!\sqrt{N}} {(2N)!\sqrt{\pi}}[3] (see the numerical sketch below).
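
A minimal numerical sketch of both estimators, using NumPy and SciPy (the true sigma, the sample size, and the random seed are arbitrary; the log-gamma function is used only for numerical stability at larger N):

    import numpy as np
    from scipy.special import gammaln

    rng = np.random.default_rng(1)
    sigma_true, N = 2.0, 50                     # arbitrary example values

    # Simulate a Rayleigh sample via two independent Gaussian components.
    x = np.hypot(rng.normal(0, sigma_true, N), rng.normal(0, sigma_true, N))

    sigma2_hat = np.sum(x**2) / (2 * N)         # unbiased ML estimate of sigma^2
    sigma_hat = np.sqrt(sigma2_hat)             # biased estimate of sigma

    # Bias-correction factor Gamma(N) sqrt(N) / Gamma(N + 1/2).
    correction = np.exp(gammaln(N) - gammaln(N + 0.5)) * np.sqrt(N)
    sigma_corrected = sigma_hat * correction

    print(sigma2_hat, sigma_hat, sigma_corrected)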

Confidence intervals

To find the (1 − α) confidence interval, first find the two numbers \chi_1^2, \ \chi_2^2 where:

  Pr(\chi^2(2N) \leq \chi_1^2) = \alpha/2, \quad Pr(\chi^2(2N) \leq \chi_2^2) = 1 - \alpha/2

then

  \frac{N\overline{x^2}}{\chi_2^2} \leq \widehat{\sigma}^2 \leq \frac{N\overline{x^2}}{\chi_1^2}[4]
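
A minimal sketch of this interval, using SciPy's chi-squared quantile function (the true sigma, the sample size, and the confidence level are arbitrary example values):

    import numpy as np
    from scipy.stats import chi2

    rng = np.random.default_rng(2)
    sigma_true, N, alpha = 2.0, 50, 0.05        # arbitrary example values

    x = np.hypot(rng.normal(0, sigma_true, N), rng.normal(0, sigma_true, N))
    mean_sq = np.mean(x**2)                     # the sample mean of x_i^2

    # Chi-squared quantiles with 2N degrees of freedom.
    chi2_lo = chi2.ppf(alpha / 2, 2 * N)
    chi2_hi = chi2.ppf(1 - alpha / 2, 2 * N)

    lower = N * mean_sq / chi2_hi               # lower bound of the 95% interval
    upper = N * mean_sq / chi2_lo               # upper bound of the 95% interval
    print(lower, sigma_true**2, upper)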

Generating random variates

Given a random variate U drawn from the uniform distribution on the interval (0, 1), the variate

X=\sigma\sqrt{-2 \ln(U)}\,

has a Rayleigh distribution with parameter \sigma. This is obtained by applying the inverse transform sampling method.
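
A minimal sketch of this sampler in NumPy (sigma, the sample size, and the seed are arbitrary; the sample mean is compared with \sigma\sqrt{\pi/2} as a sanity check):

    import numpy as np

    rng = np.random.default_rng(3)
    sigma, n = 1.0, 100_000                     # arbitrary example values

    u = rng.uniform(size=n)                     # U ~ Uniform(0, 1)
    x = sigma * np.sqrt(-2.0 * np.log(u))       # inverse transform sampling

    print(x.mean(), sigma * np.sqrt(np.pi / 2)) # both ~1.2533 for sigma = 1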

Related distributions

If R \sim \mathrm{Rayleigh}(1), then R^2 has a chi-squared distribution with two degrees of freedom: \left[Q=R^2\right] \sim \chi^2(2)\ .
If the R_i \sim \mathrm{Rayleigh}(\sigma) are independent, then \left[Y=\sum_{i=1}^N R_i^2\right] \sim \Gamma(N,2\sigma^2), a gamma distribution with shape N and scale 2\sigma^2 (see the simulation sketch below).
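
A simulation check of the second relation, using SciPy's gamma distribution and a Kolmogorov–Smirnov test (the values of sigma, N, and the number of trials are arbitrary):

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    sigma, N, trials = 1.5, 5, 20_000           # arbitrary example values

    # 'trials' independent draws of Y, each a sum of N squared Rayleigh variables.
    r = sigma * np.sqrt(-2.0 * np.log(rng.uniform(size=(trials, N))))
    y = np.sum(r**2, axis=1)

    # Compare against a gamma distribution with shape N and scale 2*sigma^2.
    ks = stats.kstest(y, stats.gamma(a=N, scale=2 * sigma**2).cdf)
    print(ks.pvalue)                            # typically well above 0.05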

Applications

An application of the estimation of σ can be found in magnetic resonance imaging (MRI). Because MRI images are recorded as complex images but most often viewed as magnitude images, the background data are Rayleigh distributed. Hence, the above formula can be used to estimate the noise variance in an MRI image from background data.[6][7]


References

  1. "The Wave Theory of Light", Encyclopedic Britannica 1888; "The Problem of the Random Walk", Nature 1905 vol.72 p.318
  2. 1 2 Papoulis, Athanasios; Pillai, S. (2001) Probability, Random Variables and Stochastic Processe. ISBN 0073660116, ISBN 9780073660110
  3. Siddiqui, M. M. (1964) "Statistical inference for Rayleigh distributions", The Journal of Research of the National Bureau of Standards, Sec. D: Radio Science, Vol. 68D, No. 9, p. 1007
  4. Siddiqui, M. M. (1961) "Some Problems Connected With Rayleigh Distributions", The Journal of Research of the National Bureau of Standards, Sec. D: Radio Propagation, Vol. 66D, No. 2, p. 169
  5. Hogema, Jeroen (2005) "Shot group statistics"
  6. Sijbers J., den Dekker A. J., Raman E. and Van Dyck D. (1999) "Parameter estimation from magnitude MR images", International Journal of Imaging Systems and Technology, 10(2), 109114
  7. den Dekker A. J., Sijbers J., (2014) "Data distributions in magnetic resonance images: a review", Physica Medica,
  8. http://physicspages.com/2012/12/24/coordinate-transformations-the-jacobian-determinant/