Logistic distribution

Logistic

[Plots of the probability density function and the cumulative distribution function]

Parameters    μ location (real); s > 0 scale (real)
Support       x ∈ (−∞, ∞)
PDF           \frac{e^{-\frac{x-\mu}{s}}} {s\left(1+e^{-\frac{x-\mu}{s}}\right)^2}
CDF           \frac{1}{1+e^{-\frac{x-\mu}{s}}}
Mean          \mu
Median        \mu
Mode          \mu
Variance      \tfrac{s^2 \pi^2}{3}
Skewness      0
Ex. kurtosis  1.2
Entropy       \ln(s) + 2 = \ln(\sigma) + 1.404576, where σ is the standard deviation
MGF           e^{\mu t}\operatorname{B}(1-st,\, 1+st) for st ∈ (−1, 1), where B is the Beta function
CF            e^{it\mu}\frac{\pi st}{\sinh(\pi st)}

In probability theory and statistics, the logistic distribution is a continuous probability distribution. Its cumulative distribution function is the logistic function, which appears in logistic regression and feedforward neural networks. It resembles the normal distribution in shape but has heavier tails (higher kurtosis). The Tukey lambda distribution can be considered a generalization of the logistic distribution since it adds a shape parameter, λ (the Tukey distribution is logistic when λ is zero).

Specification

Probability density function

The probability density function (pdf) of the logistic distribution is given by:

f(x; \mu,s) = \frac{e^{\frac{x-\mu}{s}}} {s\left(1+e^{\frac{x-\mu}{s}}\right)^2} =\frac{1}{4s} \operatorname{sech}^2\!\left(\frac{x-\mu}{2s}\right).

Because the pdf can be expressed in terms of the square of the hyperbolic secant function "sech", it is sometimes referred to as the sech-square(d) distribution.[1]

See also: hyperbolic secant distribution
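
As a quick numerical illustration (not part of the original text), the two expressions for the density can be checked against each other and against SciPy's built-in logistic distribution; the values μ = 2.0 and s = 1.5 below are arbitrary.

    import numpy as np
    from scipy.stats import logistic

    mu, s = 2.0, 1.5                  # arbitrary location and scale for the check
    x = np.linspace(-10, 14, 201)

    # Exponential form of the pdf
    pdf_exp = np.exp((x - mu) / s) / (s * (1 + np.exp((x - mu) / s)) ** 2)

    # Equivalent sech^2 form
    pdf_sech = (1 / (4 * s)) / np.cosh((x - mu) / (2 * s)) ** 2

    assert np.allclose(pdf_exp, pdf_sech)
    assert np.allclose(pdf_exp, logistic.pdf(x, loc=mu, scale=s))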

Cumulative distribution function

The logistic distribution receives its name from its cumulative distribution function (cdf), which is an instance of the family of logistic functions. The cumulative distribution function of the logistic distribution is also a scaled version of the hyperbolic tangent.

F(x; \mu, s) = \frac{1}{1+e^{-\frac{x-\mu}{s}}} = \frac12 + \frac12 \;\operatorname{tanh}\!\left(\frac{x-\mu}{2s}\right).

In this equation, x is the random variable, μ is the mean, and s is a scale parameter proportional to the standard deviation.
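
A similar numerical check (again with arbitrary μ = 2.0 and s = 1.5) confirms that the logistic-function form and the scaled tanh form of the cdf agree:

    import numpy as np
    from scipy.stats import logistic

    mu, s = 2.0, 1.5
    x = np.linspace(-10, 14, 201)

    cdf_logit = 1 / (1 + np.exp(-(x - mu) / s))          # logistic-function form
    cdf_tanh = 0.5 + 0.5 * np.tanh((x - mu) / (2 * s))   # scaled tanh form

    assert np.allclose(cdf_logit, cdf_tanh)
    assert np.allclose(cdf_logit, logistic.cdf(x, loc=mu, scale=s))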

Quantile function

The inverse cumulative distribution function (quantile function) of the logistic distribution is a generalization of the logit function. Its derivative is called the quantile density function. They are defined as follows:

Q(p;\mu,s) = \mu + s\,\ln\left(\frac{p}{1-p}\right).
Q'(p;s) = \frac{s}{p(1-p)}.
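
Because the quantile function is available in closed form, logistic variates can be drawn by inverse-transform sampling. A minimal sketch (the parameter values are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    mu, s = 2.0, 1.5

    p = rng.uniform(size=1_000_000)           # U(0, 1) draws
    samples = mu + s * np.log(p / (1 - p))    # Q(p; mu, s)

    print(samples.mean())                     # close to mu
    print(samples.var())                      # close to s**2 * pi**2 / 3

The sample mean and variance should be close to μ and s²π²/3, respectively.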

Alternative parameterization

An alternative parameterization of the logistic distribution can be derived by expressing the scale parameter, s, in terms of the standard deviation, \sigma, using the substitution s\,=\,q\,\sigma, where q\,=\,\sqrt{3}/{\pi}\,\approx\,0.551329. The alternative forms of the above functions then follow by direct substitution.
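
For example, substituting s = \sigma\sqrt{3}/\pi into the density and distribution functions gives (a direct substitution, stated here for convenience rather than quoted from the original):

f(x; \mu,\sigma) = \frac{\pi}{4\sigma\sqrt{3}}\,\operatorname{sech}^2\!\left(\frac{\pi (x-\mu)}{2\sigma\sqrt{3}}\right),

F(x; \mu,\sigma) = \frac12 + \frac12\,\operatorname{tanh}\!\left(\frac{\pi (x-\mu)}{2\sigma\sqrt{3}}\right).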

Applications

The logistic distribution—and the S-shaped pattern of its cumulative distribution function (the logistic function) and quantile function (the logit function)—have been extensively used in many different areas. One of the most common applications is in logistic regression, which is used for modeling categorical dependent variables (e.g., yes-no choices or a choice of 3 or 4 possibilities), much as standard linear regression is used for modeling continuous variables (e.g., income or population). Specifically, logistic regression models can be phrased as latent variable models with error variables following a logistic distribution. This phrasing is common in the theory of discrete choice models, where the logistic distribution plays the same role in logistic regression as the normal distribution does in probit regression. Indeed, the logistic and normal distributions have a quite similar shape. However, the logistic distribution has heavier tails, which often increases the robustness of analyses based on it compared with using the normal distribution.
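
The latent-variable phrasing can be illustrated with a small simulation (an illustrative sketch; the coefficients and sample size are arbitrary): drawing a latent utility with standard-logistic errors and thresholding it at zero reproduces choice probabilities given by the logistic cdf.

    import numpy as np
    from scipy.stats import logistic

    rng = np.random.default_rng(0)
    beta0, beta1 = -0.5, 1.2            # arbitrary regression coefficients
    x = rng.normal(size=200_000)        # a single covariate

    # Standard-logistic errors drawn via the quantile function (logit of a uniform)
    u = rng.uniform(size=x.size)
    eps = np.log(u / (1 - u))

    # Latent-variable model: observe y = 1 exactly when the latent utility is positive
    y = (beta0 + beta1 * x + eps > 0).astype(int)

    # Empirical P(y = 1) for x near 0.5 versus the logistic-cdf prediction
    window = np.abs(x - 0.5) < 0.05
    print(y[window].mean())                    # empirical frequency
    print(logistic.cdf(beta0 + beta1 * 0.5))   # model-implied probability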

Other applications:

[Figure: cumulative logistic distribution fitted to October rainfalls using CumFreq; see also Distribution fitting.]

Both the United States Chess Federation and FIDE have switched their formulas for calculating chess ratings from the normal distribution to the logistic distribution; see Elo rating system.

The logistic distribution arises as the limiting distribution of a finite-velocity damped random motion described by a telegraph process in which the random times between consecutive velocity changes have independent exponential distributions with linearly increasing parameters.[4]

Related distributions

If X \sim \mathrm{Exponential}(1), then \mu+\beta\log \left(e^{X} -1 \right) \sim \mathrm{Logistic}(\mu,\beta).
If X, Y \sim \mathrm{Exponential}(1) independently, then \mu-\beta\log\left(\frac{X}{Y}\right) \sim \mathrm{Logistic}(\mu,\beta).
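
As a quick sanity check (an illustrative sketch, not part of the original list), the second relation can be verified by comparing the empirical distribution of μ − β log(X/Y) against the Logistic(μ, β) cdf:

    import numpy as np
    from scipy.stats import logistic, kstest

    rng = np.random.default_rng(0)
    mu, beta = 1.0, 2.0                     # arbitrary location and scale

    x = rng.exponential(size=100_000)       # X ~ Exponential(1)
    y = rng.exponential(size=100_000)       # Y ~ Exponential(1), independent of X
    z = mu - beta * np.log(x / y)

    # Kolmogorov-Smirnov test against Logistic(mu, beta); a large p-value is expected
    print(kstest(z, logistic(loc=mu, scale=beta).cdf))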

Derivations

Higher order moments

The n-th order central moment can be expressed in terms of the quantile function:

\operatorname{E}[(X-\mu)^n] = \int_{-\infty}^\infty (x-\mu)^n dF(x) = \int_0^1\big(Q(p)-\mu\big)^n dp = s^n \int_0^1 \left[\ln\!\left(\frac{p}{1-p}\right)\right]^n dp.

This integral is well-known[5] and can be expressed in terms of Bernoulli numbers:

    \operatorname{E}[(X-\mu)^n] = s^n\pi^n(2^n-2)\cdot|B_n|.
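
A short numerical check of this formula (an illustrative sketch; the Bernoulli numbers B_2 = 1/6 and B_4 = −1/30 are hard-coded) compares the closed form with direct numerical integration of the central moments; odd-order central moments vanish by symmetry.

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import logistic

    mu, s = 0.0, 1.5                       # central moments do not depend on mu
    bernoulli = {2: 1 / 6, 4: -1 / 30}     # B_2 and B_4

    for n, b in bernoulli.items():
        closed_form = s**n * np.pi**n * (2**n - 2) * abs(b)
        numerical, _ = quad(lambda x, n=n: (x - mu)**n * logistic.pdf(x, loc=mu, scale=s),
                            -np.inf, np.inf)
        print(n, closed_form, numerical)   # the two values should agree

For n = 2 this reproduces the variance s²π²/3, and for n = 4 it gives 7s⁴π⁴/15, consistent with the excess kurtosis of 1.2.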

Notes

  1. Johnson, Kotz & Balakrishnan (1995), p. 116.
  2. Ritzema, H.P., ed. (1994). Frequency and Regression Analysis (PDF). Chapter 6 in: Drainage Principles and Applications, Publication 16. Wageningen, The Netherlands: International Institute for Land Reclamation and Improvement (ILRI). pp. 175–224. ISBN 90-70754-33-9.
  3. Davies, John H. (1998). The Physics of Low-dimensional Semiconductors: An Introduction. Cambridge University Press. ISBN 9780521484916.
  4. Di Crescenzo, A.; Martinucci, B. (2010). "A damped telegraph random process with logistic stationary distribution". Journal of Applied Probability. 47: 84–96.
  5. OEIS sequence A001896.
