Good–Turing frequency estimation

Good–Turing frequency estimation is a statistical technique for estimating the probability of encountering an object of a hitherto unseen species, given a set of past observations of objects from different species. In drawing balls from an urn, the 'objects' would be balls and the 'species' would be the distinct colors of the balls (finite but unknown in number). After drawing R_\text{red} red balls, R_\text{black} black balls and R_\text{green} green balls, we would ask what the probability is of drawing a red ball, a black ball, a green ball, or a ball of a previously unseen color.

Historical background

Good–Turing frequency estimation was developed by Alan Turing and his assistant I. J. Good as part of their efforts at Bletchley Park to crack German ciphers for the Enigma machine during World War II. Turing at first modeled the frequencies as a multinomial distribution, but found it inaccurate. Good developed smoothing algorithms to improve the estimator's accuracy.

The discovery was recognized as significant when published by Good in 1953,[1] but the calculations were difficult so it was not used as widely as it might have been.[2] The method even gained some literary fame due to the Robert Harris novel Enigma.

In the 1990s, Geoffrey Sampson worked with William A. Gale of AT&T to create and implement a simplified and easier-to-use variant of the Good–Turing method[3][4] described below.

The method

First, some notation is defined. Suppose that X distinct species have been observed, numbered x = 1, \dots, X, and let R_x denote the number of individuals observed for species x. The frequency of frequencies N_r counts how many species were observed exactly r times:

N_r = |\{ x \mid R_x = r \}|

For example, N_1 is the number of species for which only one individual was observed. Note that the total number of objects observed, N, can be found from

N = \sum_{r=1}^\infty r N_r.

The first step in the calculation is to find an estimate of the total probability of unseen species. This estimate is[5]

p_0 = \frac{N_1}{N}.
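
As a minimal Python sketch (the sample data and variable names are invented for illustration, not taken from the cited sources), the table N_r, the total N, and p_0 can be computed directly from a list of observations:

    from collections import Counter

    # Each observed object is labeled with its species (here: ball colors).
    observations = ["red", "red", "black", "green", "red", "black", "blue"]

    species_counts = Counter(observations)            # R_x for each species x
    freq_of_freqs = Counter(species_counts.values())  # N_r: species seen r times

    # Total number of objects: N = sum over r of r * N_r.
    N = sum(r * n_r for r, n_r in freq_of_freqs.items())
    assert N == len(observations)

    # Good-Turing estimate of the total probability of unseen species.
    p0 = freq_of_freqs[1] / N   # here 2/7, since two colors were seen once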

The next step is to find an estimate of the probability for species which were seen r times. For a single species this estimate is:

p_r = \frac{(r+1) S(N_{r+1})}{N S(N_r)}.

To estimate the probability of encountering any species from this group (i.e., the group of species seen r times), one can use the following formula:

\frac{(r+1) S(N_{r+1})}{N}.

Here, the notation S( ) means the smoothed or adjusted value of the frequency shown in parentheses (see also empirical Bayes method). An overview of how to perform this smoothing follows.
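
Continuing the Python sketch above, both estimates can be written directly from these formulas; for illustration the raw counts N_r stand in for the smoothed values S(N_r) (that is, no smoothing is applied here):

    def p_single_species(r, freq_of_freqs, N):
        # Unsmoothed estimate for one species seen r times:
        # (r+1) * N_{r+1} / (N * N_r); real use substitutes S(N_r).
        return (r + 1) * freq_of_freqs.get(r + 1, 0) / (N * freq_of_freqs[r])

    def p_group(r, freq_of_freqs, N):
        # Unsmoothed total mass for all species seen r times:
        # (r+1) * N_{r+1} / N.
        return (r + 1) * freq_of_freqs.get(r + 1, 0) / N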

We would like to make a plot of \log N_r versus \log r, but this is problematic because for large r many N_r will be zero. Instead a revised quantity, \log Z_r, is plotted versus \log r, where Z_r is defined as

Z_r = \frac{N_r}{0.5(t-q)},

and where q, r, and t are consecutive subscripts having N_q, N_r, N_t non-zero. When r is 1, take q to be 0. When r is the last non-zero frequency, take t to be 2r - q.
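
As a sketch in the same style as the earlier code, the Z_r values can be computed as follows:

    def z_values(freq_of_freqs):
        # Z_r = N_r / (0.5 * (t - q)), where q and t are the nearest
        # non-zero subscripts on either side of r (q = 0 for the first r,
        # t = 2r - q for the last).
        rs = sorted(freq_of_freqs)
        z = {}
        for i, r in enumerate(rs):
            q = rs[i - 1] if i > 0 else 0
            t = rs[i + 1] if i + 1 < len(rs) else 2 * r - q
            z[r] = freq_of_freqs[r] / (0.5 * (t - q))
        return z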

The assumption of Good–Turing estimation is that the number of occurrences of each species follows a binomial distribution.[6]
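
To sketch the reasoning behind the estimator under this assumption (a condensed version of the standard derivation, not quoted from the sources): if each R_x is binomial with N trials and success probability p_x, the expected total probability mass of species seen exactly r times is

E\left[ \sum_{x : R_x = r} p_x \right] = \sum_x p_x \binom{N}{r} p_x^r (1 - p_x)^{N - r} = \frac{r + 1}{N + 1} \sum_x \binom{N + 1}{r + 1} p_x^{r + 1} (1 - p_x)^{(N + 1) - (r + 1)} \approx \frac{(r + 1) N_{r + 1}}{N},

where the middle step uses the identity p \binom{N}{r} p^r = \frac{r + 1}{N + 1} \binom{N + 1}{r + 1} p^{r + 1}, and the final sum is the expected value of N_{r+1} in a sample of size N + 1. Setting r = 0 recovers the unseen-species estimate p_0 = N_1 / N.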

A simple linear regression is then fitted to the log–log plot. For small values of r it is reasonable to set S(N_r) = N_r (that is, no smoothing is performed), while for large values of r, values of S(N_r) are read off the regression line. An automatic procedure (not described here) can be used to specify at what point the switch from no smoothing to linear smoothing should take place.[7] Code for the method is available in the public domain.[8]
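
A self-contained Python sketch of this smoothing step (using a plain least-squares fit, and returning smoothed values for every r rather than implementing the automatic switch point):

    import math

    def smoothed_counts(freq_of_freqs):
        # Z_r as defined above.
        rs = sorted(freq_of_freqs)
        z = {}
        for i, r in enumerate(rs):
            q = rs[i - 1] if i > 0 else 0
            t = rs[i + 1] if i + 1 < len(rs) else 2 * r - q
            z[r] = freq_of_freqs[r] / (0.5 * (t - q))
        # Least-squares fit of log Z_r = a + b * log r.
        xs = [math.log(r) for r in rs]
        ys = [math.log(z[r]) for r in rs]
        xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
        b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
        a = ybar - b * xbar
        # S(N_r) read off the regression line; a full implementation
        # keeps the raw N_r for small r and switches to these values
        # only where the line is more reliable.
        return {r: math.exp(a + b * math.log(r)) for r in rs}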

References

  1. Good, I. J. (1953). "The population frequencies of species and the estimation of population parameters". Biometrika 40 (3–4): 237–264. doi:10.1093/biomet/40.3-4.237. JSTOR 2333344. MR 0061330.
  2. Newsise: Scientists Explain and Improve Upon 'Enigmatic' Probability Formula, a popular review of Orlitsky A, Santhanam NP, Zhang J. (2003). "Always Good Turing: asymptotically optimal probability estimation". Science 302 (5644): 427–31. Bibcode:2003Sci...302..427O. doi:10.1126/science.1088284. PMID 14564004.
  3. Sampson, Geoffrey; Gale, William A. (1995). "Good–Turing frequency estimation without tears".
  4. Orlitsky, Alon; Suresh, Ananda (2015). "Competitive distribution estimation: Why is Good-Turing Good?" (PDF). Neural Information Processing Systems (NIPS): 1–9. Retrieved 28 March 2016.
  5. Gale, William A. (1995). "Good–Turing smoothing without tears". Journal of Quantitative Linguistics 2 (3): 3. doi:10.1080/09296179508590051.
  6. Lecture 11: The Good–Turing Estimate. CS 6740, Cornell University, 2010
  7. Church, K.; Gale, W. (1991). "A comparison of the enhanced Good–Turing and deleted estimation methods for estimating probabilities of English bigrams".
  8. Sampson, Geoffrey (2005) Simple Good–Turing Frequency Estimator (code in C)