Hannan–Quinn information criterion

In statistics, the Hannan–Quinn information criterion (HQC) is a criterion for model selection. It is an alternative to the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). It is given as

 HQC = −2 L_max + 2 k log log n,

where L_max is the maximized log-likelihood of the fitted model, k is the number of estimated parameters, and n is the number of observations. As with AIC and BIC, the model with the lowest HQC value is preferred.
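The formula above can be sketched directly in Python. The numeric values below are hypothetical, chosen only to illustrate comparing two fitted models on the same data set:

```python
import math

def hqc(log_likelihood, k, n):
    """Hannan–Quinn information criterion.

    log_likelihood: maximized log-likelihood of the fitted model (L_max)
    k: number of estimated parameters
    n: number of observations (must exceed e, so that log log n > 0)
    """
    return -2.0 * log_likelihood + 2.0 * k * math.log(math.log(n))

# Hypothetical log-likelihoods for two candidate models fitted to
# the same n = 500 observations; the lower HQC value is preferred.
n = 500
simple = hqc(log_likelihood=-1200.0, k=3, n=n)
richer = hqc(log_likelihood=-1195.0, k=5, n=n)
print(simple, richer)
```

Here the richer model's gain in log-likelihood outweighs its extra penalty of 2 × 2 × log log 500 ≈ 7.3, so it attains the lower HQC.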

Burnham & Anderson (2002, p. 287) say that HQC, "while often cited, seems to have seen little use in practice". They also note that HQC, like BIC, but unlike AIC, is not an estimator of Kullback–Leibler divergence. Claeskens & Hjort (2008, ch. 4) note that HQC, like BIC, but unlike AIC, is not asymptotically efficient, and further point out that whatever method is being used for fine-tuning the criterion will be more important in practice than the term log log n, since this latter number is small even for very large n.
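The slow growth of log log n that Claeskens & Hjort point to is easy to verify numerically; the per-parameter penalty factor 2 log log n barely moves across several orders of magnitude of sample size:

```python
import math

# The HQC per-parameter penalty 2 * log(log(n)) grows extremely
# slowly: it remains a single-digit number even for n = 10^9.
for n in (100, 10_000, 1_000_000, 10**9):
    print(n, round(2 * math.log(math.log(n)), 3))
```

For example, the penalty factor is about 3.05 at n = 100 and still only about 6.06 at n = 10^9.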


This article is issued from Wikipedia (version of Saturday, December 5, 2015). The text is available under the Creative Commons Attribution/Share-Alike License; additional terms may apply for media files.