Hannan–Quinn information criterion


In statistics, the Hannan–Quinn information criterion (HQC) is a criterion for model selection. It is an alternative to the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). It is given as

HQC = -2 L_max + 2k ln(ln n),

where L_max is the log-likelihood, k is the number of parameters, and n is the number of observations.
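The formula above is straightforward to compute. A minimal sketch in Python (the function name is illustrative, not a standard library API):

```python
import math

def hannan_quinn(log_likelihood, k, n):
    """Hannan–Quinn information criterion: HQC = -2*L_max + 2*k*ln(ln(n)).

    log_likelihood: maximized log-likelihood of the fitted model (L_max)
    k: number of estimated parameters
    n: number of observations (must exceed e, so that ln(ln(n)) is defined)

    Lower values indicate a preferred model.
    """
    return -2.0 * log_likelihood + 2.0 * k * math.log(math.log(n))
```

For comparison, AIC penalizes each parameter by 2 and BIC by ln(n); the HQC penalty of 2 ln(ln n) per parameter lies between the two once n exceeds e^e (about 15.2), since ln(ln n) > 1 there while 2 ln(ln n) < ln n.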
Burnham & Anderson say that HQC, "while often cited, seems to have seen little use in practice". They also note that HQC, like BIC, but unlike AIC, is not an estimator of Kullback–Leibler divergence. Claeskens & Hjort note that HQC, like BIC, but unlike AIC, is not asymptotically efficient; however, it misses the optimal estimation rate by only a very small ln(ln n) factor. They further point out that whatever method is used for fine-tuning the criterion will be more important in practice than the 2k ln(ln n) term, since this latter number is small even for very large n; however, the ln(ln n) term ensures that, unlike AIC, HQC is strongly consistent. It follows from the law of the iterated logarithm that any strongly consistent method must miss efficiency by at least a ln(ln n) factor, so in this sense HQC is asymptotically very well behaved. Van der Pas and Grünwald prove that model selection based on a modified Bayesian estimator, the so-called switch distribution, in many cases behaves asymptotically like HQC, while retaining the advantages of Bayesian methods such as the use of priors.