Abstract
The Kullback information criterion (KIC) and its univariate bias-corrected version (KICc) may be viewed as estimators of the expected Kullback-Leibler symmetric divergence. This correspondence examines the overfitting properties of KIC and KICc through their probabilities of overfitting, both in finite samples and asymptotically. It is shown that KIC and KICc have much smaller probabilities of overfitting than the Akaike information criterion (AIC) and its bias-corrected version, AICc.
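A minimal sketch of why KIC overfits less than AIC: the commonly cited penalty terms (assumed here from the standard literature, not quoted from this paper) charge 3 per parameter for KIC versus 2 for AIC, so adding a spurious parameter must buy a larger likelihood gain to be accepted. The KICc small-sample correction derived in the paper is omitted, as its exact form is not reproduced here.

```python
def aic_penalty(k: int) -> float:
    # AIC charges 2 units per estimated parameter.
    return 2.0 * k

def aicc_penalty(k: int, n: int) -> float:
    # Bias-corrected AIC (Hurvich-Tsai form); n is the sample size.
    return 2.0 * k + 2.0 * k * (k + 1) / (n - k - 1)

def kic_penalty(k: int) -> float:
    # KIC (Cavanaugh's form) charges 3 units per parameter,
    # targeting the symmetric Kullback-Leibler divergence.
    return 3.0 * (k + 1)

def criterion(log_likelihood: float, penalty: float) -> float:
    # Generic information-criterion value: goodness of fit plus penalty.
    return -2.0 * log_likelihood + penalty
```

Because `kic_penalty(k) > aic_penalty(k)` for every `k >= 1`, a candidate model with extra parameters is selected less often under KIC, which is the mechanism behind its smaller overfitting probability.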
| Original language | English |
| --- | --- |
| Pages (from-to) | 3055-3060 |
| Number of pages | 6 |
| Journal | Signal Processing |
| Volume | 86 |
| Issue number | 10 |
| DOIs | |
| Publication status | Published - Oct 2006 |