A novel AIC variant for linear regression models based on a bootstrap correction

Abd Krim Seghouane*

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    1 Citation (Scopus)

    Abstract

    The Akaike information criterion, AIC, and its corrected version, AICc, are two methods for selecting normal linear regression models. Both criteria were designed as estimators of the expected Kullback-Leibler information between the model generating the data and the approximating candidate model. In this paper, a new corrected variant of AIC is derived for the purpose of small-sample linear regression model selection. The proposed variant of AIC is based on an asymptotic approximation of bootstrap-type estimates of the Kullback-Leibler information. Simulation results are presented which illustrate the better performance of the proposed AIC correction, applied to polynomial regression, in comparison with AIC, AICc and other criteria. Asymptotic justifications for the proposed criterion are provided in the Appendix.
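    The general idea described in the abstract can be illustrated with a small sketch. The code below computes AIC, AICc, and a generic pairs-bootstrap bias-corrected criterion (in the spirit of bootstrap estimates of the Kullback-Leibler discrepancy) for Gaussian linear regression. This is an illustrative approximation of the bootstrap-correction idea, not the specific criterion derived in the paper; the function names and the choice of a pairs bootstrap are assumptions for the example.

    ```python
    import numpy as np

    def fit_ols(X, y):
        """OLS fit; returns (beta_hat, sigma2_mle) with sigma2 = RSS / n (the MLE)."""
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return beta, resid @ resid / len(y)

    def gauss_loglik(X, y, beta, sigma2):
        """Gaussian log-likelihood of (X, y) evaluated at parameters (beta, sigma2)."""
        n = len(y)
        resid = y - X @ beta
        return -0.5 * n * np.log(2 * np.pi * sigma2) - (resid @ resid) / (2 * sigma2)

    def aic(X, y):
        """Classical AIC: -2 * max log-likelihood + 2 * (number of parameters)."""
        beta, s2 = fit_ols(X, y)
        k = X.shape[1] + 1  # regression coefficients plus the noise variance
        return -2 * gauss_loglik(X, y, beta, s2) + 2 * k

    def aicc(X, y):
        """Small-sample corrected AIC (Hurvich-Tsai correction term added to AIC)."""
        n, p = X.shape
        k = p + 1
        return aic(X, y) + 2 * k * (k + 1) / (n - k - 1)

    def bootstrap_aic(X, y, B=200, seed=None):
        """Generic bootstrap-bias-corrected criterion (illustrative, not the paper's):
        -2 * max log-likelihood + 2 * bias_hat, where bias_hat is a pairs-bootstrap
        estimate of the optimism of the in-sample log-likelihood."""
        rng = np.random.default_rng(seed)
        n = len(y)
        beta, s2 = fit_ols(X, y)
        bias = 0.0
        for _ in range(B):
            idx = rng.integers(0, n, n)           # resample (x_i, y_i) pairs
            Xb, yb = X[idx], y[idx]
            bb, s2b = fit_ols(Xb, yb)
            # optimism: fit on the bootstrap sample, compare its in-sample
            # log-likelihood with its log-likelihood on the original data
            bias += gauss_loglik(Xb, yb, bb, s2b) - gauss_loglik(X, y, bb, s2b)
        return -2 * gauss_loglik(X, y, beta, s2) + 2 * bias / B
    ```

    For polynomial regression model selection as in the paper's simulations, one would build `X = np.vander(x, degree + 1, increasing=True)` for each candidate degree and pick the degree minimizing the chosen criterion.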

    Original language: English
    Title of host publication: Proceedings of the 2008 IEEE Workshop on Machine Learning for Signal Processing, MLSP 2008
    Pages: 139-144
    Number of pages: 6
    DOIs
    Publication status: Published - 2008
    Event: 2008 IEEE Workshop on Machine Learning for Signal Processing, MLSP 2008 - Cancun, Mexico
    Duration: 16 Oct 2008 - 19 Oct 2008

    Publication series

    Name: Proceedings of the 2008 IEEE Workshop on Machine Learning for Signal Processing, MLSP 2008

    Conference

    Conference: 2008 IEEE Workshop on Machine Learning for Signal Processing, MLSP 2008
    Country/Territory: Mexico
    City: Cancun
    Period: 16/10/08 - 19/10/08

