Multivariate regression model selection from small samples using Kullback's symmetric divergence

Abd Krim Seghouane*

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    15 Citations (Scopus)

    Abstract

    The Kullback Information Criterion, KIC, and its univariate bias-corrected version, KICc, are two recently developed criteria for model selection. Both criteria can be viewed as estimators of the expected Kullback symmetric divergence, and each includes a fixed bias-correction term. In this paper, a new small-sample model selection criterion for multivariate regression models is developed. The proposed criterion is named KICvc, where the notation "vc" stands for vector correction, and it can be considered an extension of KIC to multivariate regression models. KICvc adjusts KIC to be an unbiased estimator of a variant of the Kullback symmetric divergence, assuming that the true model is correctly specified or overfitted. Furthermore, KICvc provides better multivariate regression model order or dimension choices than KIC in small samples. Simulation results show that the proposed criterion estimates the model order more accurately than other asymptotically efficient methods when applied to multivariate regression model selection in small samples. As a result, KICvc serves as an effective tool for selecting a multivariate regression model of appropriate dimension. A theoretical justification of the proposed criterion is presented.
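    As a rough illustration of the selection procedure described in the abstract, the sketch below scores candidate multivariate regression models with a KIC-style criterion and keeps the order that minimizes it. The function names (`kic_multivariate`, `select_order`) and the generic penalty of three times the number of estimated parameters are assumptions for illustration only; the paper's exact small-sample correction term in KICvc is not reproduced here.

    ```python
    import numpy as np

    def kic_multivariate(Y, X):
        """
        KIC-style score for a multivariate linear regression Y = X B + E.

        Hypothetical sketch: uses a generic penalty of
            -2 * (maximized log-likelihood) + 3 * (number of estimated parameters),
        not the exact small-sample KICvc correction derived in the paper.
        """
        n, d = Y.shape          # n samples, d response dimensions
        k = X.shape[1]          # number of regressors in the candidate model

        # Least-squares fit and Gaussian ML estimate of the error covariance.
        B_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
        E = Y - X @ B_hat
        Sigma_hat = (E.T @ E) / n

        # Maximized Gaussian log-likelihood (trace term reduces to d at the ML estimate).
        _, logdet = np.linalg.slogdet(Sigma_hat)
        log_lik = -0.5 * n * (d * np.log(2 * np.pi) + logdet + d)

        # Parameter count: k*d regression coefficients + d(d+1)/2 covariance entries.
        n_params = k * d + d * (d + 1) // 2
        return -2.0 * log_lik + 3.0 * n_params

    def select_order(Y, X_full, max_order):
        """Return the regressor count (model order) minimizing the criterion."""
        scores = {k: kic_multivariate(Y, X_full[:, :k]) for k in range(1, max_order + 1)}
        return min(scores, key=scores.get), scores
    ```

    In use, `X_full` would hold the candidate regressors ordered by nesting, and `select_order(Y, X_full, max_order)` returns the dimension with the smallest score; replacing the generic penalty with the paper's KICvc correction is what improves behaviour in small samples.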

    Original language: English
    Pages (from-to): 2074-2084
    Number of pages: 11
    Journal: Signal Processing
    Volume: 86
    Issue number: 8
    DOIs
    Publication status: Published - Aug 2006
