Vector autoregressive model-order selection from finite samples using Kullback's symmetric divergence

Abd Krim Seghouane*

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    23 Citations (Scopus)

    Abstract

    In this paper, a new small-sample model selection criterion for vector autoregressive (VAR) models is developed. The proposed criterion, named the Kullback information criterion with vector correction (KICvc), can be considered an extension of KIC to VAR models. KICvc adjusts KIC to be an unbiased estimator of a variant of the Kullback symmetric divergence, assuming that the true model is correctly specified or overfitted. As a consequence, KICvc provides better VAR model-order choices than KIC in small samples. Simulation results show that the proposed criterion selects the model order more accurately than other asymptotically efficient methods when applied to VAR model selection in small samples. KICvc therefore serves as an effective tool for selecting a VAR model of appropriate order. A theoretical justification of the proposed criterion is presented.
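
    The exact small-sample penalty of KICvc is derived in the paper itself; what follows is only a minimal NumPy sketch of the general selection procedure the abstract describes: fit VAR(p) models for a range of candidate orders, evaluate a divergence-based criterion for each, and keep the minimizer. The function names, the least-squares fitting route, and the placeholder penalty weight below are assumptions made for illustration, not the paper's KICvc formula.

    import numpy as np

    def fit_var_ls(y, p):
        # Ordinary least-squares fit of a VAR(p) model to y (T x m array),
        # conditioning on the first p observations.
        T, m = y.shape
        Y = y[p:]                                                 # (T - p) x m targets
        X = np.hstack([np.ones((T - p, 1))] +                     # intercept column
                      [y[p - j:T - j] for j in range(1, p + 1)])  # lagged blocks
        B, *_ = np.linalg.lstsq(X, Y, rcond=None)                 # coefficient matrix
        resid = Y - X @ B
        sigma_hat = resid.T @ resid / (T - p)                     # ML-type residual covariance
        return sigma_hat, T - p

    def kic_style_score(y, p, penalty_weight=3.0):
        # Generic KIC-style score: n * ln|Sigma_hat| plus a penalty that grows
        # with the number of estimated parameters. The penalty_weight value is
        # a placeholder, not the corrected penalty derived in the paper.
        sigma_hat, n = fit_var_ls(y, p)
        m = y.shape[1]
        n_params = m * (m * p + 1)                                # coefficients incl. intercepts
        _, logdet = np.linalg.slogdet(sigma_hat)
        return n * logdet + penalty_weight * n_params

    def select_var_order(y, p_max):
        # Evaluate candidate orders 1..p_max and return the minimizer.
        scores = {p: kic_style_score(y, p) for p in range(1, p_max + 1)}
        return min(scores, key=scores.get), scores

    # Example usage on simulated bivariate VAR(2) data.
    rng = np.random.default_rng(0)
    T, m = 80, 2
    A1 = np.array([[0.5, 0.1], [0.0, 0.4]])
    A2 = np.array([[0.2, 0.0], [0.1, 0.2]])
    y = np.zeros((T, m))
    for t in range(2, T):
        y[t] = A1 @ y[t - 1] + A2 @ y[t - 2] + rng.standard_normal(m) * 0.5
    p_hat, _ = select_var_order(y, p_max=6)
    print("selected order:", p_hat)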

    Original language: English
    Pages (from-to): 2327-2335
    Number of pages: 9
    Journal: IEEE Transactions on Circuits and Systems I: Regular Papers
    Volume: 53
    Issue number: 10
    DOIs
    Publication status: Published - Oct 2006
