Multivariate regression model selection with KIC for extrapolation cases

Abd Krim Seghouane*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

The Kullback Information Criterion, KIC, and its multivariate bias-corrected version, KICVC, are two alternatively developed criteria for model selection. Both criteria can be viewed as estimators of the expected Kullback symmetric divergence. In this paper, a new criterion is proposed in order to select a well-fitted model for an extrapolation case. The proposed criterion is named PKIC, where "P" stands for prediction, and is derived as an exact unbiased estimator of an adapted cost function based on the Kullback symmetric divergence and the future design matrix. PKIC is an unbiased estimator of its cost function under the assumption that the true model is correctly specified or overfitted. A simulation study is presented illustrating that model selection with PKIC performs well for some extrapolation cases.
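The paper itself is not reproduced here, but the baseline criterion it builds on can be illustrated. A commonly cited form of KIC for a model with k estimated parameters is −2 log L + 3k (Cavanaugh's symmetric-divergence criterion, which KICVC and PKIC refine). The following is a minimal sketch, not the authors' implementation: it assumes this penalty form, a Gaussian linear regression model, and hypothetical helper names.

```python
import numpy as np

def kic(y, X):
    """KIC = -2 log L + 3k for a Gaussian linear regression model.

    Assumes the -2 log L + 3k form of the criterion; k counts the
    regression coefficients plus the noise-variance parameter.
    """
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    sigma2 = rss / n                                  # ML estimate of the noise variance
    neg2_log_l = n * (np.log(2 * np.pi * sigma2) + 1)  # -2 log L at the ML estimates
    return neg2_log_l + 3 * (p + 1)

def select_model(y, designs):
    """Return the index of the candidate design matrix minimizing KIC."""
    return int(np.argmin([kic(y, X) for X in designs]))
```

A typical use is to score a nested family of design matrices (e.g. polynomial orders) and keep the minimizer; PKIC, as described in the abstract, additionally folds the future design matrix into the cost so the choice targets extrapolation rather than in-sample fit.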

Original language: English
Title of host publication: Proceedings of the International Joint Conference on Neural Networks, IJCNN 2005
Pages: 1292-1295
Number of pages: 4
DOIs
Publication status: Published - 2005
Externally published: Yes
Event: International Joint Conference on Neural Networks, IJCNN 2005 - Montreal, QC, Canada
Duration: 31 Jul 2005 - 4 Aug 2005

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2

Conference

Conference: International Joint Conference on Neural Networks, IJCNN 2005
Country/Territory: Canada
City: Montreal, QC
Period: 31/07/05 - 4/08/05
