SPARSE SLICED INVERSE REGRESSION VIA CHOLESKY MATRIX PENALIZATION

Linh H. Nghiem*, Francis K.C. Hui, Samuel Müller, A. H. Welsh

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

We introduce a new sparse sliced inverse regression estimator called Cholesky matrix penalization, and its adaptive version, for achieving sparsity when estimating the directions of the central subspace. The new estimators use the Cholesky decomposition of the covariance matrix of the covariates and include a regularization term in the objective function to achieve sparsity in a computationally efficient manner. We establish the theoretical values of the tuning parameters that achieve estimation and variable selection consistency for the central subspace. Furthermore, we propose a new projection information criterion to select the tuning parameter for our proposed estimators, and prove that the new criterion facilitates selection consistency. The Cholesky matrix penalization estimator inherits the advantages of the matrix lasso and the lasso sliced inverse regression estimator. Furthermore, it shows superior performance in numerical studies and can be extended to other sufficient dimension reduction methods in the literature.
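To make the computational idea in the abstract concrete, the following is a minimal sketch, assuming a standard single-index sliced inverse regression (SIR) setup: standardize the covariates via the Cholesky factor of their sample covariance, run classical SIR on the standardized scale, then recover sparse directions on the original scale through an l1-penalized least-squares back-transformation. The function name sparse_sir_cholesky, the use of scikit-learn's Lasso, and the fixed tuning value lam are illustrative assumptions; this is not the authors' exact objective, adaptive variant, or projection information criterion for tuning-parameter selection.

import numpy as np
from sklearn.linear_model import Lasso

def sparse_sir_cholesky(X, y, n_slices=10, n_directions=1, lam=0.05):
    """Illustrative sparse SIR via a Cholesky reparameterization (hypothetical sketch)."""
    n, p = X.shape
    Xc = X - X.mean(axis=0)

    # Cholesky factor of the sample covariance: Sigma = L @ L.T
    # (assumes Sigma is positive definite)
    Sigma = np.cov(Xc, rowvar=False)
    L = np.linalg.cholesky(Sigma)

    # Standardized covariates Z_i = L^{-1} x_i (applied row-wise)
    Z = np.linalg.solve(L, Xc.T).T

    # Slice the response and form the SIR kernel Cov(E[Z | Y])
    edges = np.quantile(y, np.linspace(0, 1, n_slices + 1))
    slice_id = np.clip(np.searchsorted(edges, y, side="right") - 1, 0, n_slices - 1)
    kernel = np.zeros((p, p))
    for h in range(n_slices):
        idx = slice_id == h
        if not idx.any():
            continue
        m_h = Z[idx].mean(axis=0)
        kernel += idx.mean() * np.outer(m_h, m_h)

    # Leading eigenvectors of the kernel give the SIR directions on the
    # standardized scale
    eigvals, eigvecs = np.linalg.eigh(kernel)
    eta = eigvecs[:, ::-1][:, :n_directions]

    # Sparse back-transformation: solve L.T @ beta ~ eta column by column
    # with an l1 penalty so that irrelevant covariates are zeroed out
    B = np.zeros((p, n_directions))
    for j in range(n_directions):
        fit = Lasso(alpha=lam, fit_intercept=False, max_iter=10000)
        fit.fit(L.T, eta[:, j])
        B[:, j] = fit.coef_
    return B

In the paper, the tuning parameter would instead be chosen by the proposed projection information criterion, and an adaptive (weighted) penalty is also developed; both are omitted from this sketch for brevity.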

Original language: English
Pages (from-to): 2431-2453
Number of pages: 23
Journal: Statistica Sinica
Volume: 32
DOIs
Publication status: Published - 2022
