TY - JOUR
T1 - Tuning Parameter Selection for the Adaptive Lasso Using ERIC
AU - Hui, Francis K.C.
AU - Warton, David I.
AU - Foster, Scott D.
N1 - Publisher Copyright:
© 2015, American Statistical Association.
PY - 2015/1/2
Y1 - 2015/1/2
N2 - The adaptive Lasso is a commonly applied penalty for variable selection in regression modeling. Like all penalties though, its performance depends critically on the choice of the tuning parameter. One method for choosing the tuning parameter is via information criteria, such as those based on AIC and BIC. However, these criteria were developed for use with unpenalized maximum likelihood estimators, and it is not clear that they take into account the effects of penalization. In this article, we propose the extended regularized information criterion (ERIC) for choosing the tuning parameter in adaptive Lasso regression. ERIC extends the BIC to account for the effect of applying the adaptive Lasso on the bias-variance tradeoff. This leads to a criterion whose penalty for model complexity is itself a function of the tuning parameter. We show that the tuning parameter chosen by ERIC is selection consistent when the number of variables grows with sample size, and that this consistency holds in a wider range of contexts compared to using BIC to choose the tuning parameter. Simulations show that ERIC can significantly outperform BIC and other proposed information criteria (for choosing the tuning parameter) in selecting the true model. For ultra high-dimensional data (p > n), we consider a two-stage approach combining sure independence screening with adaptive Lasso regression using ERIC, which is selection consistent and performs strongly in simulation. Supplementary materials for this article are available online.
KW - BIC
KW - Consistency
KW - High-dimensional data
KW - Information criteria
KW - Penalized likelihood
KW - Regularization parameter
KW - Variable selection
UR - http://www.scopus.com/inward/record.url?scp=84928252356&partnerID=8YFLogxK
U2 - 10.1080/01621459.2014.951444
DO - 10.1080/01621459.2014.951444
M3 - Article
SN - 0162-1459
VL - 110
SP - 262
EP - 269
JO - Journal of the American Statistical Association
JF - Journal of the American Statistical Association
IS - 509
ER -