Influence Diagnostics for High-Dimensional Lasso Regression

Bala Rajaratnam, Steven Roberts*, Doug Sparks, Honglin Yu

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    15 Citations (Scopus)

    Abstract

    The increased availability of high-dimensional data, and the appeal of a “sparse” solution, have made penalized likelihood methods commonplace. Arguably the most widely utilized of these methods is ℓ1 regularization, popularly known as the lasso. When the lasso is applied to high-dimensional data, observations are relatively few; thus, each observation can potentially have tremendous influence on model selection and inference. Hence, a natural question in this context is the identification and assessment of influential observations. We address this by extending the framework for assessing estimation influence in traditional linear regression, and demonstrate that it is equally, if not more, relevant for assessing model selection influence in high-dimensional lasso regression. Within this framework, we propose four new “deletion methods” for gauging the influence of an observation on lasso model selection: df-model, df-regpath, df-cvpath, and df-lambda. Asymptotic cut-offs for each measure, valid even when p → ∞, are developed. We illustrate that in high-dimensional settings, individual observations can have a tremendous impact on lasso model selection. We demonstrate that application of our measures can help reveal relationships in high-dimensional real data that may otherwise remain hidden. Supplementary materials for this article are available online.
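    The deletion idea underlying these measures can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's exact definitions or cut-offs), assuming a df-model-style diagnostic that refits the lasso with one observation removed and counts how many variables enter or leave the selected model; `deletion_model_change` is an illustrative helper name.

    ```python
    import numpy as np
    from sklearn.linear_model import LassoCV

    def deletion_model_change(X, y, i, cv=5, random_state=0):
        """Count support changes when observation i is deleted.

        Hypothetical sketch of a deletion-based model-selection
        diagnostic: fit the cross-validated lasso on the full data and
        on the data with row i removed, then compare the two selected
        variable sets via their symmetric difference.
        """
        full = LassoCV(cv=cv, random_state=random_state).fit(X, y)
        mask = np.ones(len(y), dtype=bool)
        mask[i] = False  # leave one observation out
        loo = LassoCV(cv=cv, random_state=random_state).fit(X[mask], y[mask])
        support_full = set(np.flatnonzero(full.coef_))
        support_loo = set(np.flatnonzero(loo.coef_))
        return len(support_full.symmetric_difference(support_loo))

    # High-dimensional toy data: p > n, one planted gross outlier.
    rng = np.random.default_rng(0)
    n, p = 40, 100
    X = rng.standard_normal((n, p))
    y = 2.0 * X[:, 0] + rng.standard_normal(n)
    y[0] += 25.0  # contaminate observation 0
    print(deletion_model_change(X, y, 0))
    ```

    A large symmetric difference flags an observation whose removal materially changes which variables the lasso selects; the paper's df-regpath, df-cvpath, and df-lambda measures instead compare regularization paths, cross-validation paths, and selected tuning parameters, respectively.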

    Original language: English
    Pages (from-to): 877-890
    Number of pages: 14
    Journal: Journal of Computational and Graphical Statistics
    Volume: 28
    Issue number: 4
    DOIs
    Publication status: Published - 2 Oct 2019
