Regularization and the small-ball method I: Sparse recovery

Guillaume Lecué, Shahar Mendelson

Research output: Contribution to journal › Article › peer-review

46 Citations (Scopus)

Abstract

We obtain bounds on estimation error rates for regularization procedures of the form

f̂ ∈ argmin_{f ∈ F} ( (1/N) ∑_{i=1}^{N} (Y_i − f(X_i))² + λ Ψ(f) )

when Ψ is a norm and F is convex. Our approach gives a common framework that may be used in the analysis of learning problems and regularization problems alike. In particular, it sheds some light on the role various notions of sparsity have in regularization and on their connection with the size of subdifferentials of Ψ in a neighborhood of the true minimizer. As "proof of concept" we extend the known estimates for the LASSO, SLOPE and trace-norm regularization.
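As a hedged illustration of the regularization procedure analyzed in the abstract, the sketch below instantiates it with Ψ equal to the ℓ1 norm (the LASSO case) over linear functions f(x) = ⟨x, b⟩, and minimizes the penalized empirical squared loss by proximal gradient descent (ISTA). The solver choice, step size, and synthetic data are illustrative assumptions, not part of the paper:

```python
import numpy as np

def ista_lasso(X, y, lam, n_iter=500):
    """Sketch: minimize (1/N)*||y - X b||^2 + lam*||b||_1 via ISTA.

    This is the abstract's estimator with F = linear functions and
    Psi = l1 norm; solver and step size are illustrative choices.
    """
    N, d = X.shape
    # step = 1/L, where L = (2/N)*||X||_op^2 is the Lipschitz
    # constant of the gradient of the smooth (squared-loss) part
    step = N / (2.0 * np.linalg.norm(X, 2) ** 2)
    b = np.zeros(d)
    for _ in range(n_iter):
        grad = (2.0 / N) * X.T @ (X @ b - y)   # gradient of empirical squared loss
        z = b - step * grad
        # soft-thresholding: proximal operator of (lam*step)*||.||_1
        b = np.sign(z) * np.maximum(np.abs(z) - lam * step, 0.0)
    return b

# tiny synthetic check: sparse ground truth, noiseless Gaussian design
rng = np.random.default_rng(0)
N, d = 100, 20
X = rng.standard_normal((N, d))
b_true = np.zeros(d)
b_true[:3] = [1.5, -2.0, 1.0]
y = X @ b_true
b_hat = ista_lasso(X, y, lam=0.05)
```

Swapping the soft-thresholding step for the proximal operator of the SLOPE or trace norm yields the other two examples mentioned in the abstract.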

Original language: English
Pages (from-to): 611-641
Number of pages: 31
Journal: Annals of Statistics
Volume: 46
Issue number: 2
Publication status: Published - Apr 2018
Externally published: Yes
