Improved bounds for sparse recovery from subsampled random convolutions

Shahar Mendelson*, Holger Rauhut, Rachel Ward

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    14 Citations (Scopus)

    Abstract

    We study the recovery of sparse vectors from subsampled random convolutions via ℓ1-minimization. We consider the setup in which both the subsampling locations and the generating vector are chosen at random. For a sub-Gaussian generator with independent entries, we improve previously known estimates: if the sparsity s is small enough, that is, s ≤ √n / log(n), we show that m ≥ s log(en/s) measurements are sufficient to recover s-sparse vectors in dimension n with high probability, matching the well-known condition for recovery from standard Gaussian measurements. If s is larger, then essentially m ≥ s log²(s) log(log(s)) log(n) measurements are sufficient, again improving over previous estimates. Our results are shown via the so-called robust null space property, which is weaker than the standard restricted isometry property. Our method of proof involves a novel combination of small ball estimates with chaining techniques, which should be of independent interest.
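    The measurement model in the abstract can be illustrated numerically: a circulant matrix generated by a random Gaussian vector is subsampled at m random rows, and an s-sparse vector is recovered by ℓ1-minimization (basis pursuit). The sketch below, with illustrative values of n, s, and m chosen by the editor (not taken from the paper), solves basis pursuit as a linear program via the standard splitting x = u − v with u, v ≥ 0.

    ```python
    import numpy as np
    from scipy.linalg import circulant
    from scipy.optimize import linprog

    rng = np.random.default_rng(0)

    n, s = 64, 3   # ambient dimension and sparsity (illustrative values)
    m = 30         # number of subsampled measurements, comfortably above s*log(en/s)

    # Random sub-Gaussian generator (here: Gaussian) and random subsampling locations
    g = rng.standard_normal(n)
    omega = rng.choice(n, size=m, replace=False)
    A = circulant(g)[omega, :]          # subsampled random convolution matrix

    # s-sparse ground truth and its measurements
    x0 = np.zeros(n)
    support = rng.choice(n, size=s, replace=False)
    x0[support] = rng.standard_normal(s)
    y = A @ x0

    # Basis pursuit  min ||x||_1  s.t.  Ax = y,  as a linear program:
    # write x = u - v with u, v >= 0 and minimize 1^T (u + v)
    c = np.ones(2 * n)
    A_eq = np.hstack([A, -A])
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n))
    x_hat = res.x[:n] - res.x[n:]

    print("max recovery error:", np.max(np.abs(x_hat - x0)))
    ```

    With these parameters the recovery error is typically at machine-precision level, consistent with the regime m ≥ s log(en/s) discussed in the abstract; the LP splitting is a standard reformulation of ℓ1-minimization, not the proof technique of the paper.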

    Original language: English
    Pages (from-to): 3491-3527
    Number of pages: 37
    Journal: Annals of Applied Probability
    Volume: 28
    Issue number: 6
    Publication status: Published - Dec 2018

