Abstract
We study the performance of Empirical Risk Minimization in both noisy and noiseless phase retrieval problems, indexed by subsets of ℝⁿ and relative to subgaussian sampling; that is, when the given data is yᵢ = ⟨aᵢ, x₀⟩² + wᵢ for a subgaussian random vector a, independent subgaussian noise w, and a fixed but unknown x₀ that belongs to a given T ⊂ ℝⁿ. We show that ERM performed in T produces x̂ whose Euclidean distance to either x₀ or −x₀ depends on the gaussian mean-width of T and on the signal-to-noise ratio of the problem. The bound coincides with the one for linear regression when ‖x₀‖₂ is of the order of a constant. In addition, we obtain a sharp lower bound for the phase retrieval problem. As examples, we study the class of d-sparse vectors in ℝⁿ and the unit ball in ℓ₁ⁿ.
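For concreteness, the ERM procedure described above can be written with the squared loss as follows (a sketch; the paper's exact normalization and constants may differ):

```latex
% ERM over the class T for phase retrieval with the squared loss
% (sketch; normalization may differ from the paper)
\hat{x} \in \operatorname*{argmin}_{t \in T}
  \frac{1}{N} \sum_{i=1}^{N} \bigl( y_i - \langle a_i, t \rangle^2 \bigr)^2,
\qquad
y_i = \langle a_i, x_0 \rangle^2 + w_i .
```

Since ⟨a, t⟩² = ⟨a, −t⟩², the target is identifiable only up to sign, which is why the error bound is stated in terms of the distance from x̂ to either x₀ or −x₀.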
| Original language | English |
|---|---|
| Article number | 57 |
| Pages (from-to) | 1-27 |
| Number of pages | 27 |
| Journal | Electronic Journal of Probability |
| Volume | 20 |
| DOIs | |
| Publication status | Published - 2015 |
| Externally published | Yes |
Title: Minimax rate of convergence and the performance of empirical risk minimization in phase retrieval