On the importance of small coordinate projections

Shahar Mendelson, Petra Philips




    It has recently been shown that sharp generalization bounds can be obtained when the function class from which the algorithm chooses its hypotheses is "small", in the sense that the Rademacher averages of this function class are small. We show that a new, more general principle guarantees good generalization bounds. The new principle requires that random coordinate projections of the function class, evaluated on random samples, are "small" with high probability, and that the random class of functions allows symmetrization. As an example, we prove that this geometric property of the function class is exactly the reason why two recently proposed frameworks, luckiness (Shawe-Taylor et al., 1998) and algorithmic luckiness (Herbrich and Williamson, 2002), can be used to establish generalization bounds.
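    To make the central quantity concrete, the following is a minimal illustrative sketch (not from the paper) of a Monte Carlo estimate of the empirical Rademacher average of a finite function class, given its coordinate projections on a fixed sample. The function name and arguments are hypothetical; a real function class would typically be infinite and require a different analysis.

```python
import random

def empirical_rademacher_average(projections, n_draws=2000, seed=0):
    """Monte Carlo estimate of the empirical Rademacher average
    of a finite function class.

    `projections` lists the coordinate projections of the class:
    each entry is the vector (f(x_1), ..., f(x_n)) for one
    function f evaluated on the sample x_1, ..., x_n.
    (Hypothetical helper for illustration only.)
    """
    rng = random.Random(seed)
    n = len(projections[0])
    total = 0.0
    for _ in range(n_draws):
        # draw i.i.d. Rademacher signs sigma_i in {-1, +1}
        sigma = [rng.choice((-1.0, 1.0)) for _ in range(n)]
        # supremum over the (finite) class of the signed empirical mean
        total += max(
            sum(s * v for s, v in zip(sigma, proj)) / n
            for proj in projections
        )
    return total / n_draws
```

In this picture, a "small" class is one for which the estimate above is small; the paper's weaker requirement is only that the random coordinate projections themselves are small with high probability.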

    Original language: English
    Pages (from-to): 219-238
    Number of pages: 20
    Journal: Journal of Machine Learning Research
    Publication status: Published - 1 Mar 2004


