Lower bounds for the empirical minimization algorithm

Shahar Mendelson*

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    35 Citations (Scopus)

    Abstract

    In this correspondence, we present a simple argument proving that, under mild geometric assumptions on the class F and the set of target functions T, the empirical minimization algorithm cannot yield a uniform error rate faster than 1/√k in the function learning setup. This result holds for various loss functionals, and the target functions in T that cause the slow uniform error rate are exhibited explicitly.
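
    To make the object of study concrete, the following is a minimal illustrative sketch of empirical minimization over a finite class of threshold functions. The class, the loss, and all names here are hypothetical stand-ins chosen for illustration; they are not the class F, the target set T, or the construction analyzed in the paper.

    ```python
    import random

    # Hypothetical finite class F: threshold functions on [0, 1].
    # This is an illustrative stand-in, not the class studied in the paper.
    def make_class(thresholds):
        return [lambda x, t=t: 1.0 if x >= t else 0.0 for t in thresholds]

    def empirical_minimizer(F, sample, loss=lambda y_hat, y: (y_hat - y) ** 2):
        """Return an f in F with the smallest empirical loss on the sample."""
        def emp_loss(f):
            return sum(loss(f(x), y) for x, y in sample) / len(sample)
        return min(F, key=emp_loss)

    random.seed(0)
    # Hypothetical target: a threshold at 0.5, contained in the class below.
    target = lambda x: 1.0 if x >= 0.5 else 0.0
    k = 200  # sample size; the paper's lower bound is of order 1/sqrt(k)
    sample = [(x, target(x)) for x in (random.random() for _ in range(k))]
    F = make_class([i / 10 for i in range(11)])
    f_hat = empirical_minimizer(F, sample)
    ```

    Since the target itself lies in this toy class, the empirical minimizer attains zero empirical loss here; the paper's point is about how slowly such a minimizer's error can decay uniformly over the targets in T.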

    Original language: English
    Pages (from-to): 3797-3803
    Number of pages: 7
    Journal: IEEE Transactions on Information Theory
    Volume: 54
    Issue number: 8
    DOIs
    Publication status: Published - Aug 2008

