The entropy regularization information criterion

Alex J. Smola, John Shawe-Taylor, Bernhard Schölkopf, Robert C. Williamson

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    1 Citation (Scopus)

    Abstract

    Effective methods of capacity control via uniform convergence bounds for function expansions have been largely limited to Support Vector machines, where good bounds are obtainable by the entropy number approach. We extend these methods to systems with expansions in terms of arbitrary (parametrized) basis functions and to a wide range of regularization methods, covering general linear additive models. This is achieved by a data-dependent analysis of the eigenvalues of the corresponding design matrix.
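
    The quantity the abstract points to is the spectrum of the design matrix formed by evaluating the chosen basis functions on the training data. As a minimal illustrative sketch (not the paper's construction: the Gaussian basis, its width, and the random sample below are assumptions made purely for illustration), the following NumPy snippet computes such a data-dependent eigenvalue spectrum.

```python
import numpy as np

def design_matrix(x, centers, width=1.0):
    # Phi[i, j] = phi_j(x_i) for Gaussian basis functions centred at centers[j]
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

rng = np.random.default_rng(0)
x = rng.uniform(-3.0, 3.0, size=50)     # hypothetical sample points
centers = np.linspace(-3.0, 3.0, 10)    # parameters of the basis functions

Phi = design_matrix(x, centers)         # design matrix, shape (50, 10)

# Eigenvalues of Phi^T Phi (the squared singular values of Phi); their decay
# is the kind of data-dependent spectrum a capacity analysis would examine.
eigvals = np.sort(np.linalg.eigvalsh(Phi.T @ Phi))[::-1]
print(eigvals)
```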

    Original language: English
    Title of host publication: Advances in Neural Information Processing Systems 12 - Proceedings of the 1999 Conference, NIPS 1999
    Publisher: Neural Information Processing Systems Foundation
    Pages: 342-348
    Number of pages: 7
    ISBN (Print): 0262194503, 9780262194501
    Publication status: Published - 2000
    Event: 13th Annual Neural Information Processing Systems Conference, NIPS 1999 - Denver, CO, United States
    Duration: 29 Nov 1999 - 4 Dec 1999

    Publication series

    Name: Advances in Neural Information Processing Systems
    ISSN (Print): 1049-5258

    Conference

    Conference: 13th Annual Neural Information Processing Systems Conference, NIPS 1999
    Country/Territory: United States
    City: Denver, CO
    Period: 29/11/99 - 4/12/99
