
From margin to sparsity

Thore Graepel, Ralf Herbrich, Robert C. Williamson

    Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › peer-review

    32 Citations (Scopus)

    Abstract

    We present an improvement of Novikoff's perceptron convergence theorem. Reinterpreting this mistake bound as a margin dependent sparsity guarantee allows us to give a PAC-style generalisation error bound for the classifier learned by the perceptron learning algorithm. The bound value crucially depends on the margin a support vector machine would achieve on the same data set using the same kernel. Ironically, the bound yields better guarantees than are currently available for the support vector solution itself.
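The abstract refers to Novikoff's theorem, which bounds the number of mistakes (and hence updates) the perceptron makes on separable data in terms of the margin; each update adds one training point to the solution, which is the sparsity the paper exploits. As a hypothetical minimal sketch (not the paper's code), the primal perceptron with its mistake counter looks like this:

```python
import numpy as np

def perceptron(X, y, epochs=100):
    """Classical perceptron. Returns the weight vector and the number of
    mistakes/updates made -- the quantity Novikoff's theorem bounds via the
    margin, and which equals the number of training points entering the
    solution (its sparsity)."""
    w = np.zeros(X.shape[1])
    mistakes = 0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * np.dot(w, xi) <= 0:   # misclassified (or on the boundary)
                w += yi * xi              # perceptron update: add the point
                mistakes += 1
                errors += 1
        if errors == 0:                   # converged: all points separated
            break
    return w, mistakes

# Toy linearly separable data, with a constant bias feature appended.
X = np.array([[1.0, 2.0, 1.0], [2.0, 3.0, 1.0],
              [-1.0, -1.5, 1.0], [-2.0, -1.0, 1.0]])
y = np.array([1, 1, -1, -1])
w, k = perceptron(X, y)
```

In the kernelised form used in the paper's setting, `w` is never formed explicitly; the classifier is a sum over only the `k` mistake points, so a small mistake bound directly yields a sparse kernel expansion.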

    Original language: English
    Title of host publication: Advances in Neural Information Processing Systems 13 - Proceedings of the 2000 Conference, NIPS 2000
    Publisher: Neural Information Processing Systems Foundation
    ISBN (Print): 0262122413, 9780262122412
    Publication status: Published - 2001
    Event: 14th Annual Neural Information Processing Systems Conference, NIPS 2000 - Denver, CO, United States
    Duration: 27 Nov 2000 – 2 Dec 2000

    Publication series

    Name: Advances in Neural Information Processing Systems
    ISSN (Print): 1049-5258

    Conference

    Conference: 14th Annual Neural Information Processing Systems Conference, NIPS 2000
    Country/Territory: United States
    City: Denver, CO
    Period: 27/11/00 – 2/12/00
