Bundle methods for machine learning

Alexander Smola, S Vishwanathan, Quoc Le

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    Abstract

    We present a globally convergent method for regularized risk minimization problems. Our method applies to Support Vector estimation, regression, Gaussian Processes, and any other regularized risk minimization setting which leads to a convex optimization problem. SVMPerf can be shown to be a special case of our approach. In addition to the unified framework we present tight convergence bounds, which show that our algorithm converges in O(1/ε) steps to ε precision for general convex problems and in O(log(1/ε)) steps for continuously differentiable problems. We demonstrate in experiments the performance of our approach.
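
    The abstract describes the bundle-method (cutting-plane) idea only at a high level. The following is a minimal sketch of that style of solver for regularized risk minimization, assuming a hinge-loss empirical risk on synthetic data and scipy's SLSQP routine for the small dual QP solved at each iteration; these choices are illustrative assumptions, not the authors' implementation.

```python
# Minimal bundle-method sketch for  min_w J(w) = (lam/2)*||w||^2 + R_emp(w),
# where R_emp is convex. Illustrative only; the risk, data, and QP solver
# are assumptions, not the paper's implementation.
import numpy as np
from scipy.optimize import minimize

def bundle_method(risk_and_grad, dim, lam=0.1, eps=1e-3, max_iter=100):
    """risk_and_grad(w) -> (R_emp(w), a subgradient of R_emp at w)."""
    w = np.zeros(dim)
    A, b = [], []  # cutting planes: R_emp(v) >= a_i . v + b_i for all v
    for t in range(1, max_iter + 1):
        r, g = risk_and_grad(w)
        A.append(g)
        b.append(r - g @ w)
        At, bt = np.array(A), np.array(b)

        # Dual of  min_w lam/2 ||w||^2 + max_i (a_i . w + b_i):
        #   max_{alpha in simplex}  -1/(2 lam) ||At.T alpha||^2 + bt . alpha
        def neg_dual(alpha):
            v = At.T @ alpha
            return (v @ v) / (2 * lam) - bt @ alpha

        a0 = np.ones(t) / t
        cons = ({'type': 'eq', 'fun': lambda a: a.sum() - 1.0},)
        res = minimize(neg_dual, a0, bounds=[(0, 1)] * t,
                       constraints=cons, method='SLSQP')
        w = -(At.T @ res.x) / lam

        # Gap between the true objective and the cutting-plane lower bound.
        lower = lam / 2 * w @ w + float(np.max(At @ w + bt))
        upper = lam / 2 * w @ w + risk_and_grad(w)[0]
        if upper - lower <= eps:
            break
    return w

# Usage example: average hinge loss (linear SVM risk) on toy data.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = np.sign(X @ np.array([1., -2., 0.5, 0., 3.]) + 0.1 * rng.normal(size=200))

def hinge_risk(w):
    margins = 1 - y * (X @ w)
    active = margins > 0
    risk = margins[active].sum() / len(y)
    grad = -(y[active, None] * X[active]).sum(axis=0) / len(y)
    return risk, grad

w_star = bundle_method(hinge_risk, dim=5)
```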
    Original language: English
    Title of host publication: Advances in Neural Information Processing Systems 20: Proceedings of the 2007 Conference
    Editors: John C. Platt, Daphne Koller, Yoram Singer and Sam Roweis
    Place of Publication: Vancouver, Canada
    Publisher: MIT Press
    Pages: 1377-1384
    Edition: Peer Reviewed
    ISBN (Print): 9781605603520
    Publication status: Published - 2009
    Event: Conference on Advances in Neural Information Processing Systems (NIPS 2007) - Vancouver, Canada
    Duration: 1 Jan 2009 → …
    http://books.nips.cc/nips20.html

    Conference

    Conference: Conference on Advances in Neural Information Processing Systems (NIPS 2007)
    Period: 1/01/09 → …
    Other: December 3-6, 2007
