Step size-adapted online support vector learning

Alexandros Karatzoglou*, S. V.N. Vishwanathan, Nicol N. Schraudolph, Alex J. Smola

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    1 Citation (Scopus)

    Abstract

    We present an online Support Vector Machine (SVM) that uses Stochastic Meta-Descent (SMD) to adapt its step size automatically. We formulate the online learning problem as a stochastic gradient descent in Reproducing Kernel Hilbert Space (RKHS) and translate SMD to the nonparametric setting, where its gradient trace parameter is no longer a coefficient vector but an element of the RKHS. We derive efficient updates that allow us to perform the step size adaptation in linear time. We apply the online SVM framework to a variety of loss functions and in particular show how to achieve efficient online multiclass classification. Experimental evidence suggests that our algorithm outperforms existing methods.
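    The core idea of SMD is to maintain, alongside the weights, a gradient trace (the sensitivity of the weights to the log step size) and to multiplicatively adapt each step size according to how well the current gradient correlates with past parameter changes. The following is a minimal illustrative sketch of that principle in a simple parametric setting (online linear regression with squared loss), not the paper's kernelized RKHS formulation; the function name and hyperparameter values are illustrative choices, not taken from the paper.

    ```python
    import numpy as np

    def smd_online_regression(X, y, eta0=0.1, mu=0.05, lam=0.99):
        """Online linear regression with per-parameter step sizes
        adapted by Stochastic Meta-Descent (SMD). Illustrative sketch."""
        n, d = X.shape
        w = np.zeros(d)          # weight vector
        eta = np.full(d, eta0)   # per-parameter step sizes
        v = np.zeros(d)          # gradient trace: sensitivity dw/d(log eta)
        losses = []
        for x, target in zip(X, y):
            err = w @ x - target           # residual of squared loss
            g = err * x                    # gradient of 0.5 * err**2 w.r.t. w
            losses.append(0.5 * err**2)
            # meta-step: grow eta where g anti-correlates with the trace
            # (g * v < 0 means successive gradients point the same way);
            # the max(0.5, ...) safeguard bounds how fast eta can shrink
            eta *= np.maximum(0.5, 1.0 - mu * g * v)
            # Hessian-vector product; for squared loss H = x x^T, so Hv = (x.v) x
            Hv = (x @ v) * x
            # update the gradient trace with decay lam, then the weights
            v = lam * v - eta * (g + lam * Hv)
            w = w - eta * g
        return w, losses
    ```

    In the paper's nonparametric setting the trace v is itself an element of the RKHS rather than a coefficient vector, and the updates are arranged so that the adaptation still runs in linear time per step.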

    Original language: English
    Title of host publication: Proceedings - 8th International Symposium on Signal Processing and its Applications, ISSPA 2005
    Pages: 823-826
    Number of pages: 4
    DOIs
    Publication status: Published - 2005
    Event: 8th International Symposium on Signal Processing and its Applications, ISSPA 2005 - Sydney, Australia
    Duration: 28 Aug 2005 - 31 Aug 2005

    Publication series

    Name: Proceedings - 8th International Symposium on Signal Processing and its Applications, ISSPA 2005
    Volume: 2

    Conference

    Conference: 8th International Symposium on Signal Processing and its Applications, ISSPA 2005
    Country/Territory: Australia
    City: Sydney
    Period: 28/08/05 - 31/08/05
