Implicit online learning with kernels

Li Cheng*, S. V. N. Vishwanathan, Dale Schuurmans, Shaojun Wang, Terry Caelli

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

50 Citations (Scopus)

Abstract

We present two new algorithms for online learning in reproducing kernel Hilbert spaces. Our first algorithm, ILK (implicit online learning with kernels), employs a new, implicit update technique that can be applied to a wide variety of convex loss functions. We then introduce a bounded-memory version, SILK (sparse ILK), that maintains a compact representation of the predictor without compromising solution quality, even in non-stationary environments. We prove loss bounds and analyze the convergence rate of both algorithms. Experimental evidence shows that our proposed algorithms outperform current methods on synthetic and real data.
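To make the abstract's implicit update concrete, here is a minimal sketch of an implicit (proximal) online kernel update, not the paper's implementation. It assumes the squared loss, where the implicit step has a closed form, and an RBF kernel; the class name ImplicitOnlineKernelRegressor, the step size eta, and the smallest-coefficient eviction rule standing in for SILK's budget maintenance are all illustrative assumptions.

```python
import numpy as np

def rbf_kernel(x, z, gamma=1.0):
    """Gaussian RBF kernel k(x, z) = exp(-gamma * ||x - z||^2)."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

class ImplicitOnlineKernelRegressor:
    """Implicit online update in an RKHS for the squared loss.

    Each step solves
        f_{t+1} = argmin_f 1/2 ||f - f_t||_H^2 + eta/2 (f(x_t) - y_t)^2,
    whose closed form is
        f_{t+1} = f_t + alpha_t k(x_t, .),
        alpha_t = eta (y_t - f_t(x_t)) / (1 + eta k(x_t, x_t)).
    An optional budget caps the number of stored support vectors
    (a SILK-style sparsification; the eviction rule is illustrative).
    """

    def __init__(self, eta=0.5, gamma=1.0, budget=None):
        self.eta, self.gamma, self.budget = eta, gamma, budget
        self.support = []   # stored inputs x_i
        self.alphas = []    # expansion coefficients alpha_i

    def predict(self, x):
        return sum(a * rbf_kernel(s, x, self.gamma)
                   for s, a in zip(self.support, self.alphas))

    def step(self, x, y):
        # Implicit (proximal) step: closed form for the squared loss.
        kxx = rbf_kernel(x, x, self.gamma)  # equals 1 for the RBF kernel
        alpha = self.eta * (y - self.predict(x)) / (1.0 + self.eta * kxx)
        self.support.append(x)
        self.alphas.append(alpha)
        # Bounded memory: evict the smallest-magnitude coefficient
        # (an assumed rule, not necessarily the paper's SILK criterion).
        if self.budget is not None and len(self.support) > self.budget:
            j = int(np.argmin(np.abs(self.alphas)))
            del self.support[j], self.alphas[j]

# Usage: learn y = sin(x) from a stream of noisy observations.
rng = np.random.default_rng(0)
model = ImplicitOnlineKernelRegressor(eta=0.5, gamma=0.5, budget=100)
for _ in range(500):
    x = rng.uniform(-3, 3, size=1)
    y = np.sin(x[0]) + 0.1 * rng.standard_normal()
    model.step(x, y)
print(model.predict(np.array([1.0])))  # roughly sin(1) ≈ 0.84
```

The implicit form divides the step by 1 + eta k(x_t, x_t) rather than taking a raw gradient step, which keeps the update stable even for large eta; this stability is the usual motivation for implicit over explicit online updates.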

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 19 - Proceedings of the 2006 Conference
Pages: 249-256
Number of pages: 8
Publication status: Published - 2007
Externally published: Yes
Event: 20th Annual Conference on Neural Information Processing Systems, NIPS 2006 - Vancouver, BC, Canada
Duration: 4 Dec 2006 - 7 Dec 2006

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Conference

Conference: 20th Annual Conference on Neural Information Processing Systems, NIPS 2006
Country/Territory: Canada
City: Vancouver, BC
Period: 4/12/06 - 7/12/06
