Leaving the Span

Manfred K. Warmuth*, S. V. N. Vishwanathan

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

13 Citations (Scopus)

Abstract

We discuss a simple sparse linear problem that is hard to learn with any algorithm that uses a linear combination of the training instances as its weight vector. The hardness holds even if we allow the learner to embed the instances into any higher-dimensional feature space (and use a kernel function to define the dot product between the embedded instances). These algorithms are inherently limited by the fact that after seeing k instances only a weight space of dimension k can be spanned. Our hardness result is surprising because the same problem can be efficiently learned using the exponentiated gradient (EG) algorithm: now the component-wise logarithms of the weights are essentially a linear combination of the training instances. This algorithm enforces additional constraints on the weights (all must be non-negative and sum to one), and in some cases these constraints alone force the rank of the weight space to grow as fast as 2^k.
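The contrast in the abstract hinges on the EG update keeping the weights on the probability simplex while moving their logarithms linearly in the instances. The following is a minimal sketch of the standard exponentiated gradient update for squared loss (in the style of Kivinen and Warmuth's EG algorithm, not code from this paper; the learning rate and the toy target are illustrative assumptions):

    import numpy as np

    # Sketch of the exponentiated gradient (EG) update for a linear predictor
    # under squared loss. The weights stay non-negative and sum to one.
    def eg_update(w, x, y, eta=0.1):
        # eta is an illustrative learning rate, not a value from the paper
        y_hat = w @ x
        grad = (y_hat - y) * x            # gradient of (y_hat - y)^2 / 2 w.r.t. w
        w_new = w * np.exp(-eta * grad)   # multiplicative step: log-weights move
                                          # by a multiple of the instance x
        return w_new / w_new.sum()        # renormalize onto the simplex

    # Toy run: the target is the first coordinate of a 5-dimensional instance
    rng = np.random.default_rng(0)
    w = np.full(5, 1 / 5)
    for _ in range(20):
        x = rng.normal(size=5)
        w = eg_update(w, x, y=x[0])
    print(w)                              # weight mass drifts toward coordinate 0

Because the update is multiplicative, log w remains (up to the shared normalization constant) a linear combination of the instances seen so far; the non-negativity and normalization constraints are what allow the effective weight space to outgrow the span of those instances.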

Original language: English
Title of host publication: Learning Theory - 18th Annual Conference on Learning Theory, COLT 2005, Proceedings
Publisher: Springer Verlag
Pages: 366-381
Number of pages: 16
ISBN (Print): 3540265562, 9783540265566
DOIs
Publication status: Published - 2005
Externally published: Yes
Event: 18th Annual Conference on Learning Theory, COLT 2005 - Bertinoro, Italy
Duration: 27 Jun 2005 - 30 Jun 2005

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 3559 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 18th Annual Conference on Learning Theory, COLT 2005
Country/Territory: Italy
City: Bertinoro
Period: 27/06/05 - 30/06/05
