Abstract
Incorporating invariances into a learning algorithm is a common problem in machine learning. We provide a convex formulation which can deal with arbitrary loss functions and arbitrary invariances. In addition, it is a drop-in replacement for most kernel optimization algorithms, including solvers of the SVMStruct family. The advantage of our setting is that it relies on column generation rather than on modifying the underlying optimization problem directly.
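The column-generation idea mentioned in the abstract can be sketched as follows: rather than rewriting the optimizer to encode invariances, one alternates between finding the transformed example the current model violates most and retraining on the enlarged working set. The linear model, hinge loss, subgradient optimizer, and shift transformations below are illustrative assumptions for the sketch, not the paper's actual formulation.

```python
import numpy as np

def worst_case_transform(w, x, y, transforms):
    """Among candidate invariance transforms of x, return the copy the
    current model w violates most (largest hinge loss)."""
    losses = [max(0.0, 1.0 - y * (w @ t(x))) for t in transforms]
    return transforms[int(np.argmax(losses))](x)

def column_generation_train(X, y, transforms, rounds=5, lr=0.1,
                            lam=0.01, epochs=200):
    """Hedged sketch of column-generation training: alternate between
    (1) generating the worst-violated transformed examples under the
    current model and (2) retraining on the enlarged working set (here
    via subgradient descent on a regularized hinge loss).  The base
    optimizer is untouched; invariances enter only through the
    generated examples."""
    Xw, yw = list(X), list(y)
    w = np.zeros(X.shape[1])
    for _ in range(rounds):
        for xi, yi in zip(X, y):           # (1) generate new "columns"
            Xw.append(worst_case_transform(w, xi, yi, transforms))
            yw.append(yi)
        A, b = np.array(Xw), np.array(yw)  # (2) retrain on working set
        for _ in range(epochs):
            violated = b * (A @ w) < 1.0
            grad = lam * w - (A[violated].T @ b[violated]) / len(b)
            w -= lr * grad
    return w
```

Because invariances are handled purely by appending examples, the inner solver can in principle be swapped for any other convex trainer, which is the "drop-in replacement" property the abstract refers to.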
Original language | English |
---|---|
Title of host publication | Advances in Neural Information Processing Systems 20: Proceedings of the 2007 Conference |
Editors | Platt, John C., Koller, Daphne, Singer, Yoram and Roweis, Sam |
Place of Publication | Vancouver, Canada |
Publisher | MIT Press |
Pages | 1489-1496 |
Edition | Peer Reviewed |
ISBN (Print) | 9781605603520 |
Publication status | Published - 2009 |
Event | Conference on Advances in Neural Information Processing Systems (NIPS 2007), Vancouver, Canada. Duration: 3 Dec 2007 → 6 Dec 2007. http://books.nips.cc/nips20.html |
Conference
Conference | Conference on Advances in Neural Information Processing Systems (NIPS 2007) |
---|---|
Period | 3/12/07 → 6/12/07 |
Other | December 3-6, 2007 |
Internet address | http://books.nips.cc/nips20.html |