Abstract
We propose a new class of support vector algorithms for regression and classification. In these algorithms, a parameter ν lets one effectively control the number of support vectors. While this can be useful in its own right, the parameterization has the additional benefit of enabling us to eliminate one of the other free parameters of the algorithm: the accuracy parameter ε in the regression case, and the regularization constant C in the classification case. We describe the algorithms, give some theoretical results concerning the meaning and the choice of ν, and report experimental results.
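The role of ν can be illustrated with scikit-learn's `NuSVC`, which implements the ν-parameterized classification algorithm described in the abstract. A minimal sketch (the synthetic data and parameter values are illustrative assumptions, not from the paper): by the paper's theoretical results, ν is a lower bound on the fraction of support vectors, so raising ν should raise that fraction.

```python
# Sketch: the nu parameter in nu-SVM classification, via scikit-learn's NuSVC.
# The dataset and nu values below are illustrative assumptions.
import numpy as np
from sklearn.svm import NuSVC

rng = np.random.RandomState(0)
n = 200
X = rng.randn(n, 2)
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # linearly separable labels with a soft boundary

for nu in (0.1, 0.3, 0.5):
    clf = NuSVC(nu=nu, kernel="rbf", gamma="scale").fit(X, y)
    frac_sv = len(clf.support_) / n
    # Theory: frac_sv >= nu (nu lower-bounds the fraction of support vectors)
    print(f"nu={nu:.1f}  fraction of support vectors={frac_sv:.2f}")
```

Note that `NuSVC` exposes no `C` parameter: as the abstract states, the ν-parameterization eliminates the regularization constant C in the classification case (and the accuracy parameter ε in the regression case, via `NuSVR`).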
| Original language | English |
|---|---|
| Pages (from-to) | 1207-1245 |
| Number of pages | 39 |
| Journal | Neural Computation |
| Volume | 12 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - May 2000 |