Covering numbers for support vector machines

Ying Guo*, Peter L. Bartlett, John Shawe-Taylor, Robert C. Williamson

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

    39 Citations (Scopus)

    Abstract

Support vector (SV) machines are linear classifiers that use the maximum margin hyperplane in a feature space defined by a kernel function. Until recently, the only bounds on the generalization performance of SV machines (within Valiant's probably approximately correct framework) took no account of the kernel used except in its effect on the margin and radius. More recently, it has been shown that one can bound the relevant covering numbers using tools from functional analysis. In this paper, we show that the resulting bound can be greatly simplified. The new bound involves the eigenvalues of the integral operator induced by the kernel. It shows that the effective dimension depends on the rate of decay of these eigenvalues. We present an explicit calculation of covering numbers for an SV machine using a Gaussian kernel, obtaining a bound significantly better than that implied by previous results.
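The eigenvalue decay the abstract refers to can be illustrated numerically. The sketch below (not from the paper) uses the standard Nyström approximation: the eigenvalues of the Gram matrix on n sampled points, scaled by 1/n, approximate the eigenvalues of the integral operator induced by the kernel. The sample distribution (uniform on [0, 1]), sample size, and bandwidth sigma are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, sigma = 500, 1.0

# Sample points uniformly on [0, 1]; any distribution on the domain would do.
x = rng.uniform(0.0, 1.0, size=n)

# Gaussian (RBF) kernel matrix K[i, j] = exp(-(x_i - x_j)^2 / (2 sigma^2)).
diff = x[:, None] - x[None, :]
K = np.exp(-diff**2 / (2.0 * sigma**2))

# Nystrom approximation: the eigenvalues of K / n converge to the eigenvalues
# of the integral operator (T f)(s) = E[k(s, X) f(X)] as n grows.
eigs = np.linalg.eigvalsh(K / n)[::-1]  # sort in descending order

for i, lam in enumerate(eigs[:10]):
    print(f"lambda_{i + 1} ~ {lam:.3e}")
```

Running this shows the eigenvalues falling off very rapidly, which is the behavior that makes the effective dimension of a Gaussian-kernel SV machine small despite the infinite-dimensional feature space.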

Original language: English
Pages (from-to): 239-250
Number of pages: 12
Journal: IEEE Transactions on Information Theory
Volume: 48
Issue number: 1
DOI:
Publication status: Published - Jan 2002
