Efficient cross-validation for kernelized least-squares regression with sparse basis expansions

Tapio Pahikkala*, Hanna Suominen, Jorma Boberg

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    14 Citations (Scopus)

    Abstract

    We propose an efficient algorithm for calculating hold-out and cross-validation (CV) estimates for sparse regularized least-squares predictors. Holding out H data points with our method requires O(min(H²n, Hn²)) time, provided that a predictor with n basis vectors is already trained. In addition to holding out training examples, some of the basis vectors used to train the sparse regularized least-squares predictor on the whole training set can also be removed from the basis vector set used in the hold-out computation. In our experiments, we demonstrate the speed improvements provided by our algorithm in practice, and we empirically show the benefits of removing some of the basis vectors during the CV rounds.
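    To make the idea concrete, below is a minimal NumPy sketch of the standard blockwise hold-out identity for dense kernel regularized least-squares, the kind of shortcut this line of work builds on. It is an illustration only: the paper's contribution, the sparse-basis-expansion case with basis-vector removal, is not reproduced here, and the Gaussian kernel, the regularization parameter lam, and the hold-out index set H are illustrative assumptions.

    import numpy as np

    # Illustrative sketch (not the paper's sparse-basis algorithm):
    # blockwise hold-out predictions for dense kernel regularized
    # least-squares, obtained without retraining.

    rng = np.random.default_rng(0)
    n, lam = 60, 0.1                    # assumed problem size and regularizer
    X = rng.standard_normal((n, 2))
    y = rng.standard_normal(n)

    # Gaussian kernel matrix (an assumed example kernel).
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq)

    # Smoother ("hat") matrix of kernel RLS: G = K (K + lam I)^{-1}.
    G = K @ np.linalg.inv(K + lam * np.eye(n))
    f = G @ y                           # fitted values on the training set

    H = np.array([3, 7, 19])            # indices of the held-out points
    GHH = G[np.ix_(H, H)]

    # Hold-out predictions via the blockwise identity:
    #   z = (I - G_HH)^{-1} ( (G y)_H - G_HH y_H )
    z = np.linalg.solve(np.eye(len(H)) - GHH, f[H] - GHH @ y[H])

    # Brute-force check: retrain with the hold-out points removed.
    mask = np.ones(n, dtype=bool)
    mask[H] = False
    a = np.linalg.solve(K[np.ix_(mask, mask)] + lam * np.eye(mask.sum()),
                        y[mask])
    z_ref = K[np.ix_(H, mask)] @ a

    print(np.allclose(z, z_ref))        # True: identity matches retraining

    Once G is available, each hold-out round costs only the solve of an H×H system plus the blocks of G it touches, which is what makes complexities like the O(min(H²n, Hn²)) bound above attainable in the sparse setting.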

    Original language: English
    Pages (from-to): 381-407
    Number of pages: 27
    Journal: Machine Learning
    Volume: 87
    Issue number: 3
    DOIs
    Publication status: Published - Jun 2012
