Decision-theoretic sparsification for Gaussian process preference learning

M. Ehsan Abbasnejad, Edwin V. Bonilla, Scott Sanner

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    Abstract

    We propose a decision-theoretic sparsification method for Gaussian process preference learning. This method overcomes the loss-insensitive nature of popular sparsification approaches such as the Informative Vector Machine (IVM). Instead of selecting a subset of users and items as inducing points based on uncertainty-reduction principles, our sparsification approach is underpinned by decision theory and directly incorporates the loss function inherent to the underlying preference learning problem. We show that by selecting different specifications of the loss function, the IVM's differential entropy criterion, a value of information criterion, and an upper confidence bound (UCB) criterion used in the bandit setting can all be recovered from our decision-theoretic framework. We refer to our method as the Valuable Vector Machine (VVM) as it selects the most useful items during sparsification to minimize the corresponding loss. We evaluate our approach on one synthetic and two real-world preference datasets, including one generated via Amazon Mechanical Turk and another collected from Facebook. Experiments show that variants of the VVM significantly outperform the IVM on all datasets under similar computational constraints.
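The abstract notes that different loss specifications recover the IVM's differential entropy criterion and a UCB criterion as special cases of one greedy selection framework. The following is a minimal, hypothetical sketch of that idea (not the authors' implementation): a greedy inducing-point selector over a toy GP with an RBF kernel, where only the scoring rule changes between an entropy-style and a UCB-style criterion. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, lengthscale=1.0):
    # Squared-exponential kernel between the rows of X and Y (unit variance).
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def greedy_select(X, y, k, criterion, noise=0.1, beta=2.0):
    """Greedily pick k inducing points from X, scoring every remaining
    candidate under the current GP posterior with the chosen criterion.
    This is an illustrative sketch, not the VVM algorithm itself."""
    chosen = []
    for _ in range(k):
        if chosen:
            # Posterior mean and variance given the points selected so far.
            Kss = rbf_kernel(X[chosen], X[chosen]) + noise * np.eye(len(chosen))
            Ks = rbf_kernel(X, X[chosen])
            A = np.linalg.solve(Kss, Ks.T).T
            mu = A @ y[chosen]
            var = 1.0 - (A * Ks).sum(1)  # prior variance of this kernel is 1
        else:
            mu = np.zeros(len(X))
            var = np.ones(len(X))
        var = np.maximum(var, 1e-12)
        if criterion == "entropy":
            # IVM-style: pick the most uncertain point (max differential entropy).
            score = 0.5 * np.log(2.0 * np.pi * np.e * var)
        elif criterion == "ucb":
            # Bandit-style: mean plus a confidence bonus.
            score = mu + beta * np.sqrt(var)
        else:
            raise ValueError(criterion)
        score[chosen] = -np.inf  # never re-select a point
        chosen.append(int(np.argmax(score)))
    return chosen

# Toy usage: select 5 of 50 random points under each criterion.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
idx_entropy = greedy_select(X, y, 5, "entropy")
idx_ucb = greedy_select(X, y, 5, "ucb")
```

Swapping the scoring rule while keeping the greedy loop fixed mirrors the paper's framing: the selection criterion is a consequence of the loss function, not a separate algorithm.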

    Original language: English
    Title of host publication: Machine Learning and Knowledge Discovery in Databases - European Conference, ECML PKDD 2013, Proceedings
    Pages: 515-530
    Number of pages: 16
    Edition: PART 2
    DOIs
    Publication status: Published - 2013
    Event: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2013 - Prague, Czech Republic
    Duration: 23 Sept 2013 - 27 Sept 2013

    Publication series

    Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
    Number: PART 2
    Volume: 8189 LNAI
    ISSN (Print): 0302-9743
    ISSN (Electronic): 1611-3349

    Conference

    Conference: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2013
    Country/Territory: Czech Republic
    City: Prague
    Period: 23/09/13 - 27/09/13
