Efficient variational inference for Gaussian process regression networks

Trung V. Nguyen, Edwin V. Bonilla

    Research output: Contribution to journal › Conference article › peer-review

    17 Citations (Scopus)

    Abstract

    In multi-output regression applications the correlations between the response variables may vary with the input space and can be highly non-linear. Gaussian process regression networks (GPRNs) are flexible and effective models for representing such complex adaptive output dependencies. However, inference in GPRNs is intractable. In this paper we propose two efficient variational inference methods for GPRNs. The first method, gprn-mf, adopts a mean-field approach with full Gaussians over the GPRN's parameters as its factorizing distributions. The second method, gprn-npv, uses a nonparametric variational inference approach. We derive analytical forms of the evidence lower bound for both methods, which we use to learn the variational parameters and the hyperparameters of the GPRN model. We obtain closed-form updates for the parameters of gprn-mf and show that, despite having relatively complex approximate posterior distributions, our approximate methods require the estimation of only O(N) variational parameters rather than the O(N²) needed for the parameters' covariances. Our experiments on real data sets show that gprn-npv may give a better approximation to the posterior distribution than gprn-mf, in terms of both predictive performance and stability.
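    The mean-field idea underlying gprn-mf can be illustrated on a toy problem. The sketch below is not the paper's GPRN derivation; it applies the standard coordinate-ascent mean-field updates to a correlated bivariate Gaussian target (a textbook special case), showing the two properties the abstract highlights: each factor is parameterized by only its own mean and variance (O(N) parameters overall), and the factorized approximation recovers the true mean while underestimating the marginal variances.

    ```python
    import numpy as np

    # Toy target: bivariate Gaussian N(mu, Sigma) with correlated components.
    # (Illustrative values; not from the paper.)
    mu = np.array([1.0, -1.0])
    Sigma = np.array([[1.0, 0.8],
                      [0.8, 1.0]])
    Lam = np.linalg.inv(Sigma)  # precision matrix

    # Mean-field approximation q(x) = q1(x1) q2(x2), each factor Gaussian.
    # Standard coordinate-ascent result for a Gaussian target:
    #   q_i = N(m_i, 1 / Lam_ii),
    #   m_i = mu_i - (Lam_ij / Lam_ii) * (m_j - mu_j)   for j != i
    m = np.zeros(2)
    for _ in range(100):          # sweep until converged
        for i in range(2):
            j = 1 - i
            m[i] = mu[i] - Lam[i, j] * (m[j] - mu[j]) / Lam[i, i]
    s2 = 1.0 / np.diag(Lam)       # per-factor variances: O(N) parameters

    print(m)   # converges to the true mean [1, -1]
    print(s2)  # 1/Lam_ii = 0.36 each: tighter than the true marginal variance 1.0
    ```

    The factorized q stores one mean and one variance per variable instead of a full covariance, which is the same O(N)-versus-O(N²) trade-off the abstract describes; the price is the familiar mean-field underestimation of posterior variance.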

    Original language: English
    Pages (from-to): 472-480
    Number of pages: 9
    Journal: Journal of Machine Learning Research
    Volume: 31
    Publication status: Published - 2013
    Event: 16th International Conference on Artificial Intelligence and Statistics, AISTATS 2013 - Scottsdale, United States
    Duration: 29 Apr 2013 - 1 May 2013
