Tighter variational representations of f-divergences via restriction to probability measures

Avraham Ruderman*, Mark D. Reid, Dario García-García, James Petterson

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    17 Citations (Scopus)

    Abstract

    We show that the variational representations for f-divergences currently used in the literature can be tightened. This has implications for a number of recently proposed methods based on this representation. As an example application, we use our tighter representation to derive a general f-divergence estimator based on two i.i.d. samples, along with a dual program for this estimator that performs well empirically. We also point out a connection between our estimator and MMD.
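    For background (this is not part of the original abstract): the variational representation referred to above writes an f-divergence as a supremum over a class of witness functions via the convex conjugate f*; the paper's contribution is to show that this supremum can be restricted, giving a tighter bound. A hedged sketch of the standard form, with the exact restricted class left to the paper itself:

    ```latex
    % Standard variational (Fenchel-dual) representation of an f-divergence,
    % as commonly used in the literature the abstract refers to:
    \[
      D_f(P \,\|\, Q) \;=\; \sup_{g} \;
        \mathbb{E}_{P}\!\left[ g(X) \right]
        \;-\; \mathbb{E}_{Q}\!\left[ f^{*}\!\bigl(g(X)\bigr) \right],
    \]
    % where f^* is the convex conjugate of f and the supremum ranges over a
    % class of measurable functions g. Any fixed g gives a lower bound on
    % D_f(P \| Q), which is what estimators built on this representation
    % exploit; restricting the class over which the supremum is taken (the
    % paper's "restriction to probability measures") can only tighten the
    % resulting bound for estimation purposes.
    ```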

    Original language: English
    Title of host publication: Proceedings of the 29th International Conference on Machine Learning, ICML 2012
    Pages: 671-678
    Number of pages: 8
    Publication status: Published - 2012
    Event: 29th International Conference on Machine Learning, ICML 2012 - Edinburgh, United Kingdom
    Duration: 26 Jun 2012 - 1 Jul 2012

    Publication series

    Name: Proceedings of the 29th International Conference on Machine Learning, ICML 2012
    Volume: 1

