Risk-based generalizations of f-divergences

Darío García-García*, Ulrike Von Luxburg, Raúl Santos-Rodríguez

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    10 Citations (Scopus)

    Abstract

    We derive a generalized notion of f-divergences, called (f,l)-divergences. We show that this generalization enjoys many of the nice properties of f-divergences, although it is a richer family. It also provides alternative definitions of standard divergences in terms of surrogate risks. As a first practical application of this theory, we derive a new estimator for the Kullback-Leibler divergence, which we use for clustering sets of vectors.
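    For context only (this is standard background, not the paper's (f,l)-divergence definition, which is not reproduced on this page): the classical f-divergence between distributions P and Q with densities p and q is

        D_f(P \,\|\, Q) = \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx, \qquad f \text{ convex},\; f(1) = 0,

    and the Kullback-Leibler divergence mentioned in the abstract is the special case f(t) = t \log t, giving

        \mathrm{KL}(P \,\|\, Q) = \int p(x)\, \log \frac{p(x)}{q(x)}\, dx.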

    Original language: English
    Title of host publication: Proceedings of the 28th International Conference on Machine Learning, ICML 2011
    Pages: 417-424
    Number of pages: 8
    Publication status: Published - 2011
    Event: 28th International Conference on Machine Learning, ICML 2011 - Bellevue, WA, United States
    Duration: 28 Jun 2011 - 2 Jul 2011

    Publication series

    Name: Proceedings of the 28th International Conference on Machine Learning, ICML 2011

    Conference

    Conference: 28th International Conference on Machine Learning, ICML 2011
    Country/Territory: United States
    City: Bellevue, WA
    Period: 28/06/11 - 2/07/11
