Entropies and cross-entropies of exponential families

Frank Nielsen*, Richard Nock

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

53 Citations (Scopus)

Abstract

Statistical modeling of images plays a crucial role in modern image processing tasks such as segmentation, object detection and restoration. Although Gaussian distributions are conveniently handled mathematically, natural image statistics have revealed and emphasized the role of many other types of distributions. In this paper, we consider a versatile class of distributions called exponential families that encompasses many well-known distributions, such as the Gaussian, Poisson, multinomial, Gamma/Beta and Dirichlet distributions, to name a few. For those families, we derive mathematical expressions for their Shannon entropy and cross-entropy, give a geometric interpretation, and show that they admit closed-form formulas up to an entropic normalizing constant that depends on the carrier measure but is independent of the member of the family. This allows one to design algorithms that compare entropies and cross-entropies of exponential family distributions exactly, although some of them have, stricto sensu, no known closed form (e.g., Poisson). We discuss maximum entropy and touch upon the entropy of mixtures of exponential families, for which we provide a relative-entropy upper bound.
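As a rough sketch of the closed-form expressions announced in the abstract, written in the standard natural parametrization p_theta(x) = exp(<t(x), theta> - F(theta) + k(x)) with log-normalizer F, sufficient statistic t, carrier term k and dual parameter eta = grad F(theta) (notation assumed here, not defined on this page), the entropy and cross-entropy take the form:

    % Sketch only: standard exponential-family notation assumed.
    \[
      H(p_\theta) = F(\theta) - \langle \theta, \nabla F(\theta) \rangle - \mathbb{E}_{p_\theta}[k(x)]
                  = -F^{*}(\eta) - \mathbb{E}_{p_\theta}[k(x)],
    \]
    \[
      H^{\times}(p_{\theta_1} : p_{\theta_2})
                  = F(\theta_2) - \langle \theta_2, \nabla F(\theta_1) \rangle - \mathbb{E}_{p_{\theta_1}}[k(x)],
    \]
    % where F^{*} is the Legendre conjugate of F; the expectation of the
    % carrier term k is the "entropic normalizing constant" of the abstract.

The Poisson family illustrates one reading of the exact-comparison claim: each individual cross-entropy contains the term E_p[log x!], which has no known closed form, but it depends only on the first (fixed) distribution and therefore cancels when cross-entropies against several candidate models are compared. A minimal Python sketch under these assumptions (the function name and the rate values are hypothetical):

    import math

    def poisson_cross_entropy_part(lam_p, lam_q):
        # Closed-form part of H(p_lam_p : q_lam_q) for Poisson distributions:
        # F(theta_q) - <theta_q, grad F(theta_p)> with theta = log(lambda)
        # and F(theta) = exp(theta).  The omitted term E_p[log x!] depends
        # only on lam_p and cancels when comparing candidates q for a fixed p.
        return lam_q - lam_p * math.log(lam_q)

    lam_p = 3.0                   # fixed reference rate (hypothetical value)
    candidates = [2.0, 3.0, 5.0]  # candidate model rates (hypothetical values)

    # Ranking by cross-entropy is exact even though each value is only known
    # up to the common constant E_p[log x!].
    best = min(candidates, key=lambda lam_q: poisson_cross_entropy_part(lam_p, lam_q))
    print(best)  # 3.0, i.e. the candidate equal to lam_p minimizes H(p : q)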

Original language: English
Title of host publication: 2010 IEEE International Conference on Image Processing, ICIP 2010 - Proceedings
Publisher: IEEE Computer Society
Pages: 3621-3624
Number of pages: 4
ISBN (Print): 9781424479948
DOIs
Publication status: Published - 2010
Externally published: Yes

Publication series

Name: Proceedings - International Conference on Image Processing, ICIP
ISSN (Print): 1522-4880
