TY - GEN

T1 - Jensen-Bregman Voronoi diagrams and centroidal tessellations

AU - Nielsen, Frank

AU - Nock, Richard

PY - 2010

Y1 - 2010

N2 - The Jensen-Bregman divergence is a distortion measure defined by the Jensen difference provided by a strictly convex function. Jensen-Bregman divergences extend the well-known Jensen-Shannon divergence by allowing one to choose an arbitrary convex generator function instead of the standard Shannon entropy. This class of Jensen-Bregman divergences notably includes the squared Euclidean distance. Although Jensen-Bregman divergences are symmetric distances by construction, they are not metrics, as they violate the triangle inequality. We study the geometric properties and combinatorial complexities of both the Voronoi diagrams and the centroidal Voronoi diagrams induced by such a class of information-theoretic divergences. We show that those Jensen-Bregman divergences appear naturally in two contexts: (1) when symmetrizing Bregman divergences, and (2) when computing the Bhattacharyya distances of statistical distributions. The Bhattacharyya distance measures the amount of overlap between distributions, and is widely used in machine learning to provide both lower and upper bounds on the Bayes misclassification error. Since the Bhattacharyya distance of popular distributions in statistics, called the exponential families (including the familiar Gaussian, Poisson, multinomial, and Beta/Gamma families), can be computed equivalently as a Jensen-Bregman divergence, the Jensen-Bregman Voronoi diagrams also allow one to study statistical Voronoi diagrams induced by an entropic function.

AB - The Jensen-Bregman divergence is a distortion measure defined by the Jensen difference provided by a strictly convex function. Jensen-Bregman divergences extend the well-known Jensen-Shannon divergence by allowing one to choose an arbitrary convex generator function instead of the standard Shannon entropy. This class of Jensen-Bregman divergences notably includes the squared Euclidean distance. Although Jensen-Bregman divergences are symmetric distances by construction, they are not metrics, as they violate the triangle inequality. We study the geometric properties and combinatorial complexities of both the Voronoi diagrams and the centroidal Voronoi diagrams induced by such a class of information-theoretic divergences. We show that those Jensen-Bregman divergences appear naturally in two contexts: (1) when symmetrizing Bregman divergences, and (2) when computing the Bhattacharyya distances of statistical distributions. The Bhattacharyya distance measures the amount of overlap between distributions, and is widely used in machine learning to provide both lower and upper bounds on the Bayes misclassification error. Since the Bhattacharyya distance of popular distributions in statistics, called the exponential families (including the familiar Gaussian, Poisson, multinomial, and Beta/Gamma families), can be computed equivalently as a Jensen-Bregman divergence, the Jensen-Bregman Voronoi diagrams also allow one to study statistical Voronoi diagrams induced by an entropic function.

KW - Bhattacharyya distance

KW - Bregman divergences

KW - Burbea-Rao divergence

KW - Exponential families

KW - F-divergence

KW - Hellinger-Kakutani-Matusita divergences

KW - Information geometry

KW - Jensen difference

KW - Jensen's inequality

KW - Jensen-Shannon divergence

KW - α-divergence

UR - http://www.scopus.com/inward/record.url?scp=77955693545&partnerID=8YFLogxK

U2 - 10.1109/ISVD.2010.17

DO - 10.1109/ISVD.2010.17

M3 - Conference contribution

SN - 9780769541129

T3 - ISVD 2010 - 7th International Symposium on Voronoi Diagrams in Science and Engineering

SP - 56

EP - 65

BT - ISVD 2010 - 7th International Symposium on Voronoi Diagrams in Science and Engineering

T2 - 7th International Symposium on Voronoi Diagrams in Science and Engineering, ISVD 2010

Y2 - 28 June 2010 through 30 June 2010

ER -