A dependence maximization view of clustering

Le Song*, Alex Smola, Arthur Gretton, Karsten M. Borgwardt

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

92 Citations (Scopus)

Abstract

We propose a family of clustering algorithms based on the maximization of dependence between the input variables and their cluster labels, as expressed by the Hilbert-Schmidt Independence Criterion (HSIC). Under this framework, we unify the geometric, spectral, and statistical dependence views of clustering, and subsume many existing algorithms as special cases (e.g. k-means and spectral clustering). Distinctive to our framework is that kernels can also be applied on the labels, which can endow them with particular structures. We also obtain a perturbation bound on the change in k-means clustering.
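As an illustration of the framework the abstract describes, the sketch below maximizes an HSIC-style objective tr(HKH·L) over cluster assignments by greedy coordinate ascent. It is not the authors' implementation: the Gaussian input kernel, the normalized label kernel, the function names, and the greedy update rule are assumptions chosen for concreteness (with a linear input kernel and this label kernel, the objective reduces to the k-means case mentioned in the abstract).

import numpy as np

def hsic_objective(Kc, labels, n_clusters):
    # tr(H K H . L) with a normalized label kernel: L[i, j] = 1/|c| when
    # i and j share cluster c, else 0. With a linear kernel on the inputs
    # this recovers the k-means objective (a special case noted above).
    obj = 0.0
    for c in range(n_clusters):
        idx = np.where(labels == c)[0]
        if idx.size:
            obj += Kc[np.ix_(idx, idx)].sum() / idx.size
    return obj

def cluhsic_greedy(X, n_clusters, gamma=1.0, n_iter=50, seed=0):
    # Greedy coordinate ascent: reassign one point at a time to whichever
    # cluster most increases the HSIC objective, until no label changes.
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)                       # Gaussian kernel on inputs
    H = np.eye(n) - np.full((n, n), 1.0 / n)      # centering matrix
    Kc = H @ K @ H
    labels = rng.integers(0, n_clusters, size=n)  # random initial labels
    for _ in range(n_iter):
        changed = False
        for i in range(n):
            old = labels[i]
            best_c, best_val = old, -np.inf
            for c in range(n_clusters):
                labels[i] = c
                val = hsic_objective(Kc, labels, n_clusters)
                if val > best_val:
                    best_val, best_c = val, c
            labels[i] = best_c
            changed |= best_c != old
        if not changed:
            break
    return labels

# Example: two well-separated Gaussian blobs should come out as two clusters.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
    print(cluhsic_greedy(X, n_clusters=2))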

Original language: English
Pages: 815-822
Number of pages: 8
DOIs
Publication status: Published - 2007
Externally published: Yes
Event: 24th International Conference on Machine Learning, ICML 2007 - Corvallis, OR, United States
Duration: 20 Jun 2007 - 24 Jun 2007

Conference

Conference: 24th International Conference on Machine Learning, ICML 2007
Country/Territory: United States
City: Corvallis, OR
Period: 20/06/07 - 24/06/07
