Supervised feature selection via dependence estimation

Le Song*, Alex Smola, Arthur Gretton, Karsten M. Borgwardt, Justin Bedo

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

291 Citations (Scopus)

Abstract

We introduce a framework for filtering features that employs the Hilbert-Schmidt Independence Criterion (HSIC) as a measure of dependence between the features and the labels. The key idea is that good features should maximise such dependence. Feature selection for various supervised learning problems (including classification and regression) is unified under this framework, and the solutions can be approximated using a backward-elimination algorithm. We demonstrate the usefulness of our method on both artificial and real-world datasets.
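The abstract describes HSIC-scored backward elimination. Below is a minimal sketch of that idea, not the authors' reference implementation: it assumes Gaussian RBF kernels on both features and labels and the standard biased empirical HSIC estimator tr(KHLH)/(m-1)^2; names such as `backward_elimination` and the parameter `n_keep` are illustrative.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gram matrix of a Gaussian RBF kernel over the rows of X.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(K, L):
    # Biased empirical HSIC estimate: tr(K H L H) / (m - 1)^2,
    # where H = I - (1/m) 11^T centres the Gram matrices.
    m = K.shape[0]
    H = np.eye(m) - np.ones((m, m)) / m
    return np.trace(K @ H @ L @ H) / (m - 1) ** 2

def backward_elimination(X, y, n_keep):
    # Greedily drop, at each step, the feature whose removal leaves
    # the highest HSIC between the remaining features and the labels.
    L = rbf_kernel(y.reshape(-1, 1))          # kernel on the labels
    selected = list(range(X.shape[1]))
    while len(selected) > n_keep:
        scores = []
        for j in selected:
            rest = [k for k in selected if k != j]
            scores.append(hsic(rbf_kernel(X[:, rest]), L))
        # remove the feature whose absence maximises dependence
        selected.pop(int(np.argmax(scores)))
    return selected
```

Because different choices of the label kernel recover different supervised problems (e.g. a delta kernel for classification, an RBF kernel for regression), the same elimination loop applies unchanged across tasks, which is the unification the abstract refers to.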

Original language: English
Pages: 823-830
Number of pages: 8
DOIs
Publication status: Published - 2007
Externally published: Yes
Event: 24th International Conference on Machine Learning, ICML 2007 - Corvalis, OR, United States
Duration: 20 Jun 2007 – 24 Jun 2007

Conference

Conference: 24th International Conference on Machine Learning, ICML 2007
Country/Territory: United States
City: Corvalis, OR
Period: 20/06/07 – 24/06/07

