Kernel measures of independence for non-iid data

Xinhua Zhang*, Le Song, Arthur Gretton, Alex Smola

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    19 Citations (Scopus)

    Abstract

    Many machine learning algorithms can be formulated in the framework of statistical independence, such as the Hilbert-Schmidt Independence Criterion (HSIC). In this paper, we extend this criterion to deal with structured and interdependent observations. This is achieved by modeling the structures using undirected graphical models and comparing the Hilbert space embeddings of distributions. We apply this new criterion to independent component analysis and sequence clustering.
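    The record above does not include code, so the following is a minimal illustrative sketch of the standard biased HSIC estimator for i.i.d. samples (trace(KHLH)/m^2 with centred Gaussian kernel matrices), which the paper's criterion generalizes; it does not implement the structured, non-iid extension described in the abstract, and the function names and kernel bandwidths are assumptions chosen for illustration.

    import numpy as np

    def gaussian_kernel(Z, sigma=1.0):
        # Gaussian (RBF) kernel matrix from pairwise squared Euclidean distances.
        sq = np.sum(Z ** 2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * Z @ Z.T
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def hsic_biased(X, Y, sigma_x=1.0, sigma_y=1.0):
        # Biased empirical HSIC estimate: trace(K H L H) / m^2,
        # where H = I - (1/m) 11^T centres the kernel matrices.
        m = X.shape[0]
        K = gaussian_kernel(X, sigma_x)
        L = gaussian_kernel(Y, sigma_y)
        H = np.eye(m) - np.ones((m, m)) / m
        return np.trace(K @ H @ L @ H) / (m ** 2)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        x = rng.normal(size=(200, 1))
        y_dep = x + 0.1 * rng.normal(size=(200, 1))   # dependent on x
        y_ind = rng.normal(size=(200, 1))             # independent of x
        print("HSIC (dependent):  ", hsic_biased(x, y_dep))
        print("HSIC (independent):", hsic_biased(x, y_ind))

    In this toy usage, the dependent pair should yield a noticeably larger HSIC value than the independent pair; the paper's contribution is to adapt such a criterion to observations that are themselves interdependent, e.g. sequences.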

    Original language: English
    Title of host publication: Advances in Neural Information Processing Systems 21 - Proceedings of the 2008 Conference
    Publisher: Neural Information Processing Systems
    Pages: 1937-1944
    Number of pages: 8
    ISBN (Print): 9781605609492
    Publication status: Published - 2009
    Event: 22nd Annual Conference on Neural Information Processing Systems, NIPS 2008 - Vancouver, BC, Canada
    Duration: 8 Dec 2008 - 11 Dec 2008

    Publication series

    Name: Advances in Neural Information Processing Systems 21 - Proceedings of the 2008 Conference

    Conference

    Conference: 22nd Annual Conference on Neural Information Processing Systems, NIPS 2008
    Country/Territory: Canada
    City: Vancouver, BC
    Period: 8/12/08 - 11/12/08
