KLDA - An iterative approach to Fisher discriminant analysis

Lu Fangfang*, Li Hongdong

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    1 Citation (Scopus)

    Abstract

    In this paper, we present an iterative approach to Fisher discriminant analysis, called Kullback-Leibler discriminant analysis (KLDA), for both linear and nonlinear feature extraction. We cast the conventional problem of discriminative feature extraction as a function optimization problem and recover the feature transformation matrix by maximizing the objective function. The objective function is defined over the pairwise distances between all pairs of classes, with the Kullback-Leibler divergence adopted to measure the disparity between the distributions of each pair of classes. The proposed algorithm extends naturally to nonlinear data via the kernel trick. Experimental results on real-world databases demonstrate the effectiveness of both the linear and kernel versions of our algorithm.
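
    For illustration only, the sketch below shows one way such a pairwise-KL objective could be set up, assuming Gaussian class-conditional densities in the projected space and maximizing the sum of symmetrized pairwise Kullback-Leibler divergences with a generic numerical optimizer. The paper's actual objective, parameterization, and iterative update rule may differ, and the kernel version is not shown.

# Hypothetical sketch of a pairwise-KL discriminant objective (not the paper's exact KLDA).
# Assumes Gaussian class-conditional densities in the projected space z = W^T x.
import numpy as np
from scipy.optimize import minimize


def gaussian_kl(mu0, cov0, mu1, cov1):
    """KL( N(mu0, cov0) || N(mu1, cov1) ) between two multivariate Gaussians."""
    r = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - r
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))


def pairwise_kl_objective(w_flat, X, y, d, r, reg=1e-3):
    """Sum of symmetrized KL divergences over all class pairs in the projected space."""
    W = w_flat.reshape(d, r)
    Z = X @ W                                                # project the samples: shape (n, r)
    stats = []
    for c in np.unique(y):
        Zc = Z[y == c]
        mu = Zc.mean(axis=0)
        cov = np.cov(Zc, rowvar=False) + reg * np.eye(r)     # regularize for invertibility
        stats.append((mu, cov))
    total = 0.0
    for i in range(len(stats)):
        for j in range(i + 1, len(stats)):
            total += gaussian_kl(*stats[i], *stats[j]) + gaussian_kl(*stats[j], *stats[i])
    return total


def fit_klda_like(X, y, r, seed=0):
    """Iteratively maximize the objective over a d x r transformation matrix W."""
    d = X.shape[1]
    w0 = np.random.default_rng(seed).standard_normal(d * r)
    res = minimize(lambda w: -pairwise_kl_objective(w, X, y, d, r), w0, method="L-BFGS-B")
    return res.x.reshape(d, r)


if __name__ == "__main__":
    # Toy three-class example with class means spread along every dimension.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(loc=m, size=(50, 4)) for m in (0.0, 1.5, 3.0)])
    y = np.repeat([0, 1, 2], 50)
    W = fit_klda_like(X, y, r=2)
    print("learned transform shape:", W.shape)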

    Original language: English
    Title of host publication: 2007 IEEE International Conference on Image Processing, ICIP 2007 Proceedings
    Pages: II201-II204
    DOIs
    Publication status: Published - 2007
    Event: 14th IEEE International Conference on Image Processing, ICIP 2007 - San Antonio, TX, United States
    Duration: 16 Sept 2007 - 19 Sept 2007

    Publication series

    Name: Proceedings - International Conference on Image Processing, ICIP
    Volume: 2
    ISSN (Print): 1522-4880

    Conference

    Conference: 14th IEEE International Conference on Image Processing, ICIP 2007
    Country/Territory: United States
    City: San Antonio, TX
    Period: 16/09/07 - 19/09/07
