Multi-view gait recognition based on motion regression using multilayer perceptron

Worapan Kusakunniran*, Qiang Wu, Jian Zhang, Hongdong Li

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    21 Citations (Scopus)

    Abstract

    It has been shown that gait is an efficient biometric feature for identifying a person at a distance. However, obtaining reliable gait features when the viewing angle changes is challenging, because body appearance can differ considerably across viewing angles. In this paper, the problem above is formulated as a regression problem in which a novel View Transformation Model (VTM) is constructed by adopting a Multilayer Perceptron (MLP) as the regression tool. The VTM smoothly estimates the gait feature under an unknown viewing angle from motion information in a well-selected Region of Interest (ROI) under other, existing viewing angles. This proposal can therefore normalize gait features captured from various viewing angles into a common viewing angle before gait similarity is measured. Encouraging experimental results have been obtained on a widely adopted benchmark database.
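    The core idea of the abstract — learning a regression from gait features at one viewing angle to those at a common angle — can be sketched with a small MLP. The following is an illustration only, not the paper's actual VTM: a one-hidden-layer perceptron with tanh hidden units is trained by full-batch gradient descent to map synthetic "source-view" feature vectors to "target-view" ones. All dimensions, the synthetic data, and the hyper-parameters are invented for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_vtm_mlp(X, Y, hidden=16, lr=0.01, epochs=500):
    """One-hidden-layer MLP regressor (tanh hidden units, linear output),
    trained with full-batch gradient descent on the mean squared error."""
    n, d_in = X.shape
    d_out = Y.shape[1]
    W1 = rng.normal(0.0, 0.1, (d_in, hidden))
    W2 = rng.normal(0.0, 0.1, (hidden, d_out))
    losses = []
    for _ in range(epochs):
        H = np.tanh(X @ W1)                  # hidden activations
        err = H @ W2 - Y                     # prediction error
        losses.append(float(np.mean(err ** 2)))
        gW2 = H.T @ err / n                  # gradient w.r.t. output weights
        gW1 = X.T @ ((err @ W2.T) * (1.0 - H ** 2)) / n  # backprop to W1
        W1 -= lr * gW1
        W2 -= lr * gW2
    return W1, W2, losses

# Synthetic "gait features": the target view is a fixed linear mix of the
# source view, so there is a learnable source-to-target mapping to recover.
A = 0.5 * rng.normal(size=(8, 8))
X_src = rng.normal(size=(200, 8))
Y_tgt = X_src @ A
W1, W2, losses = train_vtm_mlp(X_src, Y_tgt)
pred = np.tanh(X_src @ W1) @ W2              # features normalized to target view
```

    After training, `pred` plays the role of the gait feature transformed into the common viewing angle, on which a standard similarity measure could then be applied.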

    Original language: English
    Title of host publication: Proceedings - 2010 20th International Conference on Pattern Recognition, ICPR 2010
    Pages: 2186-2189
    Number of pages: 4
    Publication status: Published - 2010
    Event: 2010 20th International Conference on Pattern Recognition, ICPR 2010 - Istanbul, Turkey
    Duration: 23 Aug 2010 – 26 Aug 2010

    Publication series

    Name: Proceedings - International Conference on Pattern Recognition
    ISSN (Print): 1051-4651

    Conference

    Conference: 2010 20th International Conference on Pattern Recognition, ICPR 2010
    Country/Territory: Turkey
    City: Istanbul
    Period: 23/08/10 – 26/08/10
