Real-time visual tracking using compressive sensing

Hanxi Li*, Chunhua Shen, Qinfeng Shi

*Corresponding author for this work

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

    319 Citations (Scopus)

    Abstract

    The ℓ1 tracker obtains robustness by seeking a sparse representation of the tracked object via ℓ1-norm minimization. However, the high computational complexity involved in the ℓ1 tracker may hamper its applications in real-time processing scenarios. Here we propose Real-time Compressive Sensing Tracking (RTCST) by exploiting the signal recovery power of Compressive Sensing (CS). Dimensionality reduction and a customized Orthogonal Matching Pursuit (OMP) algorithm are adopted to accelerate the CS tracking. As a result, our algorithm achieves a real-time speed that is up to 5,000 times faster than that of the ℓ1 tracker. Meanwhile, RTCST still produces competitive (sometimes even superior) tracking accuracy compared to the ℓ1 tracker. Furthermore, for a stationary camera, a refined tracker is designed by integrating a CS-based background model (CSBM) into tracking. This CSBM-equipped tracker, termed RTCST-B, outperforms most state-of-the-art trackers in terms of both accuracy and robustness. Finally, our experimental results on various video sequences, which are verified by a new metric, Tracking Success Probability (TSP), demonstrate the excellence of the proposed algorithms.
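
    The abstract credits much of the speed-up to a customized Orthogonal Matching Pursuit (OMP) solver applied to dimensionality-reduced (compressed) features. As a rough illustration only, below is a minimal sketch of generic OMP in Python/NumPy; it is not the paper's customized variant, and the dictionary D of template atoms and the compressed observation y are assumed inputs.

        import numpy as np

        def omp(D, y, n_nonzero, tol=1e-6):
            """Greedy Orthogonal Matching Pursuit (generic sketch).

            D : (m, n) dictionary whose columns are compressed template atoms.
            y : (m,) compressed observation of a candidate patch.
            Returns a coefficient vector x with at most `n_nonzero` nonzero entries.
            """
            residual = y.copy()
            support = []
            x = np.zeros(D.shape[1])
            for _ in range(n_nonzero):
                # Select the atom most correlated with the current residual.
                idx = int(np.argmax(np.abs(D.T @ residual)))
                if idx not in support:
                    support.append(idx)
                # Re-fit coefficients on the current support by least squares.
                coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
                residual = y - D[:, support] @ coef
                if np.linalg.norm(residual) < tol:
                    break
            x[support] = coef
            return x

    In a tracking-by-sparse-representation setting, the reconstruction error of each candidate under such a sparse code is typically what ranks the candidates; the greedy OMP loop is what replaces the more expensive ℓ1-minimization step.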

    Original language: English
    Title of host publication: 2011 IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2011
    Publisher: IEEE Computer Society
    Pages: 1305-1312
    Number of pages: 8
    ISBN (Print): 9781457703942
    DOIs
    Publication status: Published - 2011

    Publication series

    Name: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
    ISSN (Print): 1063-6919
