Learning target-aware correlation filters for visual tracking

Dongdong Li*, Gongjian Wen, Yangliu Kuai, Jingjing Xiao, Fatih Porikli

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    17 Citations (Scopus)

    Abstract

    Discriminative Correlation Filters (DCF) have achieved enormous popularity in the tracking community. Generally, DCF-based trackers assume that the target can be well approximated by an axis-aligned bounding box. For irregularly shaped objects, the learned correlation filter is therefore unavoidably corrupted by the background pixels inside the bounding box. To tackle this problem, we propose Target-Aware Correlation Filters (TACF) for visual tracking. A target likelihood map is introduced to impose a discriminative weight on each filter value according to the probability that its location belongs to the foreground target. Building on the TACF formulation, we further propose an optimization strategy based on the Preconditioned Conjugate Gradient method for efficient filter learning. With hand-crafted features (HOG), our approach achieves state-of-the-art performance (62.8% AUC) on OTB100 while running in real time (24 fps) on a single CPU. With shallow convolutional features, our approach achieves 66.7% AUC on OTB100 and the top rank in EAO on the VOT2016 challenge.
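    As a rough illustration of the core idea (this is not the paper's actual TACF formulation, which solves a weighted regression with Preconditioned Conjugate Gradient; function names and the MOSSE-style closed-form solution below are assumptions for the sketch), a likelihood map can down-weight probable background pixels before a standard single-channel correlation filter is learned in the Fourier domain:

    ```python
    import numpy as np

    def gaussian_label(h, w, sigma=2.0):
        # Desired Gaussian correlation response, peaked at the target centre
        ys, xs = np.mgrid[0:h, 0:w]
        cy, cx = h // 2, w // 2
        return np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))

    def learn_weighted_filter(patch, likelihood, lam=1e-2):
        """Closed-form ridge-regression (MOSSE-style) filter learned on a
        patch whose pixels are scaled by a target likelihood map.
        NOTE: illustrative stand-in, not the paper's TACF solver."""
        x = patch * likelihood                      # suppress likely background
        X = np.fft.fft2(x)
        Y = np.fft.fft2(gaussian_label(*patch.shape))
        # conj(H) = (Y * conj(X)) / (X * conj(X) + lam)
        return (Y * np.conj(X)) / (X * np.conj(X) + lam)

    def respond(filt, patch):
        # Correlation response of the learned filter on a (weighted) patch
        return np.real(np.fft.ifft2(filt * np.fft.fft2(patch)))

    rng = np.random.default_rng(0)
    patch = rng.standard_normal((32, 32))
    likelihood = gaussian_label(32, 32, sigma=6.0)  # stand-in foreground prior
    H = learn_weighted_filter(patch, likelihood)
    resp = respond(H, patch * likelihood)
    print(resp.shape)  # (32, 32), response peaks near the patch centre
    ```

    On the training patch the response reproduces the Gaussian label almost exactly, so the peak sits at the patch centre; the weighting simply ensures that background pixels contribute little to the least-squares fit.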

    Original language: English
    Pages (from-to): 149-159
    Number of pages: 11
    Journal: Journal of Visual Communication and Image Representation
    Volume: 58
    Publication status: Published - Jan 2019
