TY - GEN
T1 - Enhanced Laplacian group sparse learning with lifespan outlier rejection for visual tracking
AU - Bozorgtabar, Behzad
AU - Goecke, Roland
N1 - Publisher Copyright:
© Springer International Publishing Switzerland 2015.
PY - 2015
Y1 - 2015
N2 - Recently, sparsity-based learning methods have attracted much attention in robust visual tracking due to their effectiveness and promising tracking results. In this paper, we introduce a new particle filter based tracking method that represents the target object sparsely using only a few adaptive dictionary templates. We aim to capture the underlying structure among the particle samples using the proposed similarity graph in a Laplacian group sparse framework, such that the tracking results can be improved. Furthermore, in our tracker, particles contribute to the tracking result with different probabilities according to their positions in a given frame relative to the current target object location. In addition, since the new target object can be well modelled by the most recent tracking results, we prefer to utilise the particle samples that are highly associated with the preceding tracking results. We demonstrate that the proposed formulation can be efficiently solved using the Accelerated Proximal method with only a small number of iterations. The proposed approach has been extensively evaluated on 12 challenging video sequences. Experimental results demonstrate the merits of the proposed tracker compared to state-of-the-art methods.
AB - Recently, sparsity-based learning methods have attracted much attention in robust visual tracking due to their effectiveness and promising tracking results. In this paper, we introduce a new particle filter based tracking method that represents the target object sparsely using only a few adaptive dictionary templates. We aim to capture the underlying structure among the particle samples using the proposed similarity graph in a Laplacian group sparse framework, such that the tracking results can be improved. Furthermore, in our tracker, particles contribute to the tracking result with different probabilities according to their positions in a given frame relative to the current target object location. In addition, since the new target object can be well modelled by the most recent tracking results, we prefer to utilise the particle samples that are highly associated with the preceding tracking results. We demonstrate that the proposed formulation can be efficiently solved using the Accelerated Proximal method with only a small number of iterations. The proposed approach has been extensively evaluated on 12 challenging video sequences. Experimental results demonstrate the merits of the proposed tracker compared to state-of-the-art methods.
UR - http://www.scopus.com/inward/record.url?scp=84929622836&partnerID=8YFLogxK
U2 - 10.1007/978-3-319-16814-2_37
DO - 10.1007/978-3-319-16814-2_37
M3 - Conference contribution
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 564
EP - 578
BT - Computer Vision - ACCV 2014 - 12th Asian Conference on Computer Vision, Revised Selected Papers
A2 - Cremers, Daniel
A2 - Saito, Hideo
A2 - Reid, Ian
A2 - Yang, Ming-Hsuan
PB - Springer Verlag
T2 - 12th Asian Conference on Computer Vision, ACCV 2014
Y2 - 1 November 2014 through 5 November 2014
ER -