Enhanced Laplacian Group Sparse Learning with Lifespan Outlier Rejection for Visual Tracking

Seyed Bozorgtabar, Roland Goecke

Research output: A Conference proceeding or a Chapter in Book › Conference contribution


Abstract

Recently, sparsity-based learning methods have attracted much attention in robust visual tracking due to their effectiveness and promising tracking results. In this paper, we introduce a new particle filter-based tracking method that represents the target object sparsely, using only a few adaptive dictionary templates, and captures the underlying structure among the particle samples with the proposed similarity graph in a Laplacian group sparse framework, so that the tracking results can be improved. Furthermore, in our tracker, particles contribute to the tracking result with different probabilities, according to their positions in a given frame relative to the current target object location. In addition, since the new target object can be well modelled by the most recent tracking results, we prefer to utilise the particle samples that are highly associated with the preceding tracking results. We demonstrate that the proposed formulation can be solved efficiently using the Accelerated Proximal method in a small number of iterations. The proposed approach has been extensively evaluated on 12 challenging video sequences. Experimental results demonstrate the merits of the proposed tracker compared to state-of-the-art methods.
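The record does not include the paper's formulation, but the abstract's ingredients (a dictionary of target templates, a similarity graph over particle samples, Laplacian group sparsity, and an accelerated proximal solver) map onto a standard graph-Laplacian-regularised group sparse coding problem. The sketch below is a generic NumPy illustration of that setup, not the authors' implementation; the exact objective, the regularisation weights lam1/lam2, and the row-wise grouping of coefficients are assumptions.

import numpy as np

def prox_group_rows(C, tau):
    # Row-wise group soft-thresholding: proximal operator of tau * sum_i ||C[i, :]||_2.
    norms = np.linalg.norm(C, axis=1, keepdims=True)
    shrink = np.maximum(0.0, 1.0 - tau / np.maximum(norms, 1e-12))
    return C * shrink

def laplacian_group_sparse_apg(X, D, W, lam1=0.1, lam2=0.1, n_iter=100):
    # X: (d, n) particle observations, one column per particle sample
    # D: (d, k) dictionary of target templates
    # W: (n, n) symmetric, non-negative particle similarity graph
    # Solves  min_C 0.5*||X - D C||_F^2 + lam2*tr(C L C^T) + lam1*sum_i ||C[i, :]||_2
    # with FISTA-style accelerated proximal gradient iterations (hypothetical objective).
    L = np.diag(W.sum(axis=1)) - W                       # graph Laplacian of the similarity graph
    k, n = D.shape[1], X.shape[1]
    lip = np.linalg.norm(D, 2) ** 2 + 2.0 * lam2 * np.linalg.norm(L, 2)  # Lipschitz bound
    C = np.zeros((k, n)); Z = C.copy(); t = 1.0
    for _ in range(n_iter):
        grad = D.T @ (D @ Z - X) + 2.0 * lam2 * (Z @ L)  # gradient of the smooth terms
        C_next = prox_group_rows(Z - grad / lip, lam1 / lip)
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        Z = C_next + ((t - 1.0) / t_next) * (C_next - C) # Nesterov momentum step
        C, t = C_next, t_next
    return C

In a tracker built around such a model, the particle whose column of C gives the smallest reconstruction error, optionally reweighted by its distance to the previous target location and its agreement with recent tracking results (echoing the lifespan outlier rejection idea), would be selected as the new target state.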
Original language: English
Title of host publication: 12th Asian Conference on Computer Vision (ACCV 2014)
Subtitle of host publication: Lecture Notes in Computer Science
Editors: D. Cremers, I. Reid, H. Saito, M.-H. Yang
Place of publication: Switzerland
Publisher: Springer
Pages: 564-578
Number of pages: 15
Volume: 9007
ISBN (Print): 9783319168135
DOIs: 10.1007/978-3-319-16814-2_37
Publication status: Published - 2015
Event: 12th Asian Conference on Computer Vision - Singapore, Singapore
Duration: 1 Nov 2014 - 5 Nov 2014

Conference

Conference: 12th Asian Conference on Computer Vision
Country: Singapore
City: Singapore
Period: 1/11/14 - 5/11/14

Cite this

Bozorgtabar, S., & GOECKE, R. (2015). Enhanced Laplacian Group Sparse Learning with Lifespan Outlier Rejection for Visual Tracking. In D. Cremers, I. Reid, H. Saito, & M-H. Yang (Eds.), 12th Asian Conference on Computer Vision (ACCV 2014): Lecture Notes in Computer Science (Vol. 9007, pp. 564-578). Switzerland: Springer. https://doi.org/10.1007/978-3-319-16814-2_37
@inproceedings{73fe840834d24b319d77a53f0017d69a,
title = "Enhanced Laplacian Group Sparse Learning with Lifespan Outlier Rejection for Visual Tracking",
keywords = "Visual Tracking, Sparse Learning, Laplacian group",
author = "Seyed Bozorgtabar and Roland Goecke",
year = "2015",
doi = "10.1007/978-3-319-16814-2_37",
language = "English",
isbn = "9783319168135",
volume = "9007",
pages = "564--578",
editor = "D. Cremers and I. Reid and H. Saito and M.-H. Yang",
booktitle = "12th Asian Conference on Computer Vision (ACCV 2014): Lecture Notes in Computer Science",
publisher = "Springer",
address = "Switzerland",
}

Bozorgtabar, S & GOECKE, R 2015, Enhanced Laplacian Group Sparse Learning with Lifespan Outlier Rejection for Visual Tracking. in D Cremers, I Reid, H Saito & M-H Yang (eds), 12th Asian Conference on Computer Vision (ACCV 2014): Lecture Notes in Computer Science. vol. 9007, Springer, Switzerland, pp. 564-578, 12th Asian Conference on Computer Vision, Singapore, Singapore, 1/11/14. https://doi.org/10.1007/978-3-319-16814-2_37

Enhanced Laplacian Group Sparse Learning with Lifespan Outlier Rejection for Visual Tracking. / Bozorgtabar, Seyed; GOECKE, Roland.

12th Asian Conference on Computer Vision (ACCV 2014): Lecture Notes in Computer Science. ed. / D Cremers; I Reid; H Saito; M-H Yang. Vol. 9007 Switzerland : Springer, 2015. p. 564-578.

Research output: A Conference proceeding or a Chapter in BookConference contribution

TY - GEN

T1 - Enhanced Laplacian Group Sparse Learning with Lifespan Outlier Rejection for Visual Tracking

AU - Bozorgtabar, Seyed

AU - GOECKE, Roland

PY - 2015

Y1 - 2015

N2 - Recently, sparse based learning methods have attracted much attention in robust visual tracking due to their effectiveness and promising tracking results. By representing the target object sparsely, utilising only a few adaptive dictionary templates, in this paper, we introduce a new particle filter based tracking method, in which we aim to capture the underlying structure among the particle samples using the proposed similarity graph in a Laplacian group sparse framework, such that the tracking results can be improved. Furthermore, in our tracker, particles contribute with different probabilities in the tracking result with respect to their relative positions in a given frame in regard to the current target object location. In addition, since the new target object can be well modelled by the most recent tracking results, we prefer to utilise the particle samples that are highly associated to the preceding tracking results. We demonstrate that the proposed formulation can be efficiently solved using the Accelerated Proximal method with just a small number of iterations. The proposed approach has been extensively evaluated on 12 challenging video sequences. Experimental results compared to the state-of-the-art methods demonstrate the merits of the proposed tracker.

AB - Recently, sparse based learning methods have attracted much attention in robust visual tracking due to their effectiveness and promising tracking results. By representing the target object sparsely, utilising only a few adaptive dictionary templates, in this paper, we introduce a new particle filter based tracking method, in which we aim to capture the underlying structure among the particle samples using the proposed similarity graph in a Laplacian group sparse framework, such that the tracking results can be improved. Furthermore, in our tracker, particles contribute with different probabilities in the tracking result with respect to their relative positions in a given frame in regard to the current target object location. In addition, since the new target object can be well modelled by the most recent tracking results, we prefer to utilise the particle samples that are highly associated to the preceding tracking results. We demonstrate that the proposed formulation can be efficiently solved using the Accelerated Proximal method with just a small number of iterations. The proposed approach has been extensively evaluated on 12 challenging video sequences. Experimental results compared to the state-of-the-art methods demonstrate the merits of the proposed tracker.

KW - Visual Tracking

KW - Sparse Learning

KW - Laplacian group

U2 - 10.1007/978-3-319-16814-2_37

DO - 10.1007/978-3-319-16814-2_37

M3 - Conference contribution

SN - 9783319168135

VL - 9007

SP - 564

EP - 578

BT - 12th Asian Conference on Computer Vision (ACCV 2014)

A2 - Cremers, D

A2 - Reid, I

A2 - Saito, H

A2 - Yang, M-H

PB - Springer

CY - Switzerland

ER -

Bozorgtabar S, GOECKE R. Enhanced Laplacian Group Sparse Learning with Lifespan Outlier Rejection for Visual Tracking. In Cremers D, Reid I, Saito H, Yang M-H, editors, 12th Asian Conference on Computer Vision (ACCV 2014): Lecture Notes in Computer Science. Vol. 9007. Switzerland: Springer. 2015. p. 564-578 https://doi.org/10.1007/978-3-319-16814-2_37