Efficient multi-target tracking via discovering dense subgraphs

Behzad BOZORGTABAR, Roland GOECKE

    Research output: Contribution to journal › Article


    Abstract

    In this paper, we cast multi-target tracking as a dense-subgraph discovery problem on the undirected relation graph of all given target hypotheses. We aim to extract multiple clusters (dense subgraphs), each containing the set of hypotheses belonging to one particular target. In the presence of occlusion, similarly moving targets, or a lack of reliable evidence for a target's presence, each target trajectory is expected to fragment into multiple tracklets. The proposed tracking framework efficiently links such fragmented trajectories into longer ones specifying the true states of the target. In particular, a discriminative scheme is devised by learning the targets' appearance models. Moreover, the smoothness characteristic of target trajectories is exploited through a smoothness tracklet affinity model, strengthening the tracker's ability to produce persistent trajectories that reveal the different targets' moving paths. The performance of the proposed approach has been extensively evaluated on challenging public datasets and also in the context of team sports (e.g. soccer, AFL), where players tend to exhibit quick and unpredictable movements. Systematic experiments on a large set of sequences show that the proposed approach outperforms state-of-the-art trackers, in particular when dealing with occlusion and fragmented target trajectories.
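    To illustrate the dense-subgraph view described above, the following is a minimal sketch of greedy densest-subgraph extraction (Charikar-style peeling) on a weighted hypothesis affinity graph. The edge weights, node names, and the peeling heuristic are illustrative assumptions for exposition; they are not the paper's actual formulation or data.

    ```python
    # Hypothetical sketch: extract one dense cluster of mutually consistent
    # target hypotheses from a weighted, undirected affinity graph.
    # Affinities, node names and the greedy heuristic are illustrative only.

    def densest_subgraph(weights):
        """Greedy peeling: repeatedly remove the node with the lowest
        weighted degree, tracking the densest intermediate subgraph.

        weights: dict mapping frozenset({u, v}) -> affinity weight.
        Returns (best_nodes, best_density), density = edge weight sum / |nodes|.
        """
        nodes = set()
        for e in weights:
            nodes |= e
        degree = {v: 0.0 for v in nodes}
        for e, w in weights.items():
            for v in e:
                degree[v] += w
        total = sum(weights.values())
        best_nodes, best_density = set(nodes), total / len(nodes)
        remaining = set(nodes)
        active = dict(weights)
        while len(remaining) > 1:
            v = min(remaining, key=lambda u: degree[u])  # weakest node
            remaining.discard(v)
            for e in [e for e in active if v in e]:      # drop its edges
                w = active.pop(e)
                total -= w
                for u in e - {v}:
                    degree[u] -= w
            density = total / len(remaining)
            if density > best_density:
                best_density, best_nodes = density, set(remaining)
        return best_nodes, best_density

    # Toy example: hypotheses a, b, c mutually affine (one target);
    # d is weakly attached clutter and gets peeled away first.
    affinities = {
        frozenset({"a", "b"}): 0.9,
        frozenset({"b", "c"}): 0.9,
        frozenset({"a", "c"}): 0.8,
        frozenset({"c", "d"}): 0.1,
    }
    cluster, density = densest_subgraph(affinities)
    # cluster == {"a", "b", "c"}
    ```

    In a full tracker one would run such an extraction repeatedly, removing each discovered cluster from the graph, so that every cluster yields the tracklets of one target to be linked into a trajectory.
    
    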
    Original language: English
    Pages (from-to): 205-216
    Number of pages: 12
    Journal: Computer Vision and Image Understanding
    Volume: 144
    Issue number: C
    Publication status: Published - 2016

