Authors: Saikat Basu 1; Manohar Karki 1; Malcolm Stagg 2; Robert DiBiano 1; Sangram Ganguly 3 and Supratik Mukhopadhyay 1
Affiliations: 1 Louisiana State University, United States; 2 Microsoft Corporation, United States; 3 Bay Area Environmental Research Institute/NASA Ames Research Center, United States
Keyword(s): Object Tracking, Motion Model, Appearance Model, Gaussian Mixture Background Subtraction, Optical Flow
Related Ontology Subjects/Areas/Topics: Computer Vision, Visualization and Computer Graphics; Motion, Tracking and Stereo Vision; Optical Flow and Motion Analyses; Tracking and Visual Navigation; Video Stabilization; Video Surveillance and Event Detection
Abstract:
In this paper, we present MAPTrack, a robust tracking framework that uses a probabilistic scheme to combine a motion model of an object with an appearance model and an estimate of its position. The motion of the object is modelled using the Gaussian Mixture Background Subtraction algorithm, the appearance of the tracked object is represented by a color histogram, and the projected location of the tracked object in the image space/frame sequence is computed by applying a Gaussian to the Region of Interest. Our tracking framework is robust to abrupt changes in lighting conditions, can follow an object through occlusions, and can simultaneously track multiple moving foreground objects of different types (e.g., vehicles, humans) even when they are closely spaced. It is able to start tracks automatically based on a spatio-temporal filtering algorithm. A "dynamic" integration of the framework with optical flow allows us to track objects in videos with significant camera motion. A C++ implementation of the framework has outperformed existing visual tracking algorithms on most videos in the Video Image Retrieval and Analysis Tool (VIRAT), TUD, and Tracking-Learning-Detection (TLD) datasets.
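As a rough illustration of the fusion described in the abstract, the following OpenCV-based C++ sketch (hypothetical, not the authors' MAPTrack code) scores a single candidate box by combining a Gaussian-mixture background-subtraction motion cue, a hue-histogram appearance cue, and a Gaussian prior on the predicted position. The weights, video path, and box coordinates are assumptions made only for illustration.

// A minimal, hypothetical sketch (not the authors' MAPTrack implementation) of how
// the three cues described above could be fused for one candidate bounding box.
#include <opencv2/opencv.hpp>
#include <cmath>
#include <iostream>

// Appearance cue: similarity between the hue histogram of a candidate patch and a
// reference histogram of the tracked object (1 - Bhattacharyya distance).
double appearanceScore(const cv::Mat& patchBGR, const cv::Mat& refHist) {
    cv::Mat hsv, hist;
    cv::cvtColor(patchBGR, hsv, cv::COLOR_BGR2HSV);
    int histSize = 30;
    int channels[] = {0};
    float hueRange[] = {0.f, 180.f};
    const float* ranges[] = {hueRange};
    cv::calcHist(&hsv, 1, channels, cv::Mat(), hist, 1, &histSize, ranges);
    cv::normalize(hist, hist, 1.0, 0.0, cv::NORM_L1);
    return 1.0 - cv::compareHist(hist, refHist, cv::HISTCMP_BHATTACHARYYA);
}

// Motion cue: fraction of foreground pixels (from the GMM background model) in the box.
double motionScore(const cv::Mat& fgMask, const cv::Rect& box) {
    return cv::countNonZero(fgMask(box)) / static_cast<double>(box.area());
}

// Position cue: isotropic Gaussian centred on the predicted object location.
double positionScore(const cv::Point2f& c, const cv::Point2f& predicted, double sigma) {
    double dx = c.x - predicted.x, dy = c.y - predicted.y;
    return std::exp(-(dx * dx + dy * dy) / (2.0 * sigma * sigma));
}

int main() {
    cv::VideoCapture cap("video.avi");                // illustrative input path
    auto bg = cv::createBackgroundSubtractorMOG2();   // Gaussian-mixture background model

    cv::Mat frame, fgMask, refHist;
    cv::Rect track(100, 100, 40, 80);                 // assumed initial object box
    cv::Point2f predicted(120.f, 140.f);              // assumed predicted centre

    while (cap.read(frame)) {
        bg->apply(frame, fgMask);                     // update motion model
        if (refHist.empty()) {                        // initialise appearance model once
            cv::Mat hsv;
            cv::cvtColor(frame(track), hsv, cv::COLOR_BGR2HSV);
            int histSize = 30;
            int channels[] = {0};
            float hueRange[] = {0.f, 180.f};
            const float* ranges[] = {hueRange};
            cv::calcHist(&hsv, 1, channels, cv::Mat(), refHist, 1, &histSize, ranges);
            cv::normalize(refHist, refHist, 1.0, 0.0, cv::NORM_L1);
            continue;
        }
        // Weighted fusion of the three cues; a full tracker would score many
        // candidate boxes per frame and keep the best one.
        cv::Point2f centre(track.x + track.width / 2.f, track.y + track.height / 2.f);
        double score = 0.4 * motionScore(fgMask, track)
                     + 0.4 * appearanceScore(frame(track), refHist)
                     + 0.2 * positionScore(centre, predicted, 25.0);
        std::cout << "candidate score: " << score << std::endl;
    }
    return 0;
}

The fixed weights above stand in for the probabilistic combination described in the paper; the actual MAPTrack scheme, spatio-temporal track initialisation, and optical-flow integration are not reproduced here.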