|Title:||Learning Modality-Consistency Feature Templates: A Robust RGB-Infrared Tracking System|
|Authors:||Yuen, Pong C.|
|Publisher:||Institute of Electrical and Electronics Engineers (IEEE)|
|Citation:||IEEE Transactions on Industrial Electronics, 2019|
|Abstract:||With the large number of video surveillance systems installed to meet industrial security requirements, the task of object tracking, which aims to locate objects of interest in videos, is very important. Although numerous tracking algorithms for RGB videos have been developed over the past decade, the performance and robustness of these systems may degrade dramatically when the information from RGB video is unreliable (e.g., under poor illumination conditions or at very low resolution). To address this issue, this paper presents a new tracking system that combines information from the RGB and infrared modalities for object tracking. The proposed tracking system is based on our proposed machine learning model. In particular, the learning model can alleviate the modality discrepancy issue, under the proposed modality consistency constraint on both representation patterns and discriminability, and generate discriminative feature templates for collaborative representation and discrimination across heterogeneous modalities. Experiments on a variety of challenging RGB-infrared videos demonstrate the effectiveness of the proposed algorithm.|
|Rights:||Copyright © 2019, Institute of Electrical and Electronics Engineers (IEEE). Deposited with reference to the publisher’s open access archiving policy. (http://www.rioxx.net/licenses/all-rights-reserved)|
|Appears in Collections:||Published Articles, Dept. of Computer Science|
Files in This Item:
|91640.pdf||Post-review (final submitted author manuscript)||4.14 MB||Adobe PDF||View/Open|
Items in LRA are protected by copyright, with all rights reserved, unless otherwise indicated.