Spatio-Context-Based Target Tracking with Adaptive Multi-Feature Fusion for Real-World Hazy Scenes
Severe air pollution such as haze and fog, along with other complex conditions, poses great challenges for target tracking in computer vision, and single-feature approaches are unstable under such scenes, leading to poor tracking. Inspired by multi-feature fusion and the focus-of-attention (FOA) mechanism of the biological visual system, we propose a tracking algorithm based on multi-feature fusion and spatio-temporal context correlation. The target appearance is represented accurately by fusing color, texture, and edge features, and the fusion weights are updated adaptively via information entropy, which greatly enhances adaptability to environmental variations. Combined with the spatio-temporal context algorithm, the target can then be located accurately. Experiments against state-of-the-art tracking algorithms validate the effectiveness of our method in hazy scenes; moreover, our method also improves image visual quality. Specifically, the average center error is reduced to 0.9440 pixels, while the average overlap rate and the frame rate rise to 0.8700 and 4.4302 FPS, respectively. In addition, the halo artifacts and color cast introduced by popular dehazing algorithms are avoided in the restored images. The proposed tracker outperforms state-of-the-art methods in accuracy, robustness, and real-time performance even in complex real-world hazy scenes.
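The entropy-based adaptive weighting described above can be illustrated with a minimal sketch. The idea, under one plausible reading of the abstract, is that a feature channel whose response histogram has lower entropy (a more peaked, more discriminative response) receives a larger fusion weight. The function names and the inverse-entropy weighting rule below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def feature_entropy(hist):
    """Shannon entropy (bits) of a normalized feature histogram."""
    p = hist / (hist.sum() + 1e-12)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def adaptive_weights(histograms):
    """Assign each feature channel a weight proportional to the inverse
    of its histogram entropy, so peaked (discriminative) channels dominate.
    This inverse-entropy rule is an assumption for illustration."""
    ents = np.array([feature_entropy(h) for h in histograms])
    inv = 1.0 / (ents + 1e-12)
    return inv / inv.sum()

# Toy response histograms for the three cues named in the abstract
color = np.array([0.7, 0.1, 0.1, 0.1])        # peaked  -> low entropy
texture = np.array([0.25, 0.25, 0.25, 0.25])  # uniform -> high entropy
edge = np.array([0.4, 0.3, 0.2, 0.1])
w = adaptive_weights([color, texture, edge])
print(w)  # color receives the largest weight, texture the smallest
```

Recomputing the weights every frame in this fashion is what lets the fused appearance model de-emphasize a cue (e.g., color) when haze washes out its discriminability.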
Keywords: Target tracking · Multi-feature fusion · Adaptive weighted update · Spatio-context · Hazy scenes
This study was funded by the National Natural Science Foundation of China (Grant No. 91026005) and the Fundamental Research Funds for the Central Universities (Grant Nos. ZYGX2016J131, ZYGX2016J138).
Compliance with Ethical Standards
This article does not contain any studies with human participants or animals performed by any of the authors.