Abstract
The low reliability of features and of the tracking box remains a problem in visual object tracking. Most discriminative correlation filter (DCF) trackers integrate multiple types of features with fixed weights, but fixed weights cannot adapt to complex scenes. In addition, the tracking box of traditional DCF trackers lacks scale and aspect-ratio adaptability, which inevitably introduces excessive background noise. To address these problems, we propose a robust tracking method for unmanned aerial vehicles (UAVs) based on dynamic feature weight selection. Specifically, we define a feature weight pool that contains multiple candidate weights for the different features. In each frame, we select a weight combination with high reliability from the pool; because the selected weights may differ from frame to frame, this is a form of dynamic feature weight selection. Furthermore, EdgeBoxes is combined with DSST so that the tracking box adapts well to changes in scale and aspect ratio. Extensive experiments on UAV123@10fps, VisDrone2018-test-dev, and UAVDT show that our tracker outperforms other state-of-the-art trackers. Notably, the proposed dynamic feature weight selection can be embedded into any tracking model that uses multiple features.
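To make the per-frame selection concrete, the following is a minimal sketch rather than the authors' implementation: it assumes three feature response maps (HOG, color names, gray), a small hand-chosen weight pool, and the peak-to-sidelobe ratio (PSR) as the reliability score; the pool contents and the reliability measure used in the paper may differ.

```python
import numpy as np

# Hypothetical weight pool: each entry assigns weights to the (HOG, CN, gray) responses.
WEIGHT_POOL = [
    (0.6, 0.3, 0.1),
    (0.5, 0.4, 0.1),
    (0.4, 0.4, 0.2),
    (0.3, 0.5, 0.2),
]

def psr(response):
    """Peak-to-sidelobe ratio: one common reliability score for a response map."""
    peak = response.max()
    py, px = np.unravel_index(response.argmax(), response.shape)
    mask = np.ones_like(response, dtype=bool)
    mask[max(0, py - 5):py + 6, max(0, px - 5):px + 6] = False  # exclude the peak region
    sidelobe = response[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

def select_weights(responses):
    """Pick the weight combination whose fused response is most reliable.

    `responses` is a dict of per-feature correlation response maps of equal shape,
    e.g. {"hog": R_hog, "cn": R_cn, "gray": R_gray}.
    """
    best_w, best_score, best_fused = None, -np.inf, None
    for w in WEIGHT_POOL:
        fused = (w[0] * responses["hog"]
                 + w[1] * responses["cn"]
                 + w[2] * responses["gray"])
        score = psr(fused)
        if score > best_score:
            best_w, best_score, best_fused = w, score, fused
    return best_w, best_fused

# Usage with random maps standing in for real DCF outputs:
rng = np.random.default_rng(0)
maps = {k: rng.random((50, 50)) for k in ("hog", "cn", "gray")}
weights, fused = select_weights(maps)
print("selected weights:", weights)
```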








References
Bonatti R, Ho C, Wang W, Choudhury S, Scherer S (2019) Towards a robust aerial cinematography platform: Localizing and tracking moving targets in unstructured environments. In: 2019 IEEE/RSJ International conference on intelligent robots and systems (IROS), pp 229–236. https://doi.org/10.1109/IROS40897.2019.8968163
K M, H C E (2019) UAV traffic patrolling via road detection and tracking in anonymous aerial video frames. Journal of Intelligent & Robotic Systems 95:675–690. https://doi.org/10.1007/s10846-018-0954-x
Xue X, Li Y, Shen Q (2018) Unmanned aerial vehicle object tracking by correlation filter with adaptive appearance model. Sensors 18(9). https://doi.org/10.3390/s18092751
Jiang B, Zhang Y, Tang J, Luo B, Li C (2019) Robust visual tracking via laplacian regularized random walk ranking. Neurocomputing 339:139–148. https://doi.org/10.1016/j.neucom.2019.01.102
Chi Y, Zhixiang L, Youmin Z (2017) Aerial images-based forest fire detection for firefighting using optical remote sensing techniques and unmanned aerial vehicles. Journal of Intelligent & Robotic Systems 88:635–654. https://doi.org/10.1007/s10846-016-0464-7
Huang Z, Fu C, Li Y, Lin F, Lu P (2019) Learning aberrance repressed correlation filters for real-time UAV tracking. In: 2019 IEEE/CVF International conference on computer vision (ICCV), pp 2891–2900. https://doi.org/10.1109/ICCV.2019.00298
Li Y, Fu C, Ding F, Huang Z, Lu G (2020) AutoTrack: Towards high-performance visual tracking for UAV with automatic spatio-temporal regularization. In: 2020 IEEE/CVF conference on computer vision and pattern recognition (CVPR), pp 11920–11929. https://doi.org/10.1109/CVPR42600.2020.01194
Wang M, Liu Y, Huang Z (2017) Large margin object tracking with circulant feature maps. In: 2017 IEEE Conference on computer vision and pattern recognition (CVPR), pp 4800–4808
Yan Y, Guo X, Tang J, Li C, Wang X (2021) Learning spatio-temporal correlation filter for visual tracking. Neurocomputing 436:273–282. https://doi.org/10.1016/j.neucom.2021.01.057
Yun S, Choi J, Yoo Y, Yun K, Choi JY (2017) Action-decision networks for visual tracking with deep reinforcement learning. In: 2017 IEEE Conference on computer vision and pattern recognition (CVPR), pp 1349–1358. https://doi.org/10.1109/CVPR.2017.148
Cen M, Jung C (2018) Fully convolutional siamese fusion networks for object tracking. In: 2018 25Th IEEE international conference on image processing (ICIP), pp 3718–3722. https://doi.org/10.1109/ICIP.2018.8451102
Voigtlaender P, Luiten J, Torr PHS, Leibe B (2020) Siam R-CNN: visual tracking by re-detection. In: 2020 IEEE/CVF conference on computer vision and pattern recognition, CVPR 2020, Seattle, WA, USA, Computer Vision Foundation IEEE, pp 6577–6587. https://doi.org/10.1109/CVPR42600.2020.00661
Sun X, Han G, Guo L, Yang H, Wu X, Li Q (2022) Two-stage aware attentional siamese network for visual tracking. Pattern Recogn 124:108502. https://doi.org/10.1016/j.patcog.2021.108502
Danelljan M, Bhat G, Khan FS, Felsberg M (2017) ECO: Efficient convolution operators for tracking. In: 2017 IEEE Conference on computer vision and pattern recognition (CVPR), pp 6931–6939. https://doi.org/10.1109/CVPR.2017.733
Zhang J, Feng W, Yuan T, Wang J, Sangaiah AK (2022) SCSTCF: Spatial-channel selection and temporal regularized correlation filters for visual tracking. Appl Soft Comput 118:108485. https://doi.org/10.1016/j.asoc.2022.108485
Zhang T, Xu C, Yang M-H (2017) Multi-task correlation particle filter for robust object tracking. In: 2017 IEEE Conference on computer vision and pattern recognition (CVPR), pp 4819–4827. https://doi.org/10.1109/CVPR.2017.512
Moorthy S, Choi JY, Joo YH (2020) Gaussian-response correlation filter for robust visual object tracking. Neurocomputing 411:78–90. https://doi.org/10.1016/j.neucom.2020.06.016
Gang X, Xingchen Z, Ping Y, Xingzhong X, Shengyun P (2020) Anti-occlusion object tracking based on correlation filter. SIViP 14:753–761. https://doi.org/10.1007/s11760-019-01601-6
Junrong Y, Luchao Z, Yingbiao Y, Xin X, Chenjie D (2021) Dual-template adaptive correlation filter for real-time object tracking. Multimed Tools Appl 80:2355–2376. https://doi.org/10.1007/s11042-020-09644-5
Danelljan M, Häger G, Khan FS, Felsberg M (2017) Discriminative scale space tracking. IEEE Trans Pattern Anal Mach Intell 39(8):1561–1575. https://doi.org/10.1109/TPAMI.2016.2609928
Wang C, Zhang L, Xie L, Yuan J (2018) Kernel cross-correlator. In: AAAI Conference on artificial intelligence, vol 32. https://ojs.aaai.org/index.php/AAAI/article/view/11710
Huang D, Luo L, Wen M, Chen Z, Zhang C (2015) Enable scale and aspect ratio adaptability in visual tracking with detection proposals. In: Proceedings of the british machine vision conference (BMVC), BMVA Press, pp 1–12. https://doi.org/10.5244/C.29.185
Ma C, Huang J-B, Yang X, Yang M-H (2019) Robust visual tracking via hierarchical convolutional features. IEEE Trans Pattern Anal Mach Intell 41(11):2709–2723. https://doi.org/10.1109/TPAMI.2018.2865311
Zitnick CL, Dollár P (2014) Edge boxes: Locating object proposals from edges. In: Fleet D, Pajdla T, Schiele B, Tuytelaars T (eds) Computer Vision – ECCV 2014, Springer, pp 391–405. https://doi.org/10.1007/978-3-319-10602-1_26
Lina G, Bing L, Ping F, Mingzhu X, Junbao L (2021) Visual tracking via dynamic saliency discriminative correlation filter. Appl Intell. https://doi.org/10.1007/s10489-021-02260-2
Elayaperumal D, Joo YH (2021) Aberrance suppressed spatio-temporal correlation filters for visual object tracking. Pattern Recogn 115:107922. https://doi.org/10.1016/j.patcog.2021.107922
Li F, Tian C, Zuo W, Zhang L, Yang M-H (2018) Learning spatial-temporal regularized correlation filters for visual tracking. In: 2018 IEEE/CVF Conference on computer vision and pattern recognition, pp 4904–4913. https://doi.org/10.1109/CVPR.2018.00515
Galoogahi HK, Fagg A, Lucey S (2017) Learning background-aware correlation filters for visual tracking. In: 2017 IEEE International conference on computer vision (ICCV), pp 1144–1152. https://doi.org/10.1109/ICCV.2017.129
Li B, Yan J, Wu W, Zhu Z, Hu X (2018) High performance visual tracking with siamese region proposal network. In: 2018 IEEE conference on computer vision and pattern recognition, CVPR 2018, Salt Lake City, UT, USA, Computer Vision Foundation / IEEE Computer Society, pp 8971–8980. https://doi.org/10.1109/CVPR.2018.00935
Li B, Wu W, Wang Q, Zhang F, Xing J, Yan J (2019) SiamRPN++: Evolution of siamese visual tracking with very deep networks. In: IEEE conference on computer vision and pattern recognition, CVPR 2019, Long Beach, CA, USA, Computer Vision Foundation / IEEE, pp 4282–4291. https://doi.org/10.1109/CVPR.2019.00441
Yu Y, Xiong Y, Huang W, Scott MR (2020) Deformable siamese attention networks for visual object tracking. In: 2020 IEEE/CVF conference on computer vision and pattern recognition, CVPR 2020, Seattle, WA, USA, Computer Vision Foundation / IEEE, pp 6727–6736. https://doi.org/10.1109/CVPR42600.2020.00676
Guo D, Wang J, Cui Y, Wang Z, Chen S (2020) SiamCAR: Siamese fully convolutional classification and regression for visual tracking. In: 2020 IEEE/CVF conference on computer vision and pattern recognition, CVPR 2020, Seattle, WA, USA, Computer Vision Foundation / IEEE, pp 6268–6276. https://doi.org/10.1109/CVPR42600.2020.00630
Guo D, Shao Y, Cui Y, Wang Z, Zhang L, Shen C (2021) Graph attention tracking. In: 2021 IEEE/CVF Conference on computer vision and pattern recognition (CVPR), pp 9538–9547. https://doi.org/10.1109/CVPR46437.2021.00942
Yang K, He Z, Pei W, Zhou Z, Li X, Yuan D, Zhang H (2021) SiamCorners: Siamese corner networks for visual tracking. IEEE Trans Multimed. https://doi.org/10.1109/TMM.2021.3074239
Wang Y, Luo X, Ding L, Fu S, Wei X (2019) Robust visual tracking based on response stability. Eng Appl Artif Intell 85:137–149. https://doi.org/10.1016/j.engappai.2019.05.002
Liu Y, Li S, Cheng M-M (2020) Refinedbox: Refining for fewer and high-quality object proposals. Neurocomputing 406:106–116. https://doi.org/10.1016/j.neucom.2020.04.017
Ke W, Chen J, Ye Q (2019) Deep contour and symmetry scored object proposal. Pattern Recogn Lett 119:172–179. https://doi.org/10.1016/j.patrec.2018.01.004
Liang Y, Liu Y, Yan Y, Zhang L, Wang H (2021) Robust visual tracking via spatio-temporal adaptive and channel selective correlation filters. Pattern Recogn 112:107738. https://doi.org/10.1016/j.patcog.2020.107738
Mueller M, Smith N, Ghanem B (2016) A benchmark and simulator for uav tracking. In: Leibe B, Matas J, Sebe N, Welling M (eds) Computer Vision – ECCV 2016, Springer, pp 445–461. https://doi.org/10.1007/978-3-319-46448-0_27
Wen L, Zhu P, Du D, Bian X, Ling H, Hu Q, Liu C, Cheng H, Liu X, Ma W (2018) Visdrone-sot2018: The vision meets drone single-object tracking challenge results. In: Computer Vision - ECCV 2018 Workshops - Munich, Germany, Proceedings, Part V, vol 11133, Springer, pp 469–495. https://doi.org/10.1007/978-3-030-11021-5_28
Du D, Qi Y, Yu H, Yang Y, Duan K, Li G, Zhang W, Huang Q, Tian Q (2018) The unmanned aerial vehicle benchmark: Object detection and tracking. In: Computer Vision – ECCV 2018, Springer. https://doi.org/10.1007/978-3-030-01249-6_23
Li Y, Fu C, Ding F, Huang Z, Pan J (2020) Augmented memory for correlation filters in real-time UAV tracking. In: 2020 IEEE/RSJ international conference on intelligent robots and systems (IROS), pp 1559–1566. https://doi.org/10.1109/IROS45743.2020.9341595
Li B, Fu C, Ding F, Ye J, Lin F (2021) ADTrack: Target-aware dual filter learning for real-time anti-dark UAV tracking. In: IEEE International conference on robotics and automation, ICRA 2021, Xi’an, China, IEEE, pp 496–502. https://doi.org/10.1109/ICRA48506.2021.9561564
Funding
This research was funded by the Natural Science Foundation of Shandong Province, grant numbers ZR2021MF068, ZR2021MF107, ZR2021MF015, and ZR2020MA030; the Key R&D Program (Soft Science) Project of Shandong Province, grant number 2020RKB01017; and the School-level Teaching Reform Project of Shandong Technology and Business University, grant number 11688202023.
Ethics declarations
Competing interests
The authors declare no conflict of interest.
Additional information
Publisher’s note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
About this article
Cite this article
An, Z., Wang, X., Li, B. et al. Robust visual tracking for UAVs with dynamic feature weight selection. Appl Intell 53, 3836–3849 (2023). https://doi.org/10.1007/s10489-022-03719-6
DOI: https://doi.org/10.1007/s10489-022-03719-6