Appearance and action adaptive target tracking method
Citation: XIONG Junyao, WANG Rong, SUN Yibo. Appearance and action adaptive target tracking method[J]. Journal of Beijing University of Aeronautics and Astronautics, 2022, 48(8): 1525-1533.
Authors: XIONG Junyao, WANG Rong, SUN Yibo
Affiliation: School of Information Technology and Network Security, People's Public Security University of China, Beijing 100038, China
Funding: National Natural Science Foundation of China (62076246)
Abstract: To reduce the impact of appearance deformation caused by target motion on tracking, an appearance and action adaptive target tracking method is proposed, built on DaSiamese-RPN. An appearance and action adaptive update module is introduced into the subnetwork of the Siamese network to fuse the target's spatio-temporal information and action features. Two Euclidean distances are used to measure the global and local differences between the ground-truth and predicted feature maps, and the loss function is constructed as their weighted fusion, strengthening the correlation of global and local information between the predicted and ground-truth target feature maps. Tests on the VOT2016, VOT2018, VOT2019, and OTB100 datasets show that the expected average overlap improves by 4.5% and 6.1% on VOT2016 and VOT2018, respectively; on VOT2019, accuracy improves by 0.4% while expected average overlap drops by 1%; on OTB100, the tracking success rate improves by 0.3% and precision by 0.2%.

Keywords: target tracking; appearance and action adaptation; Siamese network; feature fusion; appearance deformation
Received: 2021-10-09

Appearance and action adaptive target tracking method
Institution:School of Information Technology and Network Security, People's Public Security University of China, Beijing 100038, China
Abstract: On the basis of DaSiamese-RPN, a target tracking approach with appearance and action adaptation is proposed to limit the effect of appearance deformation on tracking while the target is moving. First, the appearance and action adaptive module is introduced into the subnetwork of the Siamese network, integrating the target's spatio-temporal information and action features. Second, the global and local differences between the ground-truth and predicted feature maps are measured with two Euclidean distances, and the loss function is constructed as their weighted fusion, strengthening the correlation between global and local information. Finally, tests were conducted on the VOT2016, VOT2018, VOT2019, and OTB100 datasets. The experimental results show that the expected average overlap improved by 4.5% and 6.1% on VOT2016 and VOT2018, respectively. On VOT2019, accuracy increased by 0.4% and expected average overlap decreased by 1%; on OTB100, the tracking success rate improved by 0.3% and precision by 0.2%.
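The weighted global/local loss described in the abstract might be sketched as follows. This is a minimal illustrative assumption, not the paper's exact formulation: the weight `alpha`, the patch size, and the patch-wise definition of the "local" Euclidean distance are all hypothetical choices made here for concreteness.

```python
import numpy as np

def global_local_loss(pred, target, alpha=0.5, patch=4):
    """Hypothetical sketch of a weighted loss combining a global
    Euclidean distance over full feature maps with a local term
    averaged over non-overlapping patches."""
    # Global term: Euclidean (Frobenius) distance between the
    # whole predicted and ground-truth feature maps.
    g = np.linalg.norm(pred - target)

    # Local term: mean Euclidean distance over patch-wise blocks,
    # so local mismatches are not washed out by the global average.
    h, w = pred.shape
    local_dists = []
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            local_dists.append(
                np.linalg.norm(pred[i:i + patch, j:j + patch] -
                               target[i:i + patch, j:j + patch]))
    local = float(np.mean(local_dists))

    # Weighted fusion of the two distance terms.
    return alpha * g + (1 - alpha) * local
```

With identical feature maps both terms vanish and the loss is zero; any global or patch-level deviation increases it, which is the correlation-strengthening effect the abstract attributes to the fused loss.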
Keywords: target tracking; appearance and action adaptation; Siamese network; feature fusion; appearance deformation