Target-Aware Tracking with Long-term Context Attention

DOI: 10.48550/arxiv.2302.13840 Publication Date: 2023-01-01
ABSTRACT
Most deep trackers still follow the guidance of the siamese paradigm and use a template that contains only the target without any contextual information, which makes it difficult for the tracker to cope with large appearance changes, rapid target movement, and attraction from similar objects. To alleviate the above problems, we propose a long-term context attention (LCA) module that can perform extensive information fusion on the target and its context from long-term frames, calculating target correlation while enhancing target features. The complete contextual information contains the location of the target as well as the state around it. LCA uses the target state from the previous frame to exclude the interference of similar objects and complex backgrounds, thus accurately locating the target and enabling the tracker to obtain higher robustness and regression accuracy. By embedding the LCA module in a Transformer, we build a powerful online tracker with a target-aware backbone, termed TATrack. In addition, we propose a dynamic online update algorithm based on the classification confidence of historical information, without additional calculation burden. Our tracker achieves state-of-the-art performance on multiple benchmarks, with 71.1% AUC, 89.3% NP, and 73.0% AO on LaSOT, TrackingNet, and GOT-10k, respectively. The code and trained models are available at https://github.com/hekaijie123/TATrack.
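The core idea of LCA, as described above, is a single attention pass in which search-region tokens attend jointly to the initial template, long-term context from previous frames, and themselves, so that correlation and feature enhancement happen in one operation. The following is a minimal conceptual sketch of that fusion pattern in NumPy; the function name, shapes, and identity projections are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def long_term_context_attention(search, template, context, d=64):
    """Conceptual sketch (not the paper's code): fuse search-region
    tokens with the initial template and long-term context tokens via
    one scaled dot-product attention pass.

    search, template, context: (n_tokens, d) token arrays.
    Returns enhanced search features of shape (n_search, d).
    """
    # Keys/values are the concatenation of template, context, and the
    # search tokens themselves, so target correlation and feature
    # enhancement are computed in a single attention operation.
    kv = np.concatenate([template, context, search], axis=0)
    q, k, v = search, kv, kv            # identity projections, for brevity
    attn = softmax(q @ k.T / np.sqrt(d), axis=-1)
    return attn @ v

rng = np.random.default_rng(0)
out = long_term_context_attention(rng.normal(size=(16, 64)),   # search tokens
                                  rng.normal(size=(8, 64)),    # template tokens
                                  rng.normal(size=(8, 64)))    # context tokens
print(out.shape)  # (16, 64)
```

In the full tracker this block would sit inside a Transformer backbone with learned query/key/value projections and multiple heads; the sketch only shows how previous-frame context enters the same attention pass as the template and search region.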