
LMOT: Efficient Light-Weight Detection and Tracking in Crowds


Simplified DLA-34 generates crucial detection feature maps from the current image frame, encoding spatial and semantic representations. The linear transformer takes the previous i...


Abstract:

Multi-object tracking is a vital component of many robotics and computer vision applications. However, existing multi-object tracking techniques trade computational runtime for tracking accuracy, which makes such pipelines hard to deploy in real-time applications. This paper introduces a novel real-time model, LMOT, i.e., Light-weight Multi-Object Tracker, that performs joint pedestrian detection and tracking. LMOT introduces a simplified DLA-34 encoder network that extracts detection features from the current image in a computationally efficient manner. Furthermore, we generate efficient tracking features using a linear transformer applied to the prior image frame and its corresponding detection heatmap. LMOT then fuses the detection and tracking feature maps in a multi-layer scheme and performs a two-stage online data association, relying on a Kalman filter, to generate tracklets. We evaluated our model on the challenging real-world MOT16/17/20 datasets, showing that LMOT significantly outperforms state-of-the-art trackers in runtime while maintaining high robustness. LMOT is approximately ten times faster than state-of-the-art trackers while trailing them by only 3.8% in tracking accuracy on average, yielding a much computationally lighter model.
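The two-stage online data association mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: a constant-velocity predict/update stands in for a full Kalman filter, greedy IoU matching stands in for an assignment solver, the thresholds (0.5, 0.3) are illustrative, and all names (`Track`, `two_stage_associate`, etc.) are hypothetical.

```python
# Hypothetical two-stage association sketch: stage 1 matches predicted
# tracks to high-confidence detections; stage 2 rescues remaining tracks
# with low-confidence detections at a looser threshold.

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    return inter / (area(a) + area(b) - inter + 1e-9)

class Track:
    """Tracklet with a constant-velocity motion model (Kalman stand-in)."""
    def __init__(self, box):
        self.box, self.vel = box, (0.0, 0.0, 0.0, 0.0)
    def predict(self):
        self.box = tuple(c + v for c, v in zip(self.box, self.vel))
    def update(self, det_box):
        self.vel = tuple(d - c for c, d in zip(self.box, det_box))
        self.box = det_box

def greedy_match(tracks, dets, thresh):
    """Greedily pair each track with its best-IoU unused detection."""
    matches, used = [], set()
    for t in tracks:
        best, best_iou = None, thresh
        for di, d in enumerate(dets):
            if di not in used and iou(t.box, d) >= best_iou:
                best, best_iou = di, iou(t.box, d)
        if best is not None:
            matches.append((t, dets[best]))
            used.add(best)
    rest_t = [t for t in tracks if t not in [m[0] for m in matches]]
    rest_d = [d for i, d in enumerate(dets) if i not in used]
    return matches, rest_t, rest_d

def two_stage_associate(tracks, high_dets, low_dets):
    """Predict all tracks, then associate in two stages."""
    for t in tracks:
        t.predict()
    m1, rest_t, _ = greedy_match(tracks, high_dets, 0.5)   # stage 1
    m2, lost_t, _ = greedy_match(rest_t, low_dets, 0.3)    # stage 2
    for t, d in m1 + m2:
        t.update(d)
    return m1 + m2, lost_t
```

Splitting detections by confidence lets low-scoring (e.g. occluded) boxes confirm an existing tracklet without spawning spurious new tracks, which is why the second stage uses a looser IoU gate.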
Published in: IEEE Access ( Volume: 10)
Page(s): 83085 - 83095
Date of Publication: 08 August 2022
Electronic ISSN: 2169-3536
