Transformer-based Models for Enhanced Amur Tiger Re-Identification


Abstract:

Rapid urban growth, with its profound impact on natural habitats, intensifies the global risk faced by many wildlife species, driving them closer to extinction through habitat destruction, illegal hunting, and the challenges posed by climate change. The urgency of this situation is highlighted by the current status of the Amur tiger, emphasising the need for continuous observation to ensure its survival. Within this context, re-identification (Re-ID) emerges as a method for recognising individual animals based on previously captured data. This study is dedicated to the re-identification of Amur tigers, employing the Amur Tiger Re-identification in the Wild (ATRW) dataset and placing significant emphasis on assessing various deep learning architectures, particularly transformer-based models. Several neural network architectures, including Vision Transformer (ViT), Multiple Granularity Network (MGN), and Neighbor Transformer (NFormer), were explored. The results indicate that transformer-based methods hold substantial promise for further advancements in re-identification tasks. Notably, the ViT model achieved an mAP score of 80.8, while the combination of ViT with MGN yielded an mAP of 83.4, surpassing the best benchmark method by 9.3% in a single-camera scenario. The NFormer architecture demonstrated comparable results, with an mAP score of 81.1.
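The mAP scores reported above follow the standard re-identification protocol: each query image's embedding is ranked against a gallery by similarity, average precision is computed per query, and the mean is taken over all queries. The sketch below illustrates that metric with cosine similarity; it is a minimal, generic implementation for intuition, not the paper's evaluation code, and the function names and the choice of cosine similarity are assumptions.

```python
import numpy as np

def average_precision(query_emb, gallery_embs, gallery_ids, query_id):
    """Rank the gallery by cosine similarity to one query and compute AP.

    A gallery image is a correct match when its identity label equals
    the query's identity label.
    """
    q = query_emb / np.linalg.norm(query_emb)
    g = gallery_embs / np.linalg.norm(gallery_embs, axis=1, keepdims=True)
    sims = g @ q                      # cosine similarity to each gallery item
    order = np.argsort(-sims)         # descending similarity
    matches = (gallery_ids[order] == query_id).astype(float)
    if matches.sum() == 0:
        return 0.0
    # Precision at each rank, averaged over the ranks of the true matches.
    precision_at_k = np.cumsum(matches) / (np.arange(len(matches)) + 1)
    return float((precision_at_k * matches).sum() / matches.sum())

def mean_average_precision(query_embs, query_ids, gallery_embs, gallery_ids):
    """mAP: the mean of per-query average precision."""
    return float(np.mean([
        average_precision(q, gallery_embs, gallery_ids, qid)
        for q, qid in zip(query_embs, query_ids)
    ]))
```

In this framing, the compared architectures (ViT, MGN, NFormer) differ only in how they produce the embeddings; the ranking and mAP computation are identical across models, which is what makes the reported scores directly comparable.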
Date of Conference: 25-27 January 2024
Date Added to IEEE Xplore: 14 February 2024
Conference Location: Stará Lesná, Slovakia
