Progressive Cross-Modal Association Learning for Unsupervised Visible-Infrared Person Re-Identification | IEEE Journals & Magazine | IEEE Xplore

Progressive Cross-Modal Association Learning for Unsupervised Visible-Infrared Person Re-Identification


Abstract:

Unsupervised visible-infrared person re-identification (USL-VI-ReID) aims to explore cross-modal associations and learn modality-invariant representations without manual labels. The field provides flexible and economical methods for person re-identification across light and dark scenes. Existing approaches rely on cluster-level strong association methods, such as graph matching and optimal transport, to bridge modal differences, which may mis-link clusters and introduce noise. To overcome this limitation and gradually acquire reliable cross-modal associations, we propose a Progressive Cross-modal Association Learning (PCAL) method for USL-VI-ReID. Specifically, PCAL integrates Triple-modal Adversarial Learning (TAL), Cross-modal Neighbor Expansion (CNE), and Modality-invariant Contrastive Learning (MCL) into a unified framework. TAL fully exploits the Channel Augmentation (CA) technique to reduce modal differences, which facilitates the subsequent mining of cross-modal associations. Furthermore, we identify a modal bias problem in existing clustering methods that hinders the effective establishment of cross-modal associations. To address this problem, CNE is proposed to balance the contribution of cross-modal neighbor information, linking as many potential cross-modal neighbors as possible. Finally, MCL is introduced to refine the cross-modal associations and learn modality-invariant representations. Extensive experiments on the SYSU-MM01 and RegDB datasets demonstrate the competitive performance of the PCAL method. Code is available at https://github.com/YimingYang23/PCA_USLVIReID.
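The Channel Augmentation (CA) technique referenced in the abstract is commonly realized by replacing all three channels of a visible RGB image with one randomly chosen channel, producing a grayscale-like "third modality" closer in appearance to infrared imagery. The sketch below illustrates that idea only; the function name and NumPy-based formulation are assumptions, not the authors' implementation.

```python
import numpy as np

def channel_augment(img, rng=None):
    """Randomly replace all three RGB channels of a visible image with one
    of its own channels, yielding a single-channel-like image that narrows
    the visual gap to the infrared modality (hypothetical CA sketch).

    img: H x W x 3 array. Returns an array of the same shape.
    """
    rng = rng or np.random.default_rng()
    c = int(rng.integers(0, 3))                 # pick R, G, or B at random
    return np.repeat(img[:, :, c:c + 1], 3, axis=2)

# Usage: augment a dummy 4x4 RGB image
img = np.arange(4 * 4 * 3).reshape(4, 4, 3)
aug = channel_augment(img)
```

After augmentation, the three output channels are identical copies of the selected input channel, so the image carries shape but no color information, which is the property such augmentations exploit when aligning visible and infrared features.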
Page(s): 1290 - 1304
Date of Publication: 08 January 2025

