Drop Loss for Person Attribute Recognition With Imbalanced Noisy-Labeled Samples


Abstract:

Person attribute recognition (PAR) aims to simultaneously predict multiple attributes of a person. Existing deep learning-based PAR methods have achieved impressive performance. Unfortunately, these methods usually ignore the fact that the number of noisy-labeled samples differs across attributes in PAR training datasets, which leads to suboptimal performance. To address this problem of imbalanced noisy-labeled samples, we propose a novel and effective loss, called the drop loss, for PAR. In the drop loss, attributes are treated differently in an easy-to-hard manner. In particular, noisy-labeled candidates, which are identified according to their gradient norms, are dropped with a higher drop rate for harder attributes. This adaptively alleviates the adverse effect of imbalanced noisy-labeled samples on model learning. To illustrate the effectiveness of the proposed loss, we train a simple ResNet-50 model with the drop loss and term it DropNet. Experimental results on two representative PAR tasks (facial attribute recognition and pedestrian attribute recognition) demonstrate that the proposed DropNet achieves performance comparable to or better than several state-of-the-art PAR methods in terms of both balanced accuracy and classification accuracy.
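To make the mechanism described above concrete, below is a minimal PyTorch sketch of a drop-style loss for multi-attribute binary classification. It is an illustrative assumption of the idea, not the paper's exact formulation: the gradient-norm proxy |sigmoid(z) − y| (the magnitude of the BCE gradient with respect to the logit), the per-attribute drop rates, and the function name drop_loss are all choices made here for the example.

```python
# Illustrative sketch of a "drop loss": per attribute, the samples with the
# largest loss-gradient norms are treated as noisy-label candidates and
# excluded from the loss, with harder attributes given a higher drop rate.
# Details are assumptions for this example, not the paper's formulation.
import torch
import torch.nn.functional as F


def drop_loss(logits, targets, drop_rates):
    """logits, targets: (batch, num_attrs); drop_rates: (num_attrs,) in [0, 1)."""
    batch, num_attrs = logits.shape
    with torch.no_grad():
        # For BCE with logits, the gradient w.r.t. the logit is sigmoid(z) - y,
        # so its absolute value serves as the per-sample gradient-norm proxy.
        grad_norm = (torch.sigmoid(logits) - targets).abs()
    per_sample = F.binary_cross_entropy_with_logits(logits, targets, reduction="none")
    mask = torch.ones_like(per_sample)
    for a in range(num_attrs):
        n_drop = int(float(drop_rates[a]) * batch)
        if n_drop > 0:
            # Drop the n_drop samples with the largest gradient norms
            # for this attribute (the noisy-label candidates).
            _, idx = grad_norm[:, a].topk(n_drop)
            mask[idx, a] = 0.0
    return (per_sample * mask).sum() / mask.sum().clamp(min=1.0)


if __name__ == "__main__":
    torch.manual_seed(0)
    logits = torch.randn(8, 4, requires_grad=True)
    targets = torch.randint(0, 2, (8, 4)).float()
    # Hypothetical drop rates: harder attributes get a higher rate.
    drop_rates = torch.tensor([0.10, 0.10, 0.25, 0.25])
    loss = drop_loss(logits, targets, drop_rates)
    loss.backward()
    print(loss.item())
```

Note that the candidate selection runs under torch.no_grad(), so identifying which samples to drop does not itself contribute gradients; only the masked BCE terms are backpropagated.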
Published in: IEEE Transactions on Cybernetics (Volume: 53, Issue: 11, November 2023)
Page(s): 7071 - 7084
Date of Publication: 23 May 2022

PubMed ID: 35604981
