Flowchart of the Co-training Algorithm Based on Cyclical Learning Rates.
Abstract:
Deep learning has excelled in image classification, but noisy labels in large datasets pose a significant challenge, impairing performance and generalization. To tackle this, we propose a novel co-training method based on cyclical learning rates. The method trains two networks simultaneously; each selects clean samples according to their loss values to optimize the other network's parameters, reducing overfitting and confirmation bias. The cyclical learning rate lets the networks oscillate between underfitting and overfitting, sharpening the distinction between clean and noisy samples. Our approach improves noise-detection accuracy and robustness against label noise on datasets such as CIFAR-10, CIFAR-100, and Clothing1M. In particular, on CIFAR-10 and CIFAR-100 with a 40% symmetric noise ratio, and on Clothing1M, it outperforms the most closely related method, O2U-Net, by 2.59%, 6.11%, and 0.57% in test accuracy, respectively, demonstrating superior noise resistance and classification accuracy under various noise conditions. Comprehensive experiments confirm the effectiveness of our method, advancing image classification in the presence of noisy labels.
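The core loop the abstract describes — two networks cross-selecting small-loss ("clean") samples for each other under a cyclical learning rate — can be sketched as a toy NumPy example. This is an illustrative reconstruction, not the authors' implementation: the logistic-regression "networks", the triangular schedule, and all hyperparameters (`keep`, `cycle`, `max_lr`) are assumptions made for the sketch.

```python
import numpy as np

def cyclic_lr(step, base_lr=0.01, max_lr=0.5, cycle=20):
    # Triangular cyclical schedule: ramps base_lr -> max_lr -> base_lr each cycle.
    pos = abs((step % cycle) / (cycle / 2) - 1.0)  # 1 -> 0 -> 1 over a cycle
    return base_lr + (max_lr - base_lr) * (1.0 - pos)

class LogReg:
    """Stand-in 'network': binary logistic regression trained by gradient descent."""
    def __init__(self, dim, seed):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(scale=0.01, size=dim)
        self.b = 0.0

    def prob(self, X):
        return 1.0 / (1.0 + np.exp(-(X @ self.w + self.b)))

    def losses(self, X, y):
        # Per-sample cross-entropy; small loss is treated as evidence of a clean label.
        p = np.clip(self.prob(X), 1e-7, 1 - 1e-7)
        return -(y * np.log(p) + (1 - y) * np.log(1 - p))

    def step(self, X, y, lr):
        g = self.prob(X) - y
        self.w -= lr * X.T @ g / len(y)
        self.b -= lr * g.mean()

def co_train(X, y_noisy, epochs=60, keep=0.7):
    """Co-training: each model's small-loss picks are used to update its peer."""
    a, b = LogReg(X.shape[1], seed=0), LogReg(X.shape[1], seed=1)
    k = int(keep * len(y_noisy))
    for t in range(epochs):
        lr = cyclic_lr(t)
        # Each network ranks samples by its own loss and keeps the smallest k...
        sel_a = np.argsort(a.losses(X, y_noisy))[:k]
        sel_b = np.argsort(b.losses(X, y_noisy))[:k]
        # ...and the *peer* updates on that selection (cross-update is what
        # reduces confirmation bias relative to self-training).
        a.step(X[sel_b], y_noisy[sel_b], lr)
        b.step(X[sel_a], y_noisy[sel_a], lr)
    return a, b
```

The cross-update is the key design choice: if each model filtered samples for itself, its own mistakes would be reinforced, whereas exchanging selections lets one model's view correct the other's.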
Published in: IEEE Access (Volume 13)