
Simple Self-Distillation Learning for Noisy Image Classification


Abstract:

In computer vision, considerable success has been achieved in image classification. However, real-world applications can be degraded by noise corruption, so methods for handling noise are crucial. Knowledge distillation, which uses the knowledge of a teacher model trained on clean images to train a student model on noisy images, is a promising technique because it requires no special modification of the classifier. However, clean images are typically unavailable in practice. To address this issue, we propose a novel knowledge distillation method that does not require clean images. Leveraging the property that the feature extractor of a classifier naturally discards features irrelevant to classification, we simply train the teacher model on noisy images, under the assumption that such a teacher can provide pseudo-clean features. Our experiments demonstrate that the proposed method achieves classification performance comparable to conventional methods, even without clean images.
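
As a rough illustration of the idea described in the abstract, the following is a minimal PyTorch sketch of feature-level distillation from a teacher trained on noisy data: the teacher's penultimate-layer features are treated as pseudo-clean targets, and the student is trained with a cross-entropy loss plus a feature-matching term. The ResNet-18 backbone, the MSE feature loss, and the weight alpha are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of feature-level self-distillation for noisy image
# classification. Not the authors' code; architecture and losses are assumed.
import torch
import torch.nn.functional as F
from torchvision.models import resnet18


def build_model(num_classes: int = 10):
    """Backbone feature extractor + linear classifier (assumed architecture)."""
    return resnet18(num_classes=num_classes)


def extract_features(model, x):
    """Return penultimate-layer (pre-fc) features of a torchvision ResNet."""
    feats = model.maxpool(model.relu(model.bn1(model.conv1(x))))
    feats = model.layer4(model.layer3(model.layer2(model.layer1(feats))))
    feats = model.avgpool(feats)
    return torch.flatten(feats, 1)


def distillation_step(student, teacher, images, labels, optimizer, alpha=0.5):
    """One training step: cross-entropy on the (noisy) data plus an MSE term
    pulling student features toward the teacher's pseudo-clean features."""
    teacher.eval()
    with torch.no_grad():
        t_feats = extract_features(teacher, images)

    s_feats = extract_features(student, images)
    logits = student.fc(s_feats)

    loss = F.cross_entropy(logits, labels) + alpha * F.mse_loss(s_feats, t_feats)

    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

In this sketch the teacher and student share the same architecture and the same noisy training images, so the procedure amounts to self-distillation; no clean images are involved at any stage.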
Date of Conference: 08-11 October 2023
Date Added to IEEE Xplore: 11 September 2023
Conference Location: Kuala Lumpur, Malaysia
