
SequenceOut: Boosting CNNs by Freezing Layers



Abstract:

Convolutional neural networks (CNNs) are a powerful tool for various computer vision tasks, demonstrating exceptional performance in image classification, object detection, and segmentation. However, traditional training methods often require meticulous hyperparameter tuning, architectural adjustments, or the introduction of additional data through techniques such as data augmentation to achieve optimal accuracy. This letter introduces an innovative training strategy that leverages layer freezing to enhance the training process while keeping the model's architecture and hyperparameters unchanged. By selectively and progressively freezing certain hidden layers in the CNN, we prevent the model from reaching a saturation point. This approach effectively reduces the backpropagation parameter space, facilitating more focused and efficient learning in the remaining layers.
Published in: IEEE Signal Processing Letters (Volume: 32)
Page(s): 1401 - 1405
Date of Publication: 20 March 2025
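The abstract describes selectively and progressively freezing hidden layers so that backpropagation updates only the remaining parameters. A minimal PyTorch-style sketch of that idea is shown below; the network, the freezing schedule (`freeze_at`), and the choice of which blocks to freeze are all illustrative assumptions, not the letter's exact method.

```python
import torch
import torch.nn as nn

# Illustrative small CNN built from hidden blocks that can be frozen
# one at a time during training.
class SmallCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.blocks = nn.ModuleList([
            nn.Sequential(
                nn.Conv2d(3 if i == 0 else 16, 16, kernel_size=3, padding=1),
                nn.ReLU(),
            )
            for i in range(4)
        ])
        self.head = nn.Linear(16, 10)

    def forward(self, x):
        for block in self.blocks:
            x = block(x)
        # Global average pool, then classify.
        return self.head(x.mean(dim=(2, 3)))

def freeze_block(model, idx):
    """Freeze one hidden block: its parameters stop receiving gradients,
    shrinking the parameter space backpropagation has to cover."""
    for p in model.blocks[idx].parameters():
        p.requires_grad = False

model = SmallCNN()

# Hypothetical schedule: freeze earlier blocks as training progresses
# (epoch -> block index; the values here are placeholders).
freeze_at = {3: 0, 6: 1}
for epoch in range(8):
    if epoch in freeze_at:
        freeze_block(model, freeze_at[epoch])
    # ... normal training step here, with the optimizer built over
    # filter(lambda p: p.requires_grad, model.parameters()) so frozen
    # blocks are excluded from the update.

trainable = sum(p.numel() for p in model.parameters() if p.requires_grad)
total = sum(p.numel() for p in model.parameters())
print(trainable < total)  # frozen blocks no longer count toward updates
```

The key mechanism is `requires_grad = False`: autograd skips those tensors entirely, so gradient computation and the optimizer step both get cheaper while the architecture and hyperparameters stay unchanged, matching the abstract's claim.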
