Abstract:
Convolutional neural networks (CNNs) are a powerful tool for various computer vision tasks, demonstrating exceptional performance in image classification, object detection, and segmentation. However, traditional training methods often require meticulous hyperparameter tuning, architectural adjustments, or the introduction of additional data through techniques such as data augmentation to achieve optimal accuracy. This letter introduces an innovative training strategy that leverages layer freezing to enhance the training process while keeping the model's architecture and hyperparameters unchanged. By selectively and progressively freezing certain hidden layers in the CNN, we prevent the model from reaching a saturation point. This approach effectively reduces the backpropagation parameter space, facilitating more focused and efficient learning in the remaining layers.
Published in: IEEE Signal Processing Letters ( Volume: 32)
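The abstract describes progressively freezing hidden layers during training so that backpropagation updates only a shrinking set of remaining parameters. The letter's exact schedule is not given here, so the following is a minimal illustrative sketch under an assumed policy (freeze one more layer, input side first, every fixed number of epochs, never freezing the final layer); the function name `frozen_layers` and the parameter `freeze_every` are hypothetical, not from the letter.

```python
def frozen_layers(epoch, num_layers, freeze_every=5):
    """Return the set of layer indices frozen at a given epoch.

    Assumed schedule (illustrative only): starting from the input side,
    one additional hidden layer is frozen every `freeze_every` epochs.
    The last layer is excluded so some parameters always stay trainable,
    keeping the effective backpropagation parameter space nonempty.
    """
    n_frozen = min(epoch // freeze_every, num_layers - 1)
    return set(range(n_frozen))


# Example: a 6-layer network with the default schedule.
# At epoch 0 nothing is frozen; by epoch 12 the first two layers are frozen.
print(frozen_layers(0, 6))   # set()
print(frozen_layers(12, 6))  # {0, 1}
```

In a framework such as PyTorch, the returned indices would typically be applied by setting `requires_grad = False` on the parameters of those layers, which removes them from gradient computation and shrinks the backpropagation parameter space as the abstract describes.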