Weight Dropout for Preventing Neural Networks from Overfitting


Abstract:

This paper introduces an enhanced neural network regularization method, called weight dropout, to prevent deep neural networks from overfitting. In the proposed method, a fully connected layer used jointly with weight dropout becomes a collection of layers in which the weights between nodes are dropped randomly during training. To realize this regularization, we propose building blocks that combine our weight dropout mask with a CNN. The proposed method is compared against previous methods on image classification and segmentation tasks, and the results show that it achieves strong accuracy on several datasets.
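The abstract does not include the paper's implementation details, but the idea of randomly dropping individual weights of a fully connected layer during training can be sketched in a DropConnect-style PyTorch layer. The class name, the drop probability p, and the inverted-dropout rescaling below are illustrative assumptions, not the authors' exact formulation.

    # Minimal sketch of weight dropout: each weight of a fully connected
    # layer is zeroed independently with probability p during training.
    # Assumption: mask is resampled every forward pass and surviving
    # weights are rescaled so their expected magnitude is unchanged.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class WeightDropLinear(nn.Linear):  # hypothetical name
        def __init__(self, in_features, out_features, p=0.5):
            super().__init__(in_features, out_features)
            self.p = p  # probability of dropping each individual weight

        def forward(self, x):
            if self.training:
                # Sample a fresh binary keep-mask over the weight matrix.
                mask = torch.bernoulli(torch.full_like(self.weight, 1 - self.p))
                weight = self.weight * mask / (1 - self.p)
            else:
                weight = self.weight  # full weights at evaluation time
            return F.linear(x, weight, self.bias)

    # Usage: substitute for nn.Linear in a CNN classifier head.
    layer = WeightDropLinear(128, 10, p=0.5)
    layer.train()
    out = layer(torch.randn(4, 128))  # shape (4, 10)

Unlike standard dropout, which removes whole activations, this masks individual connections, which is consistent with the abstract's description of dropping "the weights between nodes."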
Date of Conference: 18-21 December 2020
Date Added to IEEE Xplore: 07 July 2021
Conference Location: Daegu, Korea (South)

