Our framework learns to dynamically balance local losses to optimize a global loss via meta-learning. A local loss block attached to a hidden layer captures local error s...
Abstract:
Standard training of deep neural networks relies on a global, fixed loss function. For more effective training, dynamic loss functions have recently been proposed. However, a dynamic global loss function is not flexible enough to train the layers of complex deep neural networks differentially. In this paper, we propose a general framework that learns to adaptively train each layer of a deep neural network via meta-learning. Our framework leverages the local error signals from the layers and identifies which layer needs more training at every iteration. The proposed method also improves the local loss functions with our minibatch-wise dropout and a cross-validation loop to alleviate meta-overfitting. Experiments show that our method achieves performance competitive with state-of-the-art methods on the popular image-classification benchmarks CIFAR-10 and CIFAR-100. Surprisingly, our method enables training deep neural networks without skip-connections using dynamically weighted local loss functions.
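The core idea — per-layer local loss blocks whose weights are meta-learned so that a lookahead training step minimizes a held-out global loss — can be illustrated with a toy NumPy sketch. This is not the paper's implementation: the network, the `sgd_step` helper, and the finite-difference meta-gradient (standing in for backprop-based meta-learning) are all illustrative assumptions, and the held-out minibatch plays the role of the paper's cross-validation loop.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 3-class problem: labels come from a random linear rule, so it is learnable.
X = rng.normal(size=(300, 4))
y = (X @ rng.normal(size=(4, 3))).argmax(axis=1)
X_tr, y_tr, X_val, y_val = X[:200], y[:200], X[200:], y[200:]

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def ce(logits, labels):
    p = softmax(logits)
    return -np.log(p[np.arange(len(labels)), labels] + 1e-12).mean()

# Parameters: hidden layer W1, local loss block readout A, global output head W2.
W1 = rng.normal(scale=0.5, size=(4, 8))
A  = rng.normal(scale=0.5, size=(8, 3))
W2 = rng.normal(scale=0.5, size=(8, 3))
alpha = np.zeros(2)  # meta-parameters: balance between (local, global) losses

def forward(Xb, W1, A, W2):
    h = np.tanh(Xb @ W1)
    return h, h @ A, h @ W2  # hidden activations, local logits, global logits

def sgd_step(W1, A, W2, alpha, Xb, yb, lr=0.1):
    """One SGD step on w_local * CE_local + w_global * CE_global."""
    w = softmax(alpha)                       # dynamic loss weights
    h, ll, gl = forward(Xb, W1, A, W2)
    onehot = np.eye(3)[yb]
    dll = (softmax(ll) - onehot) / len(yb)   # grad of CE w.r.t. local logits
    dgl = (softmax(gl) - onehot) / len(yb)   # grad of CE w.r.t. global logits
    dA  = w[0] * h.T @ dll
    dW2 = w[1] * h.T @ dgl
    dh  = w[0] * dll @ A.T + w[1] * dgl @ W2.T
    dW1 = Xb.T @ (dh * (1 - h ** 2))         # tanh backprop
    return W1 - lr * dW1, A - lr * dA, W2 - lr * dW2

def val_loss_after_step(alpha, Xb, yb):
    """Global loss on held-out data after a lookahead step with these weights."""
    W1s, As, W2s = sgd_step(W1, A, W2, alpha, Xb, yb)
    return ce(forward(X_val, W1s, As, W2s)[2], y_val)

loss_before = ce(forward(X_val, W1, A, W2)[2], y_val)

for step in range(200):
    idx = rng.choice(200, size=32, replace=False)
    Xb, yb = X_tr[idx], y_tr[idx]
    # Meta step: finite-difference gradient of the held-out loss w.r.t. alpha.
    eps, g = 1e-2, np.zeros_like(alpha)
    for i in range(2):
        e = np.zeros(2); e[i] = eps
        g[i] = (val_loss_after_step(alpha + e, Xb, yb)
                - val_loss_after_step(alpha - e, Xb, yb)) / (2 * eps)
    alpha -= 0.5 * g
    # Inner step: train the network with the current dynamic loss weights.
    W1, A, W2 = sgd_step(W1, A, W2, alpha, Xb, yb)

loss_after = ce(forward(X_val, W1, A, W2)[2], y_val)
print(loss_before, loss_after, softmax(alpha))
```

The sketch keeps only one hidden layer and two loss terms; the framework in the paper attaches a local loss block to each hidden layer and computes the meta-gradient analytically rather than by finite differences.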
Published in: IEEE Access (Volume: 9)