
Meta-Learner with Sparsified Backpropagation



Abstract:

In today's world, Deep Learning is an area of research with ever-increasing applications. It uses neural networks to bring improvements in areas such as speech recognition, computer vision, natural language processing, and several automated systems. Training deep neural networks involves careful selection of training examples, tuning of hyperparameters, and scheduling of step sizes; finding a proper combination of all of these is a tedious and time-consuming task. In recent times, a few learning-to-learn models have been proposed that can perform this learning automatically. The training time and accuracy of the model are exceedingly important. A technique named meProp was proposed to accelerate Deep Learning with reduced over-fitting: it is a sparsified back-propagation method that reduces computational cost by propagating only the most significant components of the gradient. In this paper, we propose an application of meProp to learning-to-learn models so that learning focuses on the most significant, consciously chosen parameters. We demonstrate an improvement in the accuracy of the learning-to-learn model with the proposed technique and compare its performance with that of the unmodified learning-to-learn model.
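To make the sparsified back-propagation idea concrete, the following is a minimal sketch (not the authors' implementation) of meProp-style top-k gradient selection: during the backward pass of a linear layer, only the k largest-magnitude entries of the output gradient are kept per example, so the weight-gradient computation touches far fewer nonzero columns. The function name `meprop_topk` and all shapes below are illustrative assumptions.

```python
import numpy as np

def meprop_topk(grad, k):
    """Keep only the k largest-magnitude entries of each gradient row,
    zeroing the rest -- the core idea of meProp's sparsified backprop."""
    sparse = np.zeros_like(grad)
    # indices of the top-k magnitudes in each row
    idx = np.argsort(np.abs(grad), axis=1)[:, -k:]
    rows = np.arange(grad.shape[0])[:, None]
    sparse[rows, idx] = grad[rows, idx]
    return sparse

# Toy backward pass for a linear layer y = x @ W:
# the gradient w.r.t. the output is sparsified before it is propagated.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))        # batch of 4 inputs, 8 features
grad_y = rng.standard_normal((4, 16))  # gradient w.r.t. the 16 outputs
grad_y_sparse = meprop_topk(grad_y, k=3)   # keep only 3 of 16 per example
grad_W = x.T @ grad_y_sparse           # cheaper: most entries are zero
```

In the dense case every one of the 16 gradient components per example contributes to `grad_W`; with k = 3 only 3 do, which is where the computational saving claimed for meProp comes from.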
Date of Conference: 10-11 January 2019
Date Added to IEEE Xplore: 29 July 2019
ISBN Information:
Conference Location: Noida, India

