Parallel and Distributed Training of Deep Neural Networks: A brief overview


Abstract:

Deep neural networks and deep learning are becoming important and popular techniques in modern services and applications. Training these networks is computationally intensive, owing to the extremely large number of trainable parameters and training samples. This brief overview introduces current solutions that aim to speed up the training process through parallel and distributed computation. The necessary components and strategies are described, from low-level communication protocols to high-level frameworks for distributed deep learning. Current implementations of deep learning frameworks with distributed computation capabilities are compared, and key parameters are identified to help design effective solutions.
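To make the data-parallel strategy that such frameworks commonly implement concrete, the following pure-Python sketch simulates synchronous SGD across several workers. It is an illustrative toy, not code from the paper: the `all_reduce_mean` helper is a hypothetical stand-in for the collective communication primitive (e.g. an MPI or NCCL allreduce) that real distributed frameworks provide.

```python
# Toy simulation of synchronous data-parallel SGD.
# Each "worker" holds a shard of the data and a replica of a one-parameter
# model y = w * x. After computing local gradients, an allreduce-style
# averaging step keeps all replicas identical, mimicking what collective
# communication libraries (MPI, NCCL) do in real distributed training.

def all_reduce_mean(values):
    """Stand-in for a collective allreduce: every worker receives the mean."""
    m = sum(values) / len(values)
    return [m] * len(values)

def local_grad(w, shard):
    """Gradient of the mean squared error 0.5*(w*x - y)^2 over one shard."""
    return sum((w * x - y) * x for x, y in shard) / len(shard)

def train(shards, w0=0.0, lr=0.1, steps=50):
    n = len(shards)
    weights = [w0] * n                      # one model replica per worker
    for _ in range(steps):
        grads = [local_grad(w, s) for w, s in zip(weights, shards)]
        grads = all_reduce_mean(grads)      # synchronize gradients
        weights = [w - lr * g for w, g in zip(weights, grads)]
    return weights

# Data generated from y = 3x, split across two workers.
shards = [[(1.0, 3.0), (2.0, 6.0)], [(3.0, 9.0), (4.0, 12.0)]]
final = train(shards)
print(final)  # all replicas converge toward w = 3
```

Because the gradients are averaged before every update, all replicas stay bit-identical throughout training, which is the defining property of synchronous data parallelism; asynchronous schemes relax exactly this synchronization step in exchange for stale gradients.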
Date of Conference: 08-10 July 2020
Date Added to IEEE Xplore: 27 July 2020
Print on Demand(PoD) ISSN: 1543-9259
Conference Location: Reykjavík, Iceland