
Universal Source Coding of Deep Neural Networks



Abstract:

Deep neural networks have shown incredible performance for inference tasks in a variety of domains. Unfortunately, most current deep networks are enormous cloud-based structures that require significant storage space, which limits scaling of deep learning as a service (DLaaS). This paper is concerned with finding universal lossless compressed representations of deep feedforward networks with synaptic weights drawn from discrete sets. The basic insight that allows much less rate than naive approaches is the recognition that the bipartite graph layers of feedforward networks have a kind of permutation invariance to the labeling of nodes, in terms of inferential operation. We provide efficient algorithms to dissipate this irrelevant uncertainty and then use arithmetic coding to nearly achieve the entropy bound in a universal manner.
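To make the abstract's key insight concrete, here is a minimal NumPy sketch (illustrative only, not the paper's algorithm or its arithmetic-coding stage): jointly permuting a hidden layer's rows of incoming weights and columns of outgoing weights relabels the hidden nodes without changing the network's function, so a compressor can map every labeling to one canonical representative and save roughly log2(n!) bits per layer. The function names `canonicalize_layer` and `labeling_redundancy_bits` are hypothetical, and ties between identical incoming-weight vectors would need extra care in a real coder.

```python
import numpy as np
from math import lgamma, log


def canonicalize_layer(W_in, W_out):
    """Pick one canonical hidden-node ordering for a bipartite layer.

    W_in  has shape (n_hidden, n_inputs):  row i = incoming weights of node i.
    W_out has shape (n_outputs, n_hidden): col i = outgoing weights of node i.
    Jointly permuting the rows of W_in and the columns of W_out relabels the
    hidden nodes without changing the network's function, so a compressor
    need not spend bits distinguishing the up-to-n! equivalent orderings.
    """
    # Sort hidden nodes lexicographically by their incoming weight vectors
    # (np.lexsort treats its last key as primary, hence the [::-1]).
    order = np.lexsort(W_in.T[::-1])
    return W_in[order], W_out[:, order]


def labeling_redundancy_bits(n_hidden):
    """Upper bound on the bits saved per layer: log2(n_hidden!)."""
    return lgamma(n_hidden + 1) / log(2)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synaptic weights drawn from a discrete set, as in the paper's setting.
    W_in = rng.integers(-8, 9, size=(5, 4))
    W_out = rng.integers(-8, 9, size=(3, 5))
    perm = rng.permutation(5)

    a = canonicalize_layer(W_in, W_out)
    b = canonicalize_layer(W_in[perm], W_out[:, perm])
    # Both labelings collapse to the same canonical representative.
    assert np.array_equal(a[0], b[0]) and np.array_equal(a[1], b[1])
    print(f"~{labeling_redundancy_bits(5):.1f} bits of labeling "
          "redundancy removed for 5 hidden nodes")
```

In the paper's framing, an entropy coder applied after such a canonicalization step no longer pays for the n! functionally equivalent node labelings, which is where the rate savings over naive per-weight coding come from.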
Date of Conference: 04-07 April 2017
Date Added to IEEE Xplore: 11 May 2017
Electronic ISSN: 2375-0359
Conference Location: Snowbird, UT, USA

