Introduction to Model Compression Knowledge Distillation


Abstract:

In recent years, with the rapid development of deep neural networks, deep learning has been applied in fields such as medicine, industry, and education. However, the large number of parameters in these networks and their heavy storage requirements make them difficult to deploy on mobile devices. Model compression methods can effectively alleviate this problem. Among them, knowledge distillation adopts the idea of transfer learning: a teacher network is used to guide a student network, allowing the student model to learn from the teacher and thereby improving its robustness. This article primarily presents the development and trends of knowledge distillation.
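
The abstract describes the teacher-student setup only at a high level. As a concrete illustration, the sketch below shows the canonical distillation objective of Hinton et al. ("Distilling the Knowledge in a Neural Network"): the student matches the teacher's temperature-softened output distribution while also fitting the ground-truth labels. This is a minimal sketch in PyTorch, not the specific method of this paper; the temperature T and mixing weight alpha are illustrative hyperparameter choices.

    # Minimal sketch of the standard knowledge-distillation loss.
    # T (temperature) and alpha (soft/hard mixing weight) are
    # assumed example values, not taken from this paper.
    import torch
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels,
                          T=4.0, alpha=0.7):
        # Soft-target term: KL divergence between the student's and
        # teacher's temperature-scaled distributions. The T*T factor
        # keeps gradient magnitudes comparable across temperatures.
        soft = F.kl_div(
            F.log_softmax(student_logits / T, dim=1),
            F.softmax(teacher_logits.detach() / T, dim=1),
            reduction="batchmean",
        ) * (T * T)
        # Hard-target term: ordinary cross-entropy on true labels.
        hard = F.cross_entropy(student_logits, labels)
        return alpha * soft + (1.0 - alpha) * hard

    # Example usage with random tensors (batch of 8, 10 classes):
    student_logits = torch.randn(8, 10)
    teacher_logits = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    print(distillation_loss(student_logits, teacher_logits, labels))

Detaching the teacher logits reflects the usual design choice that only the (smaller) student is updated during distillation; the teacher serves purely as a fixed source of soft targets.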
Date of Conference: 09-11 April 2021
Date Added to IEEE Xplore: 26 April 2021
Conference Location: Xi'an, China
