Abstract:
In recent years, with the rapid development of deep neural networks, deep learning has been applied in fields such as medicine, industry, and education. However, the large number of parameters in neural networks and their heavy storage requirements make these models difficult to deploy on mobile devices. Deep learning model compression methods can effectively alleviate this problem. Among them, knowledge distillation adopts the idea of transfer learning: a teacher network model guides a student network, allowing the student model to learn from the teacher and thereby improving the robustness of the model. This article primarily presents the development and trends of knowledge distillation.
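To make the teacher-student idea concrete, below is a minimal sketch of the classic soft-target distillation loss (in the style of Hinton et al., 2015), which the abstract's description corresponds to; it is illustrative only, not the paper's own formulation, and the hyperparameters temperature and alpha are assumed values for the example.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Illustrative soft-target knowledge distillation loss.

    Blends KL divergence between temperature-softened teacher and
    student output distributions with ordinary cross-entropy on the
    ground-truth labels. `temperature` and `alpha` are example values.
    """
    # Softened distributions; the T^2 factor restores gradient magnitude.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    # Standard supervised loss on hard labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss

In training, the teacher's logits are computed with gradients disabled, and only the (smaller) student is updated, which is what lets the compact student approximate the teacher's behavior.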
Published in: 2021 6th International Conference on Intelligent Computing and Signal Processing (ICSP)
Date of Conference: 09-11 April 2021
Date Added to IEEE Xplore: 26 April 2021