Knowledge Distillation for Recurrent Neural Network Language Modeling with Trust Regularization (IEEE Conference Publication)