LSTM recurrent networks learn simple context-free and context-sensitive languages



Abstract:

Previous work on learning regular languages from exemplary training sequences showed that long short-term memory (LSTM) outperforms traditional recurrent neural networks (RNNs). We demonstrate LSTM's superior performance on context-free language benchmarks for RNNs, and show that it works even better than previous hardwired or highly specialized architectures. To the best of our knowledge, LSTM variants are also the first RNNs to learn a simple context-sensitive language, namely a^n b^n c^n.
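To make the a^n b^n c^n task concrete, the sketch below shows one common way to pose it as next-symbol prediction with an LSTM. This is a minimal illustration in PyTorch, not the paper's actual setup: the start/end markers, network size, optimizer, and training regime here are all illustrative assumptions.

```python
import torch
import torch.nn as nn

SYMBOLS = ["S", "a", "b", "c", "T"]          # start marker, terminals, end marker
IDX = {s: i for i, s in enumerate(SYMBOLS)}

def make_sequence(n):
    """Return (input, target) index tensors for S a^n b^n c^n T,
    where the target at each step is the next symbol."""
    s = ["S"] + ["a"] * n + ["b"] * n + ["c"] * n + ["T"]
    ids = torch.tensor([IDX[c] for c in s])
    return ids[:-1], ids[1:]

class NextSymbolLSTM(nn.Module):
    def __init__(self, vocab=5, hidden=32):
        super().__init__()
        self.embed = nn.Embedding(vocab, hidden)
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, x):                     # x: (seq_len,) symbol indices
        h, _ = self.lstm(self.embed(x).unsqueeze(0))
        return self.out(h.squeeze(0))         # (seq_len, vocab) logits

model = NextSymbolLSTM()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(2000):
    n = torch.randint(1, 11, (1,)).item()     # train on n in [1, 10]
    x, y = make_sequence(n)
    # Note: within the a-stretch the next symbol is inherently ambiguous
    # (a or b); only the b-stretch, c-stretch, and end marker are
    # deterministic, and cross-entropy handles the ambiguity gracefully.
    loss = loss_fn(model(x), y)
    opt.zero_grad(); loss.backward(); opt.step()
```

In the spirit of the paper's evaluation, generalization would be checked on values of n larger than any seen during training, counting a string as accepted only if every deterministically predictable symbol (the b's, c's, and end marker) is predicted correctly.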
Published in: IEEE Transactions on Neural Networks ( Volume: 12, Issue: 6, November 2001)
Page(s): 1333 - 1340
Date of Publication: 30 November 2001

PubMed ID: 18249962
