
Improving Tree-LSTM with Tree Attention


Abstract:

In Natural Language Processing (NLP), we often need to extract information from tree topology. Sentence structure can be represented as a dependency tree or a constituency tree. For this reason, a variant of LSTMs, named Tree-LSTM, was proposed to work on tree topology. In this paper, we design a generalized attention framework for both dependency and constituency trees by encoding variants of decomposable attention inside a Tree-LSTM cell. We evaluated our models on a semantic relatedness task and achieved notable results compared to Tree-LSTM-based methods with no attention, as well as other neural and non-neural methods, and competitive results compared to Tree-LSTM-based methods with attention.
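
The abstract itself includes no code; for intuition only, below is a minimal PyTorch sketch of one way attention over children could be encoded inside a Child-Sum Tree-LSTM cell. The class name, the additive attention form, and all parameter names are assumptions for illustration, not the paper's actual formulation.

    # Minimal sketch (NOT the authors' code): a Child-Sum Tree-LSTM cell
    # with additive attention over the children's hidden states,
    # conditioned on the current node's input. All names are hypothetical.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class AttentiveChildSumTreeLSTMCell(nn.Module):
        def __init__(self, input_dim, hidden_dim):
            super().__init__()
            # Standard Child-Sum Tree-LSTM parameters (Tai et al., 2015).
            self.W_iou = nn.Linear(input_dim, 3 * hidden_dim)
            self.U_iou = nn.Linear(hidden_dim, 3 * hidden_dim, bias=False)
            self.W_f = nn.Linear(input_dim, hidden_dim)
            self.U_f = nn.Linear(hidden_dim, hidden_dim, bias=False)
            # Assumed attention parameters: score each child state
            # against the node input (additive / Bahdanau-style form).
            self.attn_x = nn.Linear(input_dim, hidden_dim, bias=False)
            self.attn_h = nn.Linear(hidden_dim, hidden_dim, bias=False)
            self.attn_v = nn.Linear(hidden_dim, 1, bias=False)

        def forward(self, x, child_h, child_c):
            # x: (input_dim,); child_h, child_c: (num_children, hidden_dim).
            # For a leaf, pass zero states of shape (1, hidden_dim).
            scores = self.attn_v(torch.tanh(self.attn_x(x) + self.attn_h(child_h)))
            alpha = F.softmax(scores, dim=0)            # (num_children, 1)
            h_tilde = (alpha * child_h).sum(dim=0)      # attended child summary
            # Input, output, and update gates use the attended summary
            # in place of the plain sum of children.
            i, o, u = torch.chunk(self.W_iou(x) + self.U_iou(h_tilde), 3)
            i, o, u = torch.sigmoid(i), torch.sigmoid(o), torch.tanh(u)
            # One forget gate per child, as in the original Tree-LSTM.
            f = torch.sigmoid(self.W_f(x) + self.U_f(child_h))
            c = i * u + (f * child_c).sum(dim=0)
            h = o * torch.tanh(c)
            return h, c

    # Usage with made-up dimensions: a node with three children.
    cell = AttentiveChildSumTreeLSTMCell(input_dim=300, hidden_dim=150)
    x = torch.randn(300)
    child_h, child_c = torch.randn(3, 150), torch.randn(3, 150)
    h, c = cell(x, child_h, child_c)

The sketch keeps the per-child forget gates of the standard Child-Sum Tree-LSTM and only replaces the unweighted sum of children's hidden states with an attention-weighted one; the paper's actual design may differ, e.g., in the attention variant used or in how constituency trees are handled.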
Date of Conference: 30 January 2019 - 01 February 2019
Date Added to IEEE Xplore: 14 March 2019
Print on Demand (PoD) ISSN: 2325-6516
Conference Location: Newport Beach, CA, USA
