
A Single Attention-Based Combination of CNN and RNN for Relation Classification


Graphical abstract: The overall structure of the Att-RCNN model. After transferring words into word embeddings, Att-RCNN employs a combination of a bidirectional RNN with GRU cells and a CNN to extr…


Abstract:

As a vital task in natural language processing, relation classification aims to identify relation types between entities in text. In this paper, we propose a novel Att-RCNN model that extracts text features and classifies relations by combining a recurrent neural network (RNN) and a convolutional neural network (CNN). This network structure uses the RNN to extract higher-level contextual representations of words and the CNN to obtain sentence features for the relation classification task. In addition, both word-level and sentence-level attention mechanisms are employed in Att-RCNN to strengthen critical words and features and improve model performance. We conduct experiments on four distinct datasets: SemEval-2010 task 8, SemEval-2018 task 7 (two subtask datasets), and the KBP37 dataset. Compared with previously published models, Att-RCNN has the best overall performance and achieves the highest F1 score, especially on the KBP37 dataset.
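The abstract and figure caption describe a single pipeline: word embeddings feed a bidirectional GRU, word-level attention re-weights the resulting contextual word representations, a CNN converts them into sentence features, and sentence-level attention pools those features before classification. The PyTorch sketch below illustrates that combination under stated assumptions; the layer sizes, the dot-product attention formulations, and the 19-class output (matching SemEval-2010 task 8) are illustrative choices, not the authors' exact configuration.

# Minimal sketch of the Att-RCNN idea: BiGRU -> word-level attention ->
# CNN -> sentence-level attention -> relation classifier.
# Hyperparameters and attention forms below are assumptions for illustration.
import torch
import torch.nn as nn


class AttRCNNSketch(nn.Module):
    def __init__(self, vocab_size=10000, embed_dim=100, hidden_dim=100,
                 num_filters=128, kernel_size=3, num_classes=19):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # Bidirectional RNN with GRU cells gives contextual word representations.
        self.gru = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                          bidirectional=True)
        # Word-level attention: a learned query scores each time step.
        self.word_query = nn.Linear(2 * hidden_dim, 1, bias=False)
        # CNN over the attended word representations yields sentence features.
        self.conv = nn.Conv1d(2 * hidden_dim, num_filters, kernel_size,
                              padding=kernel_size // 2)
        # Sentence-level attention pools the convolutional feature maps.
        self.feat_query = nn.Linear(num_filters, 1, bias=False)
        self.classifier = nn.Linear(num_filters, num_classes)

    def forward(self, token_ids):                     # (batch, seq_len)
        x = self.embedding(token_ids)                 # (batch, seq_len, embed)
        h, _ = self.gru(x)                            # (batch, seq_len, 2*hidden)
        # Word-level attention emphasizes critical words.
        w_alpha = torch.softmax(self.word_query(h), dim=1)
        h = h * w_alpha                               # re-weight each word
        # Conv1d expects (batch, channels, seq_len).
        c = torch.relu(self.conv(h.transpose(1, 2)))  # (batch, filters, seq_len)
        c = c.transpose(1, 2)                         # (batch, seq_len, filters)
        # Sentence-level attention pools positions into one feature vector.
        s_alpha = torch.softmax(self.feat_query(c), dim=1)
        sent = (c * s_alpha).sum(dim=1)               # (batch, filters)
        return self.classifier(sent)                  # relation logits


if __name__ == "__main__":
    model = AttRCNNSketch()
    dummy = torch.randint(0, 10000, (2, 20))          # two sentences, 20 tokens each
    print(model(dummy).shape)                         # torch.Size([2, 19])

As a usage note, the two attention steps play different roles in this sketch: the word-level weights act on the GRU outputs before convolution, while the sentence-level weights replace plain max pooling over the CNN feature maps.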
Published in: IEEE Access (Volume: 7)
Page(s): 12467 - 12475
Date of Publication: 09 January 2019
Electronic ISSN: 2169-3536
