
SAKG-BERT: Enabling Language Representation With Knowledge Graphs for Chinese Sentiment Analysis



Abstract:

Sentiment analysis of online reviews is an important task in natural language processing and has received much attention in both academia and industry, as review data have become an important source of competitive intelligence. Pretraining models such as BERT and ERNIE have achieved strong results on natural language processing tasks, but they lack domain-specific knowledge. Knowledge graphs, which offer high entity and concept coverage and strong semantic expressiveness, can enhance language representation. We propose the sentiment analysis knowledge graph BERT (SAKG-BERT) model, which combines sentiment analysis knowledge with the language representation model BERT. To improve the interpretability of the deep learning algorithm, we construct an SAKG whose triples are injected into sentences as domain knowledge. Our investigation reveals promising results on sentence completion and sentiment analysis tasks.
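
The abstract describes injecting SAKG triples into review sentences as extra domain knowledge before BERT encodes them. The snippet below is a minimal sketch of that idea, assuming a toy entity-to-triple lookup and the Hugging Face transformers BERT tokenizer; the dictionary contents, the injection format, and the helper inject_triples are illustrative assumptions, not the paper's exact scheme.

    # A minimal sketch of SAKG triple injection, assuming a toy entity -> triple
    # lookup and the Hugging Face `transformers` BERT tokenizer; the dictionary
    # contents, injection format, and helper names are illustrative only.
    from transformers import BertTokenizer

    # Hypothetical SAKG fragment: entity -> list of (relation, sentiment term) triples.
    SAKG = {
        "服务": [("情感倾向", "周到")],   # "service" -> ("sentiment polarity", "attentive")
        "价格": [("情感倾向", "偏高")],   # "price"   -> ("sentiment polarity", "on the high side")
    }

    def inject_triples(sentence):
        """Append SAKG triples matched in the sentence as extra domain knowledge."""
        knowledge = []
        for entity, triples in SAKG.items():
            if entity in sentence:
                for relation, value in triples:
                    knowledge.append(entity + relation + value)
        if not knowledge:
            return sentence
        # Matched triples are concatenated after the review so that BERT sees
        # them as additional context in the same input sequence.
        return sentence + "。" + "，".join(knowledge)

    tokenizer = BertTokenizer.from_pretrained("bert-base-chinese")
    review = "这家酒店的服务很好，但是价格有点贵。"  # "The service at this hotel is good, but the price is a bit high."
    augmented = inject_triples(review)
    token_ids = tokenizer.encode(augmented)  # ready for a BERT-based sentiment classifier
    print(augmented)

In this sketch the knowledge is simply appended to the input text; the paper's SAKG-BERT pipeline determines the actual injection mechanism.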
Published in: IEEE Access (Volume 9)
Page(s): 101695 - 101701
Date of Publication: 19 July 2021
Electronic ISSN: 2169-3536
