
Sentiment Analysis using Multi Head Self-Attention Mechanism Based Bidirectional Gated Recurrent Unit


Abstract:

Sentiment analysis plays a vital role in Natural Language Processing (NLP), aiming to discern the opinions and emotions expressed in text. However, data sparsity and the ambiguity of natural language make it challenging for existing approaches to extract and classify sentiment accurately from text data. Hence, this research proposes a Multi-Head Self-Attention Mechanism based Bidirectional Gated Recurrent Unit (MSA-BiGRU) approach for classifying sentiment data into multiple classes. The MSA allows the BiGRU to attend to different parts of the sequence, capturing long-term dependencies and relationships within the text. Three standard datasets, namely the Internet Movie Database (IMDB), Sentiment140, and World Cup Soccer, are used to evaluate the effectiveness of the MSA-BiGRU method. Word-to-Vector (Word2Vec) is used for feature extraction, and Analysis of Variance (ANOVA) is used for feature selection. The performance metrics accuracy, precision, recall, and F1-score are used to validate the model's effectiveness. The experimental results show that the MSA-BiGRU method attains a higher accuracy of 98.91% on the Sentiment140 dataset compared to Automated Sentiment Analysis in Social Media based on Harris Hawks Optimization with Deep Learning (ASASM-HHODL) and the Gated Attention Mechanism and Recurrent Neural Network (GARN).
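
The abstract does not give implementation details, so the following is a minimal PyTorch sketch of the kind of architecture it describes: a bidirectional GRU whose per-token hidden states pass through multi-head self-attention before pooling and classification. All hyperparameters (embedding size, hidden size, head count, number of classes) are illustrative assumptions, not the paper's reported configuration.

    # Minimal sketch of an MSA-BiGRU classifier (assumed hyperparameters).
    import torch
    import torch.nn as nn

    class MSABiGRU(nn.Module):
        def __init__(self, vocab_size, embed_dim=300, hidden_dim=128,
                     num_heads=4, num_classes=3):
            super().__init__()
            # Embedding layer; in the paper these weights would come from Word2Vec.
            self.embedding = nn.Embedding(vocab_size, embed_dim)
            # Bidirectional GRU yields a 2*hidden_dim representation per token.
            self.bigru = nn.GRU(embed_dim, hidden_dim, batch_first=True,
                                bidirectional=True)
            # Multi-head self-attention lets the model weigh different parts
            # of the sequence, capturing long-range dependencies.
            self.attention = nn.MultiheadAttention(embed_dim=2 * hidden_dim,
                                                   num_heads=num_heads,
                                                   batch_first=True)
            self.classifier = nn.Linear(2 * hidden_dim, num_classes)

        def forward(self, token_ids):
            x = self.embedding(token_ids)          # (batch, seq, embed_dim)
            h, _ = self.bigru(x)                   # (batch, seq, 2*hidden_dim)
            attn_out, _ = self.attention(h, h, h)  # self-attention over GRU states
            pooled = attn_out.mean(dim=1)          # average-pool over time steps
            return self.classifier(pooled)         # class logits

    # Usage: classify a batch of two token-id sequences of length 10.
    model = MSABiGRU(vocab_size=20000)
    logits = model(torch.randint(0, 20000, (2, 10)))
    print(logits.shape)  # torch.Size([2, 3])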
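Similarly, the Word2Vec feature extraction and ANOVA feature selection steps could be realized as sketched below, using gensim and scikit-learn; the toy corpus, vector size, and number of selected features are assumptions for illustration only.

    # Illustrative Word2Vec + ANOVA feature pipeline (assumed parameters).
    import numpy as np
    from gensim.models import Word2Vec
    from sklearn.feature_selection import SelectKBest, f_classif

    # Toy tokenized corpus standing in for a dataset such as IMDB reviews.
    corpus = [["great", "movie"], ["awful", "plot"],
              ["great", "acting"], ["awful", "movie"]]
    labels = [1, 0, 1, 0]

    # Train Word2Vec, then average word vectors to get one vector per document.
    w2v = Word2Vec(corpus, vector_size=50, min_count=1, seed=0)
    doc_vectors = np.array([np.mean([w2v.wv[w] for w in doc], axis=0)
                            for doc in corpus])

    # ANOVA F-test keeps the k embedding dimensions most predictive of labels.
    selector = SelectKBest(score_func=f_classif, k=10)
    selected = selector.fit_transform(doc_vectors, labels)
    print(selected.shape)  # (4, 10)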
Date of Conference: 23-24 August 2024
Date Added to IEEE Xplore: 24 October 2024
Conference Location: Hassan, India
