A Memory-Driven Neural Attention Model for Aspect-Based Sentiment Classification

Jonathan van de Ruitenbeek, Flavius Frasincar, Gianni Brauwers*

*Corresponding author for this work

Research output: Contribution to journal › Article › Academic › peer-review



Sentiment analysis techniques are becoming more and more important as the number of reviews on the World Wide Web keeps increasing. Aspect-based sentiment analysis (ABSA) entails the automatic analysis of sentiments at the highly fine-grained aspect level. One of the challenges of ABSA is to identify the correct sentiment expressed towards every aspect in a sentence. In this paper, a neural attention model is discussed and three extensions to it are proposed. First, the strengths and weaknesses of the highly successful CABASC model are discussed, and three shortcomings are identified: the aspect representation is poor, the attention mechanism does not account for polysemy in natural language, and the aspect-specific sentence representation rests on a weak construction. We propose the Extended CABASC (E-CABASC) model, which aims to solve all three of these problems. The model incorporates a context-aware aspect representation, a multi-dimensional attention mechanism, and an aspect-specific sentence representation. The main contribution of this work is showing that attention models can be improved with relatively simple extensions, such as fusion gates and multi-dimensional attention, which can be implemented in many state-of-the-art models. Additionally, an analysis of the parameters and attention weights is provided.
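To make the two extensions named above concrete, the sketch below illustrates the general idea of multi-dimensional attention (a separate attention weight per feature dimension rather than one scalar per word) and a fusion gate (a learned element-wise blend of two representations). This is a minimal NumPy illustration with random toy weights and hypothetical dimensions; it is not the paper's E-CABASC implementation, whose exact parameterization differs.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=0):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Toy dimensions (hypothetical; the paper's actual sizes differ).
n_words, d = 5, 8                      # context length, hidden size
H = rng.standard_normal((n_words, d))  # word memory (e.g., encoder states)
a = rng.standard_normal(d)             # aspect representation

# --- Multi-dimensional attention ---
# Standard attention assigns ONE scalar weight per word; multi-dimensional
# attention scores every feature dimension separately, so different senses
# of a polysemous word can be weighted differently.
W = rng.standard_normal((d, 2 * d)) * 0.1
scores = np.stack([W @ np.concatenate([h, a]) for h in H])  # (n_words, d)
alpha = softmax(scores, axis=0)        # normalize over words, per dimension
context = (alpha * H).sum(axis=0)      # (d,) aspect-aware context vector

# --- Fusion gate ---
# A sigmoid gate blends two representations element-wise instead of
# simply concatenating or averaging them.
Wg = rng.standard_normal((d, 2 * d)) * 0.1
g = 1.0 / (1.0 + np.exp(-(Wg @ np.concatenate([context, a]))))  # in (0, 1)
fused = g * context + (1.0 - g) * a    # gated sentence representation
```

Because the attention weights are normalized per dimension, `alpha.sum(axis=0)` is a vector of ones, and the gate `g` lies strictly between 0 and 1 element-wise.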

Original language: English
Pages (from-to): 1793-1830
Number of pages: 38
Journal: Journal of Web Engineering
Issue number: 6
Publication status: Published - 2022

Bibliographical note

Publisher Copyright:
© 2022 River Publishers.


