PG-RNN: using position-gated recurrent neural networks for aspect-based sentiment classification

Qingchun Bai, Jie Zhou, Liang He

Research output: Contribution to journal › Article › peer-review

20 Scopus citations

Abstract

Recently, recurrent neural networks (RNNs) have achieved great success in the aspect-based sentiment classification task. Existing approaches typically capture either the local (attentive) representation or the global representation independently, while how to integrate the two is not well studied. To address this problem, we propose a Position-Gated Recurrent Neural Network (PG-RNN) model that incorporates aspect-word position information. PG-RNN integrates global and local information dynamically for aspect-based sentiment classification. Specifically, we first propose a positional RNN that integrates aspect position information into the sentence encoder to enhance the latent representation. Unlike existing work, we use a kernel function to model position information instead of discrete distance values. Second, we design a representation-absorption gate to absorb the local positional representation and the global representation dynamically. Experiments on five benchmark datasets show the significant advantages of the proposed model: we achieve a maximum improvement of 7.38% in accuracy over the classic attention-based RNN model.
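The two ideas in the abstract — kernel-based position weighting and a gate that mixes local and global representations — can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual implementation: the choice of a Gaussian kernel, the gate parameterization, and all names (`position_weights`, `gated_fusion`, `sigma`, `w_g`, `b_g`) are assumptions for illustration only.

```python
import numpy as np

def position_weights(seq_len, aspect_pos, sigma=3.0):
    # Kernel-based position modeling (illustrative): a Gaussian kernel
    # assigns each token a smooth weight based on its distance to the
    # aspect word, rather than a discrete distance value.
    positions = np.arange(seq_len)
    return np.exp(-((positions - aspect_pos) ** 2) / (2.0 * sigma ** 2))

def gated_fusion(local_repr, global_repr, w_g, b_g):
    # Representation-absorption gate (illustrative): a sigmoid gate
    # computed from both representations decides, per dimension, how
    # much of the local (position-weighted) vs. global vector to keep.
    z = np.concatenate([local_repr, global_repr]) @ w_g + b_g
    g = 1.0 / (1.0 + np.exp(-z))          # element-wise sigmoid gate
    return g * local_repr + (1.0 - g) * global_repr

# Example: a 7-token sentence with the aspect at index 3. The kernel
# weight peaks at the aspect word and decays smoothly on both sides.
w = position_weights(7, 3)
```

Tokens adjacent to the aspect thus receive high weight under the kernel, and the gate interpolates between the two representations per dimension instead of picking one globally.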

Original language: English
Pages (from-to): 4073-4094
Number of pages: 22
Journal: Journal of Supercomputing
Volume: 78
Issue number: 3
DOIs
State: Published - Feb 2022

Keywords

  • Aspect-based sentiment classification
  • Attention
  • Kernel function
  • Recurrent neural network

