TY - JOUR
T1 - PG-RNN: using position-gated recurrent neural networks for aspect-based sentiment classification
T2 - Journal of Supercomputing
AU - Bai, Qingchun
AU - Zhou, Jie
AU - He, Liang
N1 - Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2022/2
Y1 - 2022/2
AB - Recurrent neural networks (RNNs) have recently achieved great success on the aspect-based sentiment classification task. Existing approaches tend to capture either the local (attentive) representation or the global representation independently, while how to integrate the two is not well studied. To address this problem, we propose a Position-Gated Recurrent Neural Network (PG-RNN) model that incorporates aspect word position information. PG-RNN integrates global and local information dynamically for aspect-based sentiment classification. Specifically, we first propose a positional RNN that integrates aspect position information into the sentence encoder to enhance the latent representation. Unlike existing work, we use a kernel function to model position information instead of discrete distance values. Second, we design a representation absorption gate that dynamically fuses the local positional representation with the global representation. Experiments on five benchmark datasets show the significant advantages of our proposed model. In particular, we achieve a maximum improvement of 7.38% in accuracy over the classic attention-based RNN model.
KW - Aspect-based sentiment classification
KW - Attention
KW - Kernel function
KW - Recurrent neural network
UR - https://www.scopus.com/pages/publications/85113139554
U2 - 10.1007/s11227-021-04019-5
DO - 10.1007/s11227-021-04019-5
M3 - Article
AN - SCOPUS:85113139554
SN - 0920-8542
VL - 78
SP - 4073
EP - 4094
JO - Journal of Supercomputing
JF - Journal of Supercomputing
IS - 3
ER -