- Title
PG-RNN: using position-gated recurrent neural networks for aspect-based sentiment classification.
- Authors
Bai, Qingchun; Zhou, Jie; He, Liang
- Abstract
Recently, recurrent neural networks (RNN) have achieved great success in the aspect-based sentiment classification task. Existing approaches typically focus on capturing the local (attentive) representation or the global representation independently, while how to integrate them is not well studied. To address this problem, we propose a Position-Gated Recurrent Neural Networks (PG-RNN) model that considers aspect word position information. PG-RNN can integrate global and local information dynamically for aspect-based sentiment classification. Specifically, first, we propose a positional RNN model that integrates aspect position information into the sentence encoder to enhance the latent representation. Unlike existing work, we use a kernel function to model position information instead of discrete distance values. Second, we design a representation absorption gate to absorb the local positional representation and the global representation dynamically. Experiments on five benchmark datasets show the significant advantages of our proposed model. More specifically, we achieve a maximum improvement of 7.38% over the classic attention-based RNN model in terms of accuracy.
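The abstract outlines two ingredients: a kernel function over the distance to the aspect word (replacing discrete distance values) and a gate that dynamically fuses the position-weighted (local) representation with a global sentence representation. The sketch below is not the authors' implementation; it is a minimal illustration of that idea in PyTorch, where the Gaussian kernel, the GRU encoder, the bandwidth `sigma`, and all layer sizes are assumptions.

```python
# Minimal sketch (not the PG-RNN code): a position-kernel weighting plus a gate
# that fuses local (position-weighted) and global sentence representations.
import torch
import torch.nn as nn


class PositionGatedEncoder(nn.Module):
    def __init__(self, embed_dim=100, hidden_dim=128, sigma=3.0):
        super().__init__()
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.sigma = sigma                               # kernel bandwidth (assumed)
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, embeddings, aspect_pos):
        # embeddings: (batch, seq_len, embed_dim); aspect_pos: (batch,) index of the aspect word
        h, _ = self.rnn(embeddings)                      # (batch, seq_len, hidden_dim)

        # Gaussian kernel over the distance to the aspect word stands in for the
        # paper's kernel-based position modeling: nearby tokens get weights near 1.
        seq_len = h.size(1)
        positions = torch.arange(seq_len, dtype=torch.float, device=h.device)
        dist = positions.unsqueeze(0) - aspect_pos.unsqueeze(1).float()   # (batch, seq_len)
        kernel = torch.exp(-dist.pow(2) / (2 * self.sigma ** 2))

        local = (kernel.unsqueeze(-1) * h).mean(dim=1)   # position-weighted (local) summary
        global_ = h.mean(dim=1)                          # unweighted (global) summary

        # Gate decides per example how much local vs. global information to absorb.
        g = torch.sigmoid(self.gate(torch.cat([local, global_], dim=-1)))
        return g * local + (1 - g) * global_


# Toy usage: random embeddings for 2 sentences of length 10; the result would
# feed a sentiment classifier head in a full model.
enc = PositionGatedEncoder()
x = torch.randn(2, 10, 100)
aspect = torch.tensor([3, 7])
sentence_repr = enc(x, aspect)                           # shape (2, 128)
```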
- Subjects
RECURRENT neural networks; KERNEL functions; INFORMATION modeling
- Publication
Journal of Supercomputing, 2022, Vol 78, Issue 3, p4073
- ISSN
0920-8542
- Publication type
Academic Journal
- DOI
10.1007/s11227-021-04019-5