Position-aware hierarchical transfer model for aspect-level sentiment classification

Jie Zhou, Qin Chen, Jimmy Xiangji Huang, Qinmin Vivian Hu, Liang He

Research output: Contribution to journal › Article › peer-review

55 Scopus citations

Abstract

Recently, attention-based neural networks (NNs) have been widely used for aspect-level sentiment classification (ASC). Most neural models focus on incorporating the aspect representation into attention; however, the position information of each aspect is not well studied. Furthermore, the existing ASC datasets are relatively small owing to labor-intensive labeling, which largely limits the performance of NNs. In this paper, we propose a position-aware hierarchical transfer (PAHT) model that models position information at multiple levels and enhances ASC performance by transferring hierarchical knowledge from a resource-rich sentence-level sentiment classification (SSC) dataset. We first present aspect-based positional attention at the word and segment levels to capture the information most salient toward a given aspect. To make up for the limited data for ASC, we devise three sampling strategies to select related instances from the large-scale SSC dataset for pre-training, and we transfer the learned knowledge into ASC at four levels: embedding, word, segment, and classifier. Extensive experiments on four benchmark datasets demonstrate that the proposed model is effective in improving ASC performance. In particular, our model outperforms state-of-the-art approaches in terms of accuracy on all the datasets considered.
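The word-level mechanism lends itself to a compact illustration. Below is a minimal NumPy sketch of aspect-based positional attention in the spirit of the abstract: additive attention scores conditioned on the aspect representation are rescaled by a relative-distance weight, so that words nearer the aspect term contribute more to the context vector. The decay function, the helper names (position_weights, positional_attention), and all parameter shapes are illustrative assumptions, not the paper's actual formulation.

    # Hypothetical sketch of aspect-based positional attention; not the authors'
    # released code. Words nearer the aspect span receive larger position weights,
    # which rescale the usual aspect-conditioned additive attention scores.
    import numpy as np

    def position_weights(seq_len, aspect_start, aspect_end):
        """Relative-distance weights: 1.0 inside the aspect span, decaying
        linearly with distance outside it (one common choice; the paper's
        exact form may differ)."""
        w = np.zeros(seq_len)
        for i in range(seq_len):
            if aspect_start <= i <= aspect_end:
                w[i] = 1.0
            elif i < aspect_start:
                w[i] = 1.0 - (aspect_start - i) / seq_len
            else:
                w[i] = 1.0 - (i - aspect_end) / seq_len
        return w

    def positional_attention(hidden, aspect_vec, aspect_span, W, b, v):
        """hidden: (seq_len, d) word states; aspect_vec: (d,) aspect vector.
        Returns the position-weighted attention distribution and the
        resulting context vector."""
        seq_len, d = hidden.shape
        pos = position_weights(seq_len, *aspect_span)
        # additive attention score for each word, conditioned on the aspect
        scores = np.array([
            v @ np.tanh(W @ np.concatenate([hidden[i], aspect_vec]) + b)
            for i in range(seq_len)
        ])
        scores = scores * pos                # position-aware rescaling
        alpha = np.exp(scores - scores.max())
        alpha /= alpha.sum()                 # softmax over words
        return alpha, alpha @ hidden         # context vector of shape (d,)

    # Toy usage: 6 words, hidden size 4, aspect spans tokens 2..3.
    rng = np.random.default_rng(0)
    h = rng.normal(size=(6, 4))
    a = h[2:4].mean(axis=0)                  # crude aspect representation
    W, b, v = rng.normal(size=(4, 8)), rng.normal(size=4), rng.normal(size=4)
    alpha, ctx = positional_attention(h, a, (2, 3), W, b, v)
    print(alpha.round(3), ctx.round(3))

The same idea can be applied a second time over segment representations to obtain the hierarchical (word- and segment-level) variant the abstract describes, with the pre-trained SSC parameters serving as initialization at the embedding, word, segment, and classifier levels.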

Original language: English
Pages (from-to): 1-16
Number of pages: 16
Journal: Information Sciences
Volume: 513
State: Published - Mar 2020

Keywords

  • Aspect-level sentiment classification
  • Hierarchical attention networks
  • Neural networks
  • Position
  • Sentiment classification
  • Transfer learning
