TY - GEN
T1 - Improved Representations for Personalized Document-Level Sentiment Classification
AU - Zhang, Yihong
AU - Zhang, Wei
N1 - Publisher Copyright:
© 2020, Springer Nature Switzerland AG.
PY - 2020
Y1 - 2020
N2 - Incorporating personalization into document-level sentiment classification has gained considerable attention due to its better performance across diverse domains. Current progress in this field is attributed to mechanisms that effectively model the interaction among three fundamental factors: users, items, and words. However, how to improve the representation learning of these three factors themselves is largely unexplored. To bridge this gap, we propose to enrich the user, item, and word representations in a state-of-the-art personalized sentiment classification model in an end-to-end training fashion. Specifically, user relations and item relations are each modeled by graph neural networks to enhance the original user and item representations. We further improve word representations by utilizing powerful pre-trained language models. Comprehensive experiments on several public and widely used datasets demonstrate the superiority of the proposed approach, validating the contribution of the improved representations.
AB - Incorporating personalization into document-level sentiment classification has gained considerable attention due to its better performance across diverse domains. Current progress in this field is attributed to mechanisms that effectively model the interaction among three fundamental factors: users, items, and words. However, how to improve the representation learning of these three factors themselves is largely unexplored. To bridge this gap, we propose to enrich the user, item, and word representations in a state-of-the-art personalized sentiment classification model in an end-to-end training fashion. Specifically, user relations and item relations are each modeled by graph neural networks to enhance the original user and item representations. We further improve word representations by utilizing powerful pre-trained language models. Comprehensive experiments on several public and widely used datasets demonstrate the superiority of the proposed approach, validating the contribution of the improved representations.
KW - Graph neural network
KW - Representation learning
KW - Sentiment classification
UR - https://www.scopus.com/pages/publications/85092115692
U2 - 10.1007/978-3-030-59410-7_53
DO - 10.1007/978-3-030-59410-7_53
M3 - Conference contribution
AN - SCOPUS:85092115692
SN - 9783030594091
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 769
EP - 785
BT - Database Systems for Advanced Applications - 25th International Conference, DASFAA 2020, Proceedings
A2 - Nah, Yunmook
A2 - Cui, Bin
A2 - Lee, Sang-Won
A2 - Yu, Jeffrey Xu
A2 - Moon, Yang-Sae
A2 - Whang, Steven Euijong
PB - Springer Science and Business Media Deutschland GmbH
T2 - 25th International Conference on Database Systems for Advanced Applications, DASFAA 2020
Y2 - 24 September 2020 through 27 September 2020
ER -