TY - GEN
T1 - Using fractional latent topic to enhance recurrent neural network in text similarity modeling
AU - Song, Yang
AU - Hu, Wenxin
AU - He, Liang
N1 - Publisher Copyright:
© Springer Nature Switzerland AG 2019.
PY - 2019
Y1 - 2019
N2 - Recurrent neural networks (RNNs) have been widely used in text similarity modeling to learn semantic text representations. However, as classical topic models suggest, a text contains many different latent topics, and its complete semantic information is described by all of them. Previous RNN-based models usually learn the text representation from the individual words of the text rather than from topics, which introduces noise and loses hierarchical structure information in the representation. In this paper, we propose a novel fractional latent topic based RNN (FraLT-RNN) model, which focuses on topic-level text representation and largely preserves the full semantic information of a text. Specifically, we first adopt fractional calculus to generate latent topics for a text from the hidden states learned by an RNN model. Then, we propose a topic-wise attention gating mechanism and embed it into our model to generate a topic-level attentive vector for each topic. Finally, we weight each topic with its topic-level attention to form the text representation. Experiments on four benchmark datasets, namely TREC-QA and WikiQA for answer selection, MSRP for paraphrase identification, and MultiNLI for textual entailment, show the clear advantages of our proposed model.
KW - Fractional calculus
KW - Latent topic
KW - Recurrent neural network
UR - https://www.scopus.com/pages/publications/85065494082
U2 - 10.1007/978-3-030-18579-4_11
DO - 10.1007/978-3-030-18579-4_11
M3 - Conference contribution
AN - SCOPUS:85065494082
SN - 9783030185787
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 173
EP - 190
BT - Database Systems for Advanced Applications - 24th International Conference, DASFAA 2019, Proceedings
A2 - Li, Guoliang
A2 - Natwichai, Juggapong
A2 - Tong, Yongxin
A2 - Yang, Jun
A2 - Gama, Joao
PB - Springer Verlag
T2 - 24th International Conference on Database Systems for Advanced Applications, DASFAA 2019
Y2 - 22 April 2019 through 25 April 2019
ER -