Using fractional latent topic to enhance recurrent neural network in text similarity modeling

Yang Song, Wenxin Hu, Liang He

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citation

Abstract

Recurrent neural networks (RNNs) have been widely used in text similarity modeling to learn semantic representations of text. However, as classical topic models suggest, a text contains many different latent topics, and its complete semantic information is described by all of these topics together. Previous RNN-based models usually learn the text representation from the individual words of the text rather than from topics, which introduces noise and loses hierarchical structure information in the representation. In this paper, we propose a novel fractional latent topic based RNN (FraLT-RNN) model, which focuses on topic-level text representation and largely preserves the whole semantic information of a text. To be specific, we first adopt fractional calculus to generate latent topics for a text from the hidden states learned by an RNN model. Then, we propose a topic-wise attention gating mechanism and embed it into our model to generate a topic-level attentive vector for each topic. Finally, we reward the topic perspective with the topic-level attention to form the text representation. Experiments on four benchmark datasets, namely TREC-QA and WikiQA for answer selection, MSRP for paraphrase identification, and MultiNLI for textual entailment, show the clear advantages of our proposed model.
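
For readers who want a concrete picture of the pipeline outlined in the abstract, the PyTorch snippet below is a minimal, assumption-laden sketch rather than the authors' implementation: the class name FractionalTopicAttention, the segment-wise Grunwald-Letnikov weighting used to form topic vectors, the fractional order alpha, and the single linear topic gate are all hypothetical stand-ins for the paper's fractional latent-topic construction and topic-wise attention gating.

# Minimal sketch of the pipeline described in the abstract (assumed design).
import torch
import torch.nn as nn
import torch.nn.functional as F

class FractionalTopicAttention(nn.Module):
    # Hypothetical module: fractional-weighted aggregation of RNN hidden
    # states into latent-topic vectors, followed by a topic-wise attention
    # gate. Grunwald-Letnikov weights are one standard discrete form of
    # fractional calculus; the paper's exact construction may differ.
    def __init__(self, input_dim, hidden_dim, num_topics=4, alpha=0.5):
        super().__init__()
        self.rnn = nn.GRU(input_dim, hidden_dim, batch_first=True)
        self.num_topics = num_topics
        self.alpha = alpha                          # fractional order (assumed hyperparameter)
        self.topic_gate = nn.Linear(hidden_dim, 1)  # scores each topic vector

    def _fractional_weights(self, length):
        # Grunwald-Letnikov coefficients: w_0 = 1, w_k = w_{k-1} * (1 - (alpha + 1) / k)
        w = [1.0]
        for k in range(1, length):
            w.append(w[-1] * (1.0 - (self.alpha + 1.0) / k))
        return torch.tensor(w)

    def forward(self, x):
        # x: (batch, seq_len, input_dim) word embeddings
        h, _ = self.rnn(x)                          # hidden states: (batch, seq_len, hidden_dim)

        # Split the sequence into contiguous segments and apply fractional
        # weighting inside each segment to obtain one latent-topic vector.
        topics = []
        for seg in h.chunk(self.num_topics, dim=1):
            w = self._fractional_weights(seg.size(1)).to(seg.device)
            topics.append((seg * w.view(1, -1, 1)).sum(dim=1))
        topics = torch.stack(topics, dim=1)         # (batch, num_topics, hidden_dim)

        # Topic-wise attention gating: softmax over topic scores, then a
        # weighted sum of the topic vectors as the text representation.
        attn = F.softmax(self.topic_gate(topics).squeeze(-1), dim=1)
        return (attn.unsqueeze(-1) * topics).sum(dim=1)  # (batch, hidden_dim)

# Example: a batch of 2 texts, 20 tokens each, 300-d embeddings.
model = FractionalTopicAttention(input_dim=300, hidden_dim=128)
representation = model(torch.randn(2, 20, 300))     # (2, 128)

In a similarity setting such as answer selection or paraphrase identification, two representations produced this way would typically be compared with cosine similarity or passed to a small classifier; that comparison step is likewise an assumption here, not a detail taken from the paper.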

Original language: English
Title of host publication: Database Systems for Advanced Applications - 24th International Conference, DASFAA 2019, Proceedings
Editors: Guoliang Li, Juggapong Natwichai, Yongxin Tong, Jun Yang, Joao Gama
Publisher: Springer Verlag
Pages: 173-190
Number of pages: 18
ISBN (Print): 9783030185787
DOIs
State: Published - 2019
Event: 24th International Conference on Database Systems for Advanced Applications, DASFAA 2019 - Chiang Mai, Thailand
Duration: 22 Apr 2019 - 25 Apr 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11447 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 24th International Conference on Database Systems for Advanced Applications, DASFAA 2019
Country/Territory: Thailand
City: Chiang Mai
Period: 22/04/19 - 25/04/19

Keywords

  • Fractional calculus
  • Latent topic
  • Recurrent neural network
