
Enhancing the recurrent neural networks with positional gates for sentence representation

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Recurrent neural networks (RNN) with an attention mechanism have shown good performance for answer selection in recent years. Most previous attention mechanisms generate the attentive weights only after all the hidden states have been obtained, so the contextual information from the other sentence is not exploited during the internal hidden state generation. In this paper, we propose a position gated RNN (PG-RNN) model, which incorporates the positional contextual information of the question words into the internal hidden state generation. Specifically, we first design a positional interaction monitor to detect and measure the positional influence of each question word within the answer sentence. We then present a positional gating mechanism and embed it into the RNN so that the positional contextual information is automatically absorbed during each hidden state update. Experiments on two benchmark datasets, TREC-QA and WikiQA, demonstrate the advantages of our proposed model. In particular, we achieve new state-of-the-art performance on both TREC-QA and WikiQA.
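The gating idea described above can be sketched in code. The following is a minimal illustrative toy, not the paper's actual model: the cosine-similarity "positional interaction monitor", the scalar gate formula, and all parameter names are assumptions made for illustration only.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def positional_scores(question, answer):
    # Hypothetical "positional interaction monitor": for each answer
    # position t, take the maximum cosine similarity between the answer
    # word embedding and every question word embedding.
    q = question / np.linalg.norm(question, axis=1, keepdims=True)
    a = answer / np.linalg.norm(answer, axis=1, keepdims=True)
    return (a @ q.T).max(axis=1)          # shape: (answer_len,)

def pg_rnn(answer, scores, W, U, V, b):
    # Vanilla tanh-RNN augmented with a scalar "position gate" g_t that
    # controls how strongly each step's update is applied. All formulas
    # here are illustrative assumptions, not the paper's equations.
    hidden = np.zeros(W.shape[0])
    states = []
    for x_t, s_t in zip(answer, scores):
        g_t = sigmoid(V @ x_t + s_t + b)          # position gate in (0, 1)
        h_tilde = np.tanh(W @ hidden + U @ x_t)   # candidate hidden state
        hidden = g_t * h_tilde + (1.0 - g_t) * hidden
        states.append(hidden)
    return np.stack(states)

rng = np.random.default_rng(0)
d, h = 8, 16                          # embedding and hidden sizes
question = rng.standard_normal((5, d))
answer = rng.standard_normal((7, d))
scores = positional_scores(question, answer)
W = rng.standard_normal((h, h)) * 0.1
U = rng.standard_normal((h, d)) * 0.1
V = rng.standard_normal(d) * 0.1
states = pg_rnn(answer, scores, W, U, V, 0.0)
print(states.shape)                   # one hidden state per answer word
```

Because the gate is a convex combination of the previous hidden state and a tanh candidate, every hidden state stays bounded in (-1, 1) regardless of sequence length.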

Original language: English
Title of host publication: Neural Information Processing - 25th International Conference, ICONIP 2018, Proceedings
Editors: Long Cheng, Andrew Chi Sing Leung, Seiichi Ozawa
Publisher: Springer Verlag
Pages: 511-521
Number of pages: 11
ISBN (Print): 9783030041663
DOI
Publication status: Published - 2018
Event: 25th International Conference on Neural Information Processing, ICONIP 2018 - Siem Reap, Cambodia
Duration: 13 Dec 2018 - 16 Dec 2018

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 11301 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 25th International Conference on Neural Information Processing, ICONIP 2018
Country/Territory: Cambodia
City: Siem Reap
Period: 13/12/18 - 16/12/18
