TY - GEN
T1 - Residual connection-based multi-step reasoning via commonsense knowledge for multiple choice machine reading comprehension
AU - Sheng, Yixuan
AU - Lan, Man
N1 - Publisher Copyright:
© Springer Nature Switzerland AG 2019.
PY - 2019
Y1 - 2019
N2 - Generally, the candidate options for multiple choice machine reading comprehension (MRC) are not explicitly present in the document and need to be inferred from the text or even from world knowledge. Previous work endeavored to improve performance with the aid of commonsense knowledge or by using a multi-step reasoning strategy. However, no model has adopted multi-step reasoning with external commonsense knowledge to solve multiple choice MRC, and two shortcomings remain unsolved: external knowledge may introduce undesirable noise, and only the latest reasoning step contributes to the next one. To address the above issues, we propose a multi-step reasoning neural network based on the strong Co-Matching model with the aid of commonsense knowledge. Firstly, we present a sentence-level knowledge interaction (SKI) module to integrate commonsense knowledge with the corresponding sentence rather than the whole MRC instance. Secondly, we present a residual connection-based multi-step reasoning (RCMR) answer module, which makes the next reasoning step depend on the integration of several earlier reasoning steps rather than only the latest one. The comparative experimental results on MCScript show that our single model achieves a promising result comparable to the SOTA single model trained with extra samples, and specifically achieves the best result on commonsense-type questions.
AB - Generally, the candidate options for multiple choice machine reading comprehension (MRC) are not explicitly present in the document and need to be inferred from the text or even from world knowledge. Previous work endeavored to improve performance with the aid of commonsense knowledge or by using a multi-step reasoning strategy. However, no model has adopted multi-step reasoning with external commonsense knowledge to solve multiple choice MRC, and two shortcomings remain unsolved: external knowledge may introduce undesirable noise, and only the latest reasoning step contributes to the next one. To address the above issues, we propose a multi-step reasoning neural network based on the strong Co-Matching model with the aid of commonsense knowledge. Firstly, we present a sentence-level knowledge interaction (SKI) module to integrate commonsense knowledge with the corresponding sentence rather than the whole MRC instance. Secondly, we present a residual connection-based multi-step reasoning (RCMR) answer module, which makes the next reasoning step depend on the integration of several earlier reasoning steps rather than only the latest one. The comparative experimental results on MCScript show that our single model achieves a promising result comparable to the SOTA single model trained with extra samples, and specifically achieves the best result on commonsense-type questions.
KW - Attention
KW - Commonsense knowledge
KW - Machine reading comprehension
KW - Multi-step reasoning
KW - Question answering
UR - https://www.scopus.com/pages/publications/85076985025
U2 - 10.1007/978-3-030-36718-3_29
DO - 10.1007/978-3-030-36718-3_29
M3 - Conference contribution
AN - SCOPUS:85076985025
SN - 9783030367176
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 340
EP - 352
BT - Neural Information Processing - 26th International Conference, ICONIP 2019, Proceedings
A2 - Gedeon, Tom
A2 - Wong, Kok Wai
A2 - Lee, Minho
PB - Springer
T2 - 26th International Conference on Neural Information Processing, ICONIP 2019
Y2 - 12 December 2019 through 15 December 2019
ER -