TY - GEN
T1 - Attentive history selection for conversational question answering
AU - Qu, Chen
AU - Yang, Liu
AU - Qiu, Minghui
AU - Zhang, Yongfeng
AU - Chen, Cen
AU - Croft, W. Bruce
AU - Iyyer, Mohit
N1 - Publisher Copyright:
© 2019 Association for Computing Machinery.
PY - 2019/11/3
Y1 - 2019/11/3
AB - Conversational question answering (ConvQA) is a simplified but concrete setting of conversational search [24]. One of its major challenges is leveraging the conversation history to understand and answer the current question. In this work, we propose a novel solution for ConvQA that involves three aspects. First, we propose a positional history answer embedding method to encode conversation history with position information in a natural way using BERT [6], a powerful technique for text representation. Second, we design a history attention mechanism (HAM) to perform a “soft selection” of conversation histories. This mechanism attends to history turns with different weights based on how helpful they are in answering the current question. Third, in addition to handling conversation history, we take advantage of multi-task learning (MTL) to perform answer prediction along with another essential conversation task (dialog act prediction) using a uniform model architecture. MTL enables the model to learn more expressive and generic representations, improving ConvQA performance. We demonstrate the effectiveness of our model with extensive experimental evaluations on QuAC, a large-scale ConvQA dataset. We show that position information plays an important role in conversation history modeling, and we visualize the history attention to provide new insights into conversation history understanding.
KW - Attention
KW - Conversation History
KW - Conversational Question Answering
KW - Multi-turn Question Answering
UR - https://www.scopus.com/pages/publications/85075426820
U2 - 10.1145/3357384.3357905
DO - 10.1145/3357384.3357905
M3 - Conference contribution
AN - SCOPUS:85075426820
T3 - International Conference on Information and Knowledge Management, Proceedings
SP - 1391
EP - 1400
BT - CIKM 2019 - Proceedings of the 28th ACM International Conference on Information and Knowledge Management
PB - Association for Computing Machinery
T2 - 28th ACM International Conference on Information and Knowledge Management, CIKM 2019
Y2 - 3 November 2019 through 7 November 2019
ER -