TY - GEN
T1 - Multi-level head-wise match and aggregation in transformer for textual sequence matching
AU - Wang, Shuohang
AU - Lan, Yunshi
AU - Tay, Yi
AU - Jiang, Jing
AU - Liu, Jingjing
N1 - Publisher Copyright:
Copyright © 2020 Association for the Advancement of Artificial Intelligence. All rights reserved.
PY - 2020
Y1 - 2020
N2 - Transformer has been successfully applied to many natural language processing tasks. However, for textual sequence matching, simple matching between the representation of a pair of sequences might bring in unnecessary noise. In this paper, we propose a new approach to sequence pair matching with Transformer, by learning head-wise matching representations on multiple levels. Experiments show that our proposed approach can achieve new state-of-the-art performance on multiple tasks that rely only on pre-computed sequence-vector representation, such as SNLI, MNLI-match, MNLI-mismatch, QQP, and SQuAD-binary.
AB - Transformer has been successfully applied to many natural language processing tasks. However, for textual sequence matching, simple matching between the representation of a pair of sequences might bring in unnecessary noise. In this paper, we propose a new approach to sequence pair matching with Transformer, by learning head-wise matching representations on multiple levels. Experiments show that our proposed approach can achieve new state-of-the-art performance on multiple tasks that rely only on pre-computed sequence-vector representation, such as SNLI, MNLI-match, MNLI-mismatch, QQP, and SQuAD-binary.
UR - https://www.scopus.com/pages/publications/85106603636
M3 - Conference contribution
AN - SCOPUS:85106603636
T3 - AAAI 2020 - 34th AAAI Conference on Artificial Intelligence
SP - 9209
EP - 9216
BT - AAAI 2020 - 34th AAAI Conference on Artificial Intelligence
PB - AAAI Press
T2 - 34th AAAI Conference on Artificial Intelligence, AAAI 2020
Y2 - 7 February 2020 through 12 February 2020
ER -