Multi-task attention-based neural networks for implicit discourse relationship representation and identification

  • Man Lan
  • Jianxiang Wang
  • Yuanbin Wu
  • Zheng Yu Niu
  • Haifeng Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

106 Scopus citations

Abstract

We present a novel multi-task attention-based neural network model that addresses implicit discourse relationship representation and identification through two types of representation learning: an attention-based neural network that learns a discourse relationship representation over the two arguments, and a multi-task framework that learns knowledge from both annotated and unannotated corpora. We performed extensive experiments on two benchmark corpora (the PDTB and CoNLL-2016 datasets). The results show that our proposed model outperforms state-of-the-art systems on both benchmarks.
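The abstract gives only the high-level design, not the authors' architecture. As a rough illustration of the two ideas it names, the sketch below shows an attention-based encoder over the two arguments feeding two task-specific heads (a main relation classifier plus an auxiliary head, e.g. for connective prediction on unannotated text). Every name and dimension here (ArgPairAttentionModel, hidden_dim, the 0.5 task weight, the auxiliary head) is a hypothetical assumption, not the paper's model.

```python
# Minimal multi-task attention sketch in PyTorch; all hyperparameters
# and layer choices are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn

class ArgPairAttentionModel(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128,
                 num_main_labels=4, num_aux_labels=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # One BiLSTM encoder shared across both arguments and both tasks.
        self.encoder = nn.LSTM(embed_dim, hidden_dim, batch_first=True,
                               bidirectional=True)
        # Attention scores each encoder state; softmax over time gives weights.
        self.attn = nn.Linear(2 * hidden_dim, 1)
        # Two heads: main task (annotated corpus) and a hypothetical
        # auxiliary task learned from unannotated text.
        self.main_head = nn.Linear(4 * hidden_dim, num_main_labels)
        self.aux_head = nn.Linear(4 * hidden_dim, num_aux_labels)

    def _encode(self, tokens):
        states, _ = self.encoder(self.embed(tokens))       # (B, T, 2H)
        weights = torch.softmax(self.attn(states), dim=1)  # (B, T, 1)
        return (weights * states).sum(dim=1)               # (B, 2H)

    def forward(self, arg1, arg2):
        # Relation representation = concatenation of the attended arguments.
        pair = torch.cat([self._encode(arg1), self._encode(arg2)], dim=-1)
        return self.main_head(pair), self.aux_head(pair)

# Joint training step on toy data; the 0.5 task weight is an assumption.
model = ArgPairAttentionModel(vocab_size=10000)
arg1 = torch.randint(0, 10000, (2, 12))   # batch of 2, 12 tokens each
arg2 = torch.randint(0, 10000, (2, 15))
main_logits, aux_logits = model(arg1, arg2)
main_loss = nn.functional.cross_entropy(main_logits, torch.tensor([0, 2]))
aux_loss = nn.functional.cross_entropy(aux_logits, torch.tensor([1, 3]))
total_loss = main_loss + 0.5 * aux_loss
```

Sharing the encoder and attention while keeping separate output heads is one standard way to let the auxiliary task regularize the representation used by the main classifier; how the paper actually couples the tasks is specified in the full text, not the abstract.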

Original language: English
Title of host publication: EMNLP 2017 - Conference on Empirical Methods in Natural Language Processing, Proceedings
Publisher: Association for Computational Linguistics (ACL)
Pages: 1299-1308
Number of pages: 10
ISBN (Electronic): 9781945626838
State: Published - 2017
Event: 2017 Conference on Empirical Methods in Natural Language Processing, EMNLP 2017 - Copenhagen, Denmark
Duration: 9 Sep 2017 – 11 Sep 2017

Publication series

Name: EMNLP 2017 - Conference on Empirical Methods in Natural Language Processing, Proceedings

Conference

Conference: 2017 Conference on Empirical Methods in Natural Language Processing, EMNLP 2017
Country/Territory: Denmark
City: Copenhagen
Period: 9/09/17 – 11/09/17
