Abstractive Dialogue Summarization Based on Dynamic Pattern Exploiting Training

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Pre-trained language models (PLMs) have shown remarkable performance on natural language processing tasks, but these approaches often require a massive amount of data. Due to the lack of sufficient training instances, it is challenging for existing PLMs to achieve good results on dialogue summarization. In this paper, we propose DynamicPET, a pattern-exploiting training (PET) based method for abstractive dialogue summarization, which leverages the recent prompt-learning paradigm to boost the performance of PLMs. In contrast to PET, our method does not rely on any task-specific unlabeled data, yet it obtains strong performance on two dialogue summarization datasets, especially in few-shot scenarios.
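In pattern-exploiting training, each input is reformulated into one of several prompt templates before the PLM is fine-tuned. As a minimal illustrative sketch only (the pattern texts and function names below are hypothetical, not the authors' implementation), a set of prompt patterns for dialogue summarization might be applied like this:

```python
# Hypothetical PET-style prompt patterns for dialogue summarization.
# Each pattern rewrites the raw dialogue into a prompted input that a
# seq2seq PLM would then be fine-tuned to map to the reference summary.
PATTERNS = [
    "Dialogue: {dialogue} Summary:",
    "{dialogue} TL;DR:",
    "Summarize the following conversation. {dialogue}",
]

def apply_pattern(dialogue: str, pattern_id: int) -> str:
    """Wrap a dialogue in the chosen prompt pattern."""
    return PATTERNS[pattern_id].format(dialogue=dialogue.strip())
```

In a PET-style setup, one model is typically trained per pattern and their outputs are combined; the "dynamic" variant proposed here presumably adapts how patterns are exploited during training, though the abstract does not spell out the mechanism.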

Original language: English
Title of host publication: 2022 International Joint Conference on Neural Networks, IJCNN 2022 - Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728186719
DOIs
State: Published - 2022
Event: 2022 International Joint Conference on Neural Networks, IJCNN 2022 - Padua, Italy
Duration: 18 Jul 2022 – 23 Jul 2022

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
ISSN (Print): 2161-4393
ISSN (Electronic): 2161-4407

Conference

Conference: 2022 International Joint Conference on Neural Networks, IJCNN 2022
Country/Territory: Italy
City: Padua
Period: 18/07/22 – 23/07/22

Keywords

  • Dialogue summarization
  • Pre-trained Language Model
  • Prompt Learning
