
Curriculum Prompt Learning with Self-Training for Abstractive Dialogue Summarization

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Succinctly summarizing dialogue is a task of growing interest, but inherent challenges, such as insufficient training data and low information density, impede our ability to train abstractive models. In this work, we propose a novel curriculum-based prompt learning method with self-training to address these problems. Specifically, prompts are learned using a curriculum learning strategy that gradually increases the degree of prompt perturbation, thereby improving the dialogue understanding and modeling capabilities of our model. Unlabeled dialogue is incorporated by means of self-training so as to reduce the dependency on labeled data. We further investigate topic-aware prompts to better plan for the generation of summaries. Experiments confirm that our model substantially outperforms strong baselines and achieves new state-of-the-art results on the AMI and ICSI datasets. Human evaluations also show the superiority of our model with regard to the summary generation quality.
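The two core ideas in the abstract, a curriculum that gradually increases prompt perturbation and a self-training loop over unlabeled dialogues, can be illustrated with a minimal sketch. This is not the authors' implementation; the function names, the linear perturbation schedule, the `[MASK]` placeholder, and the confidence threshold are all illustrative assumptions.

```python
import random

def perturbation_rate(stage: int, num_stages: int, max_rate: float = 0.5) -> float:
    # Curriculum schedule (assumed linear): perturbation grows from 0 at the
    # first stage to max_rate at the final stage.
    return max_rate * stage / (num_stages - 1)

def perturb_prompt(tokens: list[str], rate: float, rng: random.Random) -> list[str]:
    # Perturb a learned prompt by replacing a fraction of its tokens with a
    # placeholder; harder (more perturbed) prompts appear in later stages.
    return [t if rng.random() >= rate else "[MASK]" for t in tokens]

def self_training_round(model_predict, labeled, unlabeled, threshold=0.9):
    # Self-training step: summarize unlabeled dialogues with the current model
    # and keep only confident pseudo-labels, reducing reliance on labeled data.
    augmented = list(labeled)
    for dialogue in unlabeled:
        summary, confidence = model_predict(dialogue)
        if confidence >= threshold:
            augmented.append((dialogue, summary))
    return augmented
```

A training loop would alternate these pieces: at each curriculum stage, train on prompts perturbed at that stage's rate, then periodically run a self-training round to grow the pseudo-labeled pool.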

Original language: English
Title of host publication: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022
Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Publisher: Association for Computational Linguistics (ACL)
Pages: 1096-1106
Number of pages: 11
ISBN (electronic): 9781959429401
DOI
Publication status: Published - 2022
Event: 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022 - Hybrid, Abu Dhabi, United Arab Emirates
Duration: 7 Dec 2022 - 11 Dec 2022

Publication series

Name: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022

Conference

Conference: 2022 Conference on Empirical Methods in Natural Language Processing, EMNLP 2022
Country/Territory: United Arab Emirates
City: Hybrid, Abu Dhabi
Period: 7/12/22 - 11/12/22
