
Resilient Abstractive Summarization Model with Adaptively Weighted Training Loss

  • Shiqi Guo
  • , Jing Zhao*
  • , Shiliang Sun
  • *Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Recently, abstractive summarization models have been preferred over extractive summarization models because they can generate words that do not exist in the original text, so their summary descriptions are more flexible and natural. Neural network-based models learn the pattern of summary generation from the training data by modeling the relationship between the original text and the reference summary, which makes them heavily dependent on the reference summary. Although we intuitively feel that a summary with a higher Abstraction Degree (quantified by the number of words in the summary that do not appear in the original text) will be more general, manually generated summaries with a high Abstraction Degree are most likely written, perhaps subconsciously, with additional knowledge. It is difficult to learn the generation pattern of such reference summaries from limited training data; worse, such reference summaries can even harm model performance. To this end, we design a learning method that adaptively weights different training samples based on their Abstraction Degree, so that the model pays less attention to samples with a higher Abstraction Degree. Experiments on the LCSTS and CNN-DM datasets show that our method greatly improves the performance of the summarization model and is resilient in the face of training data containing low-quality reference summaries.
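The abstract defines the Abstraction Degree as the number of summary words absent from the source text and uses it to down-weight training samples. A minimal sketch of this idea is below; the normalization of the degree to a ratio, the exponential weighting function, and the `alpha` temperature are assumptions for illustration, since the paper's exact weighting scheme is not given here.

```python
import math


def abstraction_degree(summary_tokens, source_tokens):
    """Fraction of summary tokens that do not appear in the source text.

    The paper counts novel words; here we normalize by summary length
    so the degree is comparable across samples (an assumption).
    """
    source_vocab = set(source_tokens)
    novel = sum(1 for tok in summary_tokens if tok not in source_vocab)
    return novel / max(len(summary_tokens), 1)


def sample_weight(degree, alpha=2.0):
    """Hypothetical adaptive weight: samples with a higher Abstraction
    Degree contribute less to the training loss."""
    return math.exp(-alpha * degree)


# Example: per-sample weighted loss over a mini-batch.
def weighted_batch_loss(losses, summaries, sources, alpha=2.0):
    """losses: per-sample loss values; summaries/sources: token lists."""
    weights = [
        sample_weight(abstraction_degree(s, d), alpha)
        for s, d in zip(summaries, sources)
    ]
    total = sum(w * l for w, l in zip(weights, losses))
    return total / sum(weights)
```

A fully extractive summary (degree 0) keeps weight 1.0, while highly abstractive samples are smoothly discounted rather than discarded, matching the "pay less attention" behavior described in the abstract.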

Original language: English
Title of host publication: IJCNN 2021 - International Joint Conference on Neural Networks, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (electronic): 9780738133669
DOI
Publication status: Published - 18 Jul 2021
Event: 2021 International Joint Conference on Neural Networks, IJCNN 2021 - Virtual, Online, China
Duration: 18 Jul 2021 → 22 Jul 2021

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
2021-July
ISSN (print): 2161-4393
ISSN (electronic): 2161-4407

Conference

Conference: 2021 International Joint Conference on Neural Networks, IJCNN 2021
Country/Territory: China
Virtual, Online
Period: 18/07/21 → 22/07/21
