
Resilient Abstractive Summarization Model with Adaptively Weighted Training Loss

  • Shiqi Guo
  • Jing Zhao*
  • Shiliang Sun

*Corresponding author for this work

East China Normal University

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Recently, abstractive summarization models have been preferred over extractive summarization models because they can generate words that do not exist in the original text, yielding summary descriptions that are more flexible and natural. Neural network-based models learn the pattern of summary generation from the training data by modeling the relationship between the original text and the reference summary, and are therefore highly dependent on the reference summaries. Although one might intuitively expect a summary with a higher Abstraction Degree (quantified by the number of words in the summary that do not appear in the original text) to be more general, manually written summaries with a high Abstraction Degree are most likely composed with additional background knowledge. The generation pattern of such reference summaries is difficult to learn from limited training data; worse, such reference summaries can even harm model performance. To this end, we design a learning method that adaptively weights different training samples based on their Abstraction Degree, so that the model pays less attention to samples with a higher Abstraction Degree. Experiments on the LCSTS and CNN-DM datasets show that our method greatly improves the performance of the summarization model and is resilient in the face of training data containing low-quality reference summaries.
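The abstract does not spell out the weighting function, so the following is a minimal sketch of the general idea, not the authors' implementation: the Abstraction Degree is taken as the fraction of summary tokens absent from the source, and each sample's loss is scaled by a weight that decays with that degree. The exponential decay form and the temperature `tau` are illustrative assumptions, as are all function names.

import torch
import torch.nn.functional as F


def abstraction_degree(source_tokens, summary_tokens):
    """Fraction of summary tokens that do not appear in the source text."""
    src = set(source_tokens)
    novel = sum(1 for t in summary_tokens if t not in src)
    return novel / max(len(summary_tokens), 1)


def adaptive_weights(degrees, tau=0.5):
    """Map Abstraction Degrees in [0, 1] to sample weights in (0, 1].

    Higher degree -> smaller weight, so the model pays less attention
    to samples whose summaries contain many novel words. The decay
    shape and `tau` are hypothetical choices, not from the paper.
    """
    d = torch.tensor(degrees, dtype=torch.float32)
    return torch.exp(-d / tau)


def weighted_nll_loss(logits, targets, weights, pad_id=0):
    """Token-level negative log-likelihood, scaled per sample.

    logits:  (batch, seq_len, vocab) decoder outputs
    targets: (batch, seq_len) reference summary ids, padded with pad_id
    weights: (batch,) adaptive sample weights
    """
    vocab = logits.size(-1)
    loss = F.cross_entropy(
        logits.reshape(-1, vocab), targets.reshape(-1),
        ignore_index=pad_id, reduction="none",
    ).reshape(targets.shape)
    mask = (targets != pad_id).float()
    # Average over non-pad tokens, then weight each sample's loss.
    per_sample = (loss * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1)
    return (weights.to(per_sample.device) * per_sample).mean()

In a training loop one might call weights = adaptive_weights([abstraction_degree(s, y) for s, y in batch_pairs]) before computing the loss, so that highly abstractive reference summaries contribute less to the gradient.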

Original language: English
Title of host publication: IJCNN 2021 - International Joint Conference on Neural Networks, Proceedings
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9780738133669
DOIs
State: Published - 18 Jul 2021
Event: 2021 International Joint Conference on Neural Networks, IJCNN 2021 - Virtual, Online, China
Duration: 18 Jul 2021 – 22 Jul 2021

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 2021-July
ISSN (Print): 2161-4393
ISSN (Electronic): 2161-4407

Conference

Conference: 2021 International Joint Conference on Neural Networks, IJCNN 2021
Country/Territory: China
City: Virtual, Online
Period: 18/07/21 – 22/07/21
