Abstract
Event argument extraction (EAE) aims to extract arguments with given roles from text and has been widely studied in natural language processing. Most previous works achieve good performance on specific EAE datasets with dedicated neural architectures. However, these architectures are usually difficult to adapt to new datasets or scenarios with different annotation schemas or formats. Furthermore, they rely on large-scale labeled data for training, which is often unavailable due to the high cost of labeling. In this paper, we propose a multi-format transfer learning model with a variational information bottleneck, which exploits the information, especially the common knowledge, in existing datasets for EAE on new datasets. Specifically, we introduce a shared-specific prompt framework that learns both format-shared and format-specific knowledge from datasets with different formats. To further absorb the common knowledge for EAE and eliminate irrelevant noise, we integrate a variational information bottleneck into our architecture to refine the shared representation. We conduct extensive experiments on three benchmark datasets and achieve new state-of-the-art performance on EAE.
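The abstract does not give the variational information bottleneck objective itself; as a minimal sketch of the standard VIB regularizer typically used to refine a shared representation, the KL term between a diagonal Gaussian posterior q(z|x) and a standard normal prior, with the usual reparameterized sampling, can be written as follows. The function names and the NumPy formulation are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def vib_kl(mu, log_var):
    """KL( N(mu, diag(exp(log_var))) || N(0, I) ), averaged over the batch.

    Closed-form KL for diagonal Gaussians; this is the "information
    bottleneck" penalty that discourages the shared representation z
    from carrying input-specific noise.
    """
    kl_per_dim = 0.5 * (np.exp(log_var) + mu ** 2 - 1.0 - log_var)
    return kl_per_dim.sum(axis=-1).mean()

def reparameterize(mu, log_var, rng):
    """Sample z = mu + sigma * eps so sampling stays differentiable
    in a gradient-based framework (the reparameterization trick)."""
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

# Example: a batch of 2 shared representations of dimension 4.
rng = np.random.default_rng(0)
mu = rng.standard_normal((2, 4))
log_var = rng.standard_normal((2, 4))
z = reparameterize(mu, log_var, rng)           # refined shared representation
penalty = vib_kl(mu, log_var)                  # added to the task loss, weighted by beta
```

In a full model this penalty is added to the extraction loss as `loss = task_loss + beta * penalty`, trading task accuracy against compression of the shared representation.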
| Original language | English |
|---|---|
| Pages (from-to) | 1990-2000 |
| Number of pages | 11 |
| Journal | Proceedings - International Conference on Computational Linguistics, COLING |
| Volume | 29 |
| Issue number | 1 |
| State | Published - 2022 |
| Event | 29th International Conference on Computational Linguistics, COLING 2022 - Hybrid, Gyeongju, Korea, Republic of; 12 Oct 2022 → 17 Oct 2022 |
| Title | A Multi-Format Transfer Learning Model for Event Argument Extraction via Variational Information Bottleneck |