TY - GEN
T1 - ENHANCING CLASS UNDERSTANDING VIA PROMPT-TUNING FOR ZERO-SHOT TEXT CLASSIFICATION
AU - Dan, Yuhao
AU - Zhou, Jie
AU - Chen, Qin
AU - Bai, Qingchun
AU - He, Liang
N1 - Publisher Copyright:
© 2022 IEEE
PY - 2022
Y1 - 2022
N2 - Zero-shot text classification (ZSTC) poses a significant challenge due to the lack of labeled data for unseen classes during training. Most studies focus on transferring knowledge from seen classes to unseen classes, which has achieved good performance in most cases. However, it is difficult to transfer knowledge when the classes have semantic gaps or low similarities. In this paper, we propose a prompt-based method, which enhances semantic understanding for each class and learns the matching between texts and classes for better ZSTC. Specifically, we first generate discriminative words for class description with prompt inserting (PIN). Then, a prompt matching (POM) model is learned to determine whether the text can well match the class description. Experiments on three benchmark datasets show the great advantages of our proposed method. In particular, we achieve state-of-the-art performance on the unseen classes, while maintaining comparable strength with the existing ZSTC approaches with regard to the seen classes.
AB - Zero-shot text classification (ZSTC) poses a significant challenge due to the lack of labeled data for unseen classes during training. Most studies focus on transferring knowledge from seen classes to unseen classes, which has achieved good performance in most cases. However, it is difficult to transfer knowledge when the classes have semantic gaps or low similarities. In this paper, we propose a prompt-based method, which enhances semantic understanding for each class and learns the matching between texts and classes for better ZSTC. Specifically, we first generate discriminative words for class description with prompt inserting (PIN). Then, a prompt matching (POM) model is learned to determine whether the text can well match the class description. Experiments on three benchmark datasets show the great advantages of our proposed method. In particular, we achieve state-of-the-art performance on the unseen classes, while maintaining comparable strength with the existing ZSTC approaches with regard to the seen classes.
KW - Prompt Tuning
KW - Semantics Enhancing
KW - Zero-shot Text Classification
UR - https://www.scopus.com/pages/publications/85131251146
U2 - 10.1109/ICASSP43922.2022.9746200
DO - 10.1109/ICASSP43922.2022.9746200
M3 - Conference contribution
AN - SCOPUS:85131251146
T3 - ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
SP - 4303
EP - 4307
BT - 2022 IEEE International Conference on Acoustics, Speech, and Signal Processing, ICASSP 2022 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2022 IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2022
Y2 - 22 May 2022 through 27 May 2022
ER -