TY - GEN
T1 - Modeling Comparative Logical Relation with Contrastive Learning for Text Generation
AU - Dan, Yuhao
AU - Tian, Junfeng
AU - Zhou, Jie
AU - Yan, Ming
AU - Zhang, Ji
AU - Chen, Qin
AU - He, Liang
N1 - Publisher Copyright:
© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2025.
PY - 2025
Y1 - 2025
N2 - Data-to-Text Generation (D2T), a classic natural language generation problem, aims to produce fluent descriptions of structured input data, such as a table. Existing D2T work mainly focuses on describing superficial associative relations among entities while ignoring the deeper comparative logical relations that are common in daily life, such as A being better than B in a certain aspect, together with a corresponding opinion. In this paper, we introduce a new D2T task named Comparative Logical Relation Generation (CLRG). We further propose a Comparative Logic (CoLo) based text generation method that uses contrastive learning to generate texts following specific comparative logical relations. Specifically, we first construct various positive and negative samples via fine-grained perturbations of entities, aspects, and opinions. We then apply contrastive learning in the encoder layer to better capture the comparative logical relations, and integrate it into the decoder layer to guide the model to generate these relations correctly. To address the data scarcity problem, we construct the Chinese Comparative Logical Relation Dataset (CLRD), a high-quality, human-annotated dataset that is challenging for text generation, with descriptions of multiple entities and annotations of their comparative logical relations. Extensive experiments show that our method achieves impressive performance in both automatic and human evaluations.
AB - Data-to-Text Generation (D2T), a classic natural language generation problem, aims to produce fluent descriptions of structured input data, such as a table. Existing D2T work mainly focuses on describing superficial associative relations among entities while ignoring the deeper comparative logical relations that are common in daily life, such as A being better than B in a certain aspect, together with a corresponding opinion. In this paper, we introduce a new D2T task named Comparative Logical Relation Generation (CLRG). We further propose a Comparative Logic (CoLo) based text generation method that uses contrastive learning to generate texts following specific comparative logical relations. Specifically, we first construct various positive and negative samples via fine-grained perturbations of entities, aspects, and opinions. We then apply contrastive learning in the encoder layer to better capture the comparative logical relations, and integrate it into the decoder layer to guide the model to generate these relations correctly. To address the data scarcity problem, we construct the Chinese Comparative Logical Relation Dataset (CLRD), a high-quality, human-annotated dataset that is challenging for text generation, with descriptions of multiple entities and annotations of their comparative logical relations. Extensive experiments show that our method achieves impressive performance in both automatic and human evaluations.
KW - Contrastive learning
KW - Data-to-text generation
KW - Dataset construction
KW - Natural language processing
UR - https://www.scopus.com/pages/publications/85210079893
U2 - 10.1007/978-981-97-9440-9_9
DO - 10.1007/978-981-97-9440-9_9
M3 - Conference contribution
AN - SCOPUS:85210079893
SN - 9789819794393
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 107
EP - 119
BT - Natural Language Processing and Chinese Computing - 13th National CCF Conference, NLPCC 2024, Proceedings
A2 - Wong, Derek F.
A2 - Wei, Zhongyu
A2 - Yang, Muyun
PB - Springer Science and Business Media Deutschland GmbH
T2 - 13th CCF International Conference on Natural Language Processing and Chinese Computing, NLPCC 2024
Y2 - 1 November 2024 through 3 November 2024
ER -