TY - GEN
T1 - Commonsense Generative Model for Chinese Automatic Knowledge Graph Construction
AU - Shi, Xiaowen
AU - Yang, Jing
AU - He, Liang
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Commonsense knowledge graphs support applications in commonsense reasoning, question answering, and more. However, automatic knowledge graph construction remains an open goal for AI researchers due to the difficulty of obtaining tractable and objective commonsense information. Moreover, related research has so far been largely limited to English, slowing the development of commonsense knowledge research in other languages. Previous studies constructed knowledge bases as relational schemas using expert knowledge, semi-structured text extraction, and unstructured text extraction. However, such extraction-based methods can only capture the explicit knowledge mentioned in the text, whereas commonsense knowledge in text is usually implicit. In this paper, we propose a commonsense generative model with a novel attention mechanism and discuss whether pre-trained language models can effectively learn and generate novel knowledge. The empirical results show that our model can generate correct commonsense knowledge, achieving up to 50.10% precision under human evaluation on the ATOMIC dataset.
AB - Commonsense knowledge graphs support applications in commonsense reasoning, question answering, and more. However, automatic knowledge graph construction remains an open goal for AI researchers due to the difficulty of obtaining tractable and objective commonsense information. Moreover, related research has so far been largely limited to English, slowing the development of commonsense knowledge research in other languages. Previous studies constructed knowledge bases as relational schemas using expert knowledge, semi-structured text extraction, and unstructured text extraction. However, such extraction-based methods can only capture the explicit knowledge mentioned in the text, whereas commonsense knowledge in text is usually implicit. In this paper, we propose a commonsense generative model with a novel attention mechanism and discuss whether pre-trained language models can effectively learn and generate novel knowledge. The empirical results show that our model can generate correct commonsense knowledge, achieving up to 50.10% precision under human evaluation on the ATOMIC dataset.
KW - Attention mechanism
KW - Commonsense knowledge graph construction
KW - Generative model
UR - https://www.scopus.com/pages/publications/85136338529
U2 - 10.1109/ICETCI55101.2022.9832371
DO - 10.1109/ICETCI55101.2022.9832371
M3 - Conference paper
AN - SCOPUS:85136338529
T3 - 2022 IEEE 2nd International Conference on Electronic Technology, Communication and Information, ICETCI 2022
SP - 20
EP - 24
BT - 2022 IEEE 2nd International Conference on Electronic Technology, Communication and Information, ICETCI 2022
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2nd IEEE International Conference on Electronic Technology, Communication and Information, ICETCI 2022
Y2 - 27 May 2022 through 29 May 2022
ER -