
A Joint-BERT Method for Knowledge Base Question Answering

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This paper proposes a Joint-BERT method for the knowledge base question answering (KBQA) task involving a single fact. The task is divided into two subtasks: topic entity recognition and relation detection. For the entity recognition subtask, instead of treating it as a sequence labeling task, a simpler approach is used: a pointer network on top of the BERT encoder predicts the start and end positions of the topic entity. The relation detection subtask, which shares the BERT encoder, ranks candidate predicates from both local and global matching perspectives. The two tasks are trained jointly in a multi-task learning framework so that they benefit from each other. Experiments show that the Joint-BERT model achieves competitive results on the SimpleQuestions benchmark.
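The pointer-network idea the abstract describes can be sketched as follows: each token encoding from BERT is scored by two learned vectors, one for span start and one for span end, and the highest-scoring valid pair marks the topic entity. This is a minimal NumPy sketch under stated assumptions; random matrices stand in for the BERT encoder outputs and the learned weights, and all names are illustrative, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

seq_len, hidden = 10, 16                 # toy sequence length and hidden size
H = rng.normal(size=(seq_len, hidden))   # stand-in for BERT token encodings

# Two learned vectors score each token as a span start / end (illustrative).
w_start = rng.normal(size=hidden)
w_end = rng.normal(size=hidden)

start_logits = H @ w_start               # shape: (seq_len,)
end_logits = H @ w_end

start = int(np.argmax(start_logits))
# Constrain the end position to lie at or after the predicted start.
end = start + int(np.argmax(end_logits[start:]))

print(start, end)                        # predicted topic-entity span boundaries
```

In training, each logit vector would be passed through a softmax and optimized with cross-entropy against the gold start/end positions, alongside the relation-detection loss in the joint multi-task objective.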

Original language: English
Title of host publication: MLNLP 2022 - 2022 5th International Conference on Machine Learning and Natural Language Processing, Conference Proceedings
Publisher: Association for Computing Machinery
Pages: 35-40
Number of pages: 6
ISBN (Electronic): 9781450399067
DOI
Publication status: Published - 23 Dec 2022
Event: 5th International Conference on Machine Learning and Natural Language Processing, MLNLP 2022 - Sanya, China
Duration: 23 Dec 2022 - 25 Dec 2022

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 5th International Conference on Machine Learning and Natural Language Processing, MLNLP 2022
Country/Territory: China
City: Sanya
Period: 23/12/22 - 25/12/22
