A Joint-BERT Method for Knowledge Base Question Answering

  • Tianyu Zhang*
  • Zhiyun Chen

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Scopus citations

Abstract

This paper proposes a Joint-BERT method for the knowledge base question answering (KBQA) task involving a single fact, dividing it into two subtasks: topic entity recognition and relation detection. For entity recognition, instead of treating it as a sequence labeling task, a simpler approach applies a pointer network on top of the BERT encoder to predict the start and end positions of the topic entity. Relation detection, which shares the BERT encoder, ranks candidate predicates from both local and global matching perspectives. The two subtasks are then trained jointly in a multi-task learning framework so that each benefits from the other. Experiments show that the Joint-BERT model achieves competitive results on the SimpleQuestions benchmark.
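The architecture described above can be sketched in PyTorch: a pointer network head predicts the topic entity's start and end positions over the shared encoder's token representations, a relation scorer ranks candidate predicates from the pooled representation, and the two losses are summed for joint training. This is a minimal illustration, not the authors' implementation; the hidden size, the relation-vocabulary size, the use of the `[CLS]` vector for relation scoring, and the stand-in random tensor in place of real BERT outputs are all assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class JointHead(nn.Module):
    """Sketch of the two task heads sharing one encoder output.
    hidden=768 matches BERT-base; num_relations=1000 is a placeholder."""
    def __init__(self, hidden=768, num_relations=1000):
        super().__init__()
        # Pointer network: score each token as the entity's start / end position.
        self.start_ptr = nn.Linear(hidden, 1)
        self.end_ptr = nn.Linear(hidden, 1)
        # Relation detection: score candidate predicates from the [CLS] vector
        # (one possible pooling choice; the paper's exact matching is richer).
        self.rel_scorer = nn.Linear(hidden, num_relations)

    def forward(self, enc):                               # enc: (batch, seq_len, hidden)
        start_logits = self.start_ptr(enc).squeeze(-1)    # (batch, seq_len)
        end_logits = self.end_ptr(enc).squeeze(-1)        # (batch, seq_len)
        rel_logits = self.rel_scorer(enc[:, 0])           # (batch, num_relations)
        return start_logits, end_logits, rel_logits

def joint_loss(start_logits, end_logits, rel_logits,
               start_gold, end_gold, rel_gold):
    """Multi-task objective: summing the three losses lets gradients from
    both subtasks update the shared encoder."""
    return (F.cross_entropy(start_logits, start_gold)
            + F.cross_entropy(end_logits, end_gold)
            + F.cross_entropy(rel_logits, rel_gold))

# Stand-in for BERT's last hidden states: batch of 2, sequence length 16.
enc = torch.randn(2, 16, 768)
head = JointHead()
s, e, r = head(enc)
loss = joint_loss(s, e, r,
                  torch.tensor([3, 5]),    # gold start positions
                  torch.tensor([4, 7]),    # gold end positions
                  torch.tensor([10, 20]))  # gold relation ids
```

At inference, the predicted span would be the argmax start/end token pair and the predicted relation the argmax over `rel_logits`.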

Original language: English
Title of host publication: MLNLP 2022 - 2022 5th International Conference on Machine Learning and Natural Language Processing, Conference Proceedings
Publisher: Association for Computing Machinery
Pages: 35-40
Number of pages: 6
ISBN (Electronic): 9781450399067
DOIs
State: Published - 23 Dec 2022
Event: 5th International Conference on Machine Learning and Natural Language Processing, MLNLP 2022 - Sanya, China
Duration: 23 Dec 2022 - 25 Dec 2022

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 5th International Conference on Machine Learning and Natural Language Processing, MLNLP 2022
Country/Territory: China
City: Sanya
Period: 23/12/22 - 25/12/22

Keywords

  • BERT
  • KBQA
  • Multi-task Learning
  • Pointer Network
