Enhancing Textbook Question Answering with Knowledge Graph-Augmented Large Language Models

  • Mengliang He
  • Aimin Zhou
  • Xiaoming Shi*

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

1 Scopus citations

Abstract

Previous work on Textbook Question Answering (TQA) suffers from limited performance due to small-scale neural-network backbones. To alleviate this issue, we propose using large language models (LLMs) as the backbone for TQA tasks. To this end, we employ two methods: raw-context-based prompting and knowledge-graph-based prompting. Specifically, we introduce the Textbook Question Answering-Knowledge Graph (TQA-KG) method, which first converts textbook content into structured knowledge graphs and then incorporates the knowledge graph into LLM prompting, thereby enhancing the model's reasoning capabilities and answer accuracy. Extensive experiments on the CK12-QA dataset illustrate the effectiveness of the method, achieving an average improvement of 5.67% in accuracy over current state-of-the-art methods.
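The knowledge-graph-based prompting described above can be sketched in a minimal form: triples extracted from textbook content are serialized into the LLM prompt alongside the question. This is an illustrative assumption of the pipeline's shape, not the paper's actual implementation; the triple schema, function name, and prompt template are hypothetical.

```python
# Hedged sketch of knowledge-graph-augmented prompting. The (subject,
# relation, object) triple format and the prompt wording are illustrative
# assumptions, not taken from the TQA-KG paper itself.

def triples_to_prompt(triples, question):
    """Serialize knowledge-graph triples into an LLM prompt for QA."""
    # One fact per line, in (subject, relation, object) notation.
    facts = "\n".join(f"({s}, {r}, {o})" for s, r, o in triples)
    return (
        "Answer the question using the knowledge-graph facts below.\n"
        f"Facts:\n{facts}\n"
        f"Question: {question}\n"
        "Answer:"
    )

# Example: triples one might extract from a (hypothetical) textbook passage.
triples = [
    ("photosynthesis", "occurs_in", "chloroplast"),
    ("chloroplast", "contains", "chlorophyll"),
]
prompt = triples_to_prompt(triples, "Where does photosynthesis occur?")
```

The resulting prompt string would then be sent to the LLM; the paper's contribution lies in constructing the graph from textbook content and combining this with raw-context prompting.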

Original language: English
Pages (from-to): 639-654
Number of pages: 16
Journal: Proceedings of Machine Learning Research
Volume: 260
State: Published - 2024
Event: 16th Asian Conference on Machine Learning, ACML 2024 - Hanoi, Viet Nam
Duration: 5 Dec 2024 – 8 Dec 2024

Keywords

  • Large language model
  • Retrieve-and-Generate
  • Textbook Question Answering
  • knowledge graphs
