
Knowledgeable In-Context Tuning: Exploring and Exploiting Factual Knowledge for In-Context Learning

  • Jianing Wang
  • Chengyu Wang
  • Chuanqi Tan
  • Jun Huang
  • Ming Gao*
  • *Corresponding author for this work
  • East China Normal University
  • Alibaba Group Holding Ltd.

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Large language models (LLMs) enable in-context learning (ICL) by conditioning on a few labeled training examples as a text-based prompt, eliminating the need for parameter updates and achieving competitive performance. In this paper, we demonstrate that factual knowledge is imperative for the performance of ICL in three core facets: the inherent knowledge learned in LLMs, the factual knowledge derived from the selected in-context examples, and the knowledge biases in LLMs for output generation. To unleash the power of LLMs in few-shot learning scenarios, we introduce a novel Knowledgeable In-Context Tuning (KICT) framework to further improve the performance of ICL: 1) injecting knowledge into LLMs during continual self-supervised pretraining, 2) judiciously selecting the examples for ICL with high knowledge relevance, and 3) calibrating the prediction results based on prior knowledge. We evaluate the proposed approaches on autoregressive models (e.g., GPT-style LLMs) over multiple text classification and question-answering tasks. Experimental results demonstrate that KICT substantially outperforms strong baselines and improves by more than 13% and 7% on text classification and question-answering tasks, respectively.
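The third component above, calibrating predictions against the model's prior knowledge bias, can be illustrated with a short sketch. The Python snippet below is a generic, contextual-calibration-style illustration rather than the paper's exact KICT procedure: the probability values, the content-free "N/A" query, and the helper name calibrate_label_probs are assumptions made for the example.

import numpy as np

def calibrate_label_probs(label_probs: np.ndarray, prior_probs: np.ndarray) -> np.ndarray:
    """Divide out the model's prior bias over the label words and renormalize."""
    adjusted = label_probs / np.clip(prior_probs, 1e-12, None)
    return adjusted / adjusted.sum()

# Hypothetical label probabilities from an LLM for one test input, and the
# prior estimated by querying the same prompt with a content-free input ("N/A").
label_probs = np.array([0.70, 0.20, 0.10])   # P(label | prompt + test input)
prior_probs = np.array([0.60, 0.25, 0.15])   # P(label | prompt + "N/A")

print(calibrate_label_probs(label_probs, prior_probs))
# The calibrated prediction is less dominated by the prompt-induced label bias.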

Original language: English
Title of host publication: Findings of the Association for Computational Linguistics
Subtitle of host publication: NAACL 2024 - Findings
Editors: Kevin Duh, Helena Gomez, Steven Bethard
Publisher: Association for Computational Linguistics (ACL)
Pages: 3261-3280
Number of pages: 20
ISBN (Electronic): 9798891761193
DOI
Publication status: Published - 2024
Event: 2024 Findings of the Association for Computational Linguistics: NAACL 2024 - Hybrid, Mexico City, Mexico
Duration: 16 Jun 2024 → 21 Jun 2024

Publication series

Name: Findings of the Association for Computational Linguistics: NAACL 2024 - Findings

Conference

Conference: 2024 Findings of the Association for Computational Linguistics: NAACL 2024
Country/Territory: Mexico
City: Hybrid, Mexico City
Period: 16/06/24 → 21/06/24
