Probing Effects of Contextual Bias on Number Magnitude Estimation

  • Xuehao Du
  • Ping Ji*
  • Wei Qin
  • Lei Wang
  • Yunshi Lan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The semantic understanding of numbers requires association with context. However, powerful neural networks can overfit spurious correlations between context and numbers in the training corpus, leading to contextual bias that may impair the network's accurate estimation of number magnitude when making inferences on real-world data. To investigate the resilience of current methodologies against contextual bias, we introduce a novel out-of-distribution (OOD) numerical question-answering (QA) dataset that features specific correlations between context and numbers in the training data which are not present in the OOD test data. On this dataset, we evaluate the robustness of different numerical encoding and decoding methods when confronted with contextual bias. Our findings indicate that encoding methods incorporating more detailed digit information exhibit greater resilience against contextual bias. Inspired by this finding, we propose a digit-aware position embedding strategy, and the experimental results demonstrate that this strategy is highly effective in improving the robustness of neural networks against contextual bias.
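The abstract does not spell out the implementation of the digit-aware position embedding strategy, so the following is only a hypothetical sketch of the general idea: represent a number as a sequence of per-digit vectors, where each vector combines a digit-identity embedding with a place-value (digit-position) embedding, so that magnitude information survives tokenization. All names (`digit_aware_embedding`, the embedding table sizes, the random initialization) are illustrative assumptions, not the authors' method.

```python
import numpy as np

def digit_aware_embedding(number: int, dim: int = 8, seed: int = 0) -> np.ndarray:
    """Hypothetical sketch: embed an integer as a sequence of per-digit
    vectors, each the sum of a digit-identity embedding and a place-value
    embedding (units=0, tens=1, ...), preserving magnitude cues."""
    rng = np.random.default_rng(seed)
    digit_emb = rng.normal(size=(10, dim))   # one vector per digit 0-9
    place_emb = rng.normal(size=(20, dim))   # one vector per place value
    digits = str(abs(number))
    n = len(digits)
    vecs = []
    for i, d in enumerate(digits):
        place = n - 1 - i                    # distance from the units place
        vecs.append(digit_emb[int(d)] + place_emb[place])
    return np.stack(vecs)                    # shape: (num_digits, dim)
```

In a real model these per-digit vectors would be fed to the encoder in place of (or added to) the ordinary token embeddings, so "1234" and "4321" receive distinct representations even though they contain the same digits.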

Original language: English
Pages (from-to): 2464-2482
Number of pages: 19
Journal: KSII Transactions on Internet and Information Systems
Volume: 18
Issue number: 9
DOIs
State: Published - 30 Sep 2024

Keywords

  • Contextual bias
  • Natural language processing
  • Number magnitude estimation
  • Out of distribution
  • Question answering
