A hybrid model with pre-trained entity-aware transformer for relation extraction

Jinxin Yao, Min Zhang*, Biyang Wang, Xianda Xu

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding · Conference contribution · peer-review

Abstract

Distantly supervised relation extraction is an efficient method to extract novel relational facts from unstructured text. Most previous neural methods adopt a Convolutional Neural Network (CNN) or Recurrent Neural Network (RNN) to encode sentences. However, CNNs struggle to learn long-range dependencies, and the parallelization of RNN training is precluded by their sequential nature. In this paper, we propose a novel hybrid model that combines a Piece-wise Convolutional Neural Network (PCNN) and an Entity-Aware Transformer to extract local features and learn dependencies between distant positions jointly. The entity-aware Transformer is able to take semantic and syntactic information into consideration and acquire entity-specific representations. An inner-sentence attention mechanism is then applied over the Transformer to alleviate the noise caused by irrelevant words. We concatenate the outputs of the PCNN and the Transformer with the word embeddings of the entity mentions and then feed them to the classifier, which further boosts the performance of our model. A transfer learning based strategy is applied, in which the entity-aware Transformer is initialized with prior knowledge learned from the related task of entity typing to improve the robustness of our model. Experimental results on a large-scale benchmark dataset show that our hybrid model with the pre-training strategy achieves an AUC score of 0.432 and outperforms state-of-the-art baselines.
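The piecewise pooling at the heart of the PCNN component can be sketched in a few lines: per-token feature scores are split into three segments at the two entity positions, and each segment is max-pooled separately. The function name and the single-filter, plain-list simplification below are illustrative assumptions, not the authors' implementation:

```python
def piecewise_max_pool(features, e1_pos, e2_pos):
    """Piecewise max-pooling as used in PCNN (sketch).

    features: per-token activation scores for one convolutional filter
    e1_pos, e2_pos: token indices of the two entity mentions
    Returns the max over each of the three segments delimited by the entities.
    """
    left, right = sorted((e1_pos, e2_pos))
    segments = [
        features[: left + 1],        # tokens up to and including entity 1
        features[left + 1 : right + 1],  # tokens between the entities
        features[right + 1 :],       # tokens after entity 2
    ]
    # Empty segments (e.g. adjacent entities) pool to 0.0
    return [max(seg) if seg else 0.0 for seg in segments]


# Example: entities at positions 1 and 4 split the scores into three parts
print(piecewise_max_pool([1, 5, 2, 7, 3, 9, 4], 1, 4))  # -> [5, 7, 9]
```

Pooling each segment separately preserves coarse positional structure relative to the entity pair, which a single global max-pool would discard.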

Original language: English
Title of host publication: Knowledge Science, Engineering and Management - 13th International Conference, KSEM 2020, Proceedings, Part 1
Editors: Gang Li, Heng Tao Shen, Ye Yuan, Xiaoyang Wang, Huawen Liu, Xiang Zhao
Publisher: Springer
Pages: 148-160
Number of pages: 13
ISBN (Print): 9783030551292
DOIs
State: Published - 2020
Event: 13th International Conference on Knowledge Science, Engineering and Management, KSEM 2020 - Hangzhou, China
Duration: 28 Aug 2020 - 30 Aug 2020

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12274 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 13th International Conference on Knowledge Science, Engineering and Management, KSEM 2020
Country/Territory: China
City: Hangzhou
Period: 28/08/20 - 30/08/20

Keywords

  • Relation extraction
  • Transfer learning
  • Transformer
