TY - GEN
T1 - Learning Attention-Based Translational Knowledge Graph Embedding via Nonlinear Dynamic Mapping
AU - Wang, Zhihao
AU - Xu, Honggang
AU - Li, Xin
AU - Deng, Yuxin
N1 - Publisher Copyright:
© 2021, Springer Nature Switzerland AG.
PY - 2021
Y1 - 2021
N2 - Knowledge graph embedding has become a promising method for knowledge graph completion. It aims to learn a low-dimensional embedding in a continuous vector space for each entity and relation. Learning accurate embeddings for complex multi-relational facts remains challenging. In this paper, we propose a new translation-based embedding method named ATransD-NL that addresses the following two observations. First, most existing translational methods do not consider contextual information, which has been proven useful for improving link-prediction performance. Our method learns attention-based embeddings for each triplet, taking into account the influence of one-hop or potentially multi-hop neighbourhood entities. Second, we apply a nonlinear dynamic projection of head and tail entities into the relational space to capture the nonlinear correlations among entities and relations that arise from complex multi-relational facts. As an extension of TransD, our model introduces only one extra parameter, giving a good trade-off between model complexity and state-of-the-art predictive accuracy. Experimental results show that, compared with state-of-the-art translation-based and neural-network-based methods, our method delivers substantial improvements over the baselines on the MeanRank metric of link prediction, e.g., an improvement of 35.6% over the attention-based graph embedding method KBGAT and an improvement of 64% over the translational method TransMS on the WN18 dataset, with comparable performance on the Hits@10 metric.
AB - Knowledge graph embedding has become a promising method for knowledge graph completion. It aims to learn a low-dimensional embedding in a continuous vector space for each entity and relation. Learning accurate embeddings for complex multi-relational facts remains challenging. In this paper, we propose a new translation-based embedding method named ATransD-NL that addresses the following two observations. First, most existing translational methods do not consider contextual information, which has been proven useful for improving link-prediction performance. Our method learns attention-based embeddings for each triplet, taking into account the influence of one-hop or potentially multi-hop neighbourhood entities. Second, we apply a nonlinear dynamic projection of head and tail entities into the relational space to capture the nonlinear correlations among entities and relations that arise from complex multi-relational facts. As an extension of TransD, our model introduces only one extra parameter, giving a good trade-off between model complexity and state-of-the-art predictive accuracy. Experimental results show that, compared with state-of-the-art translation-based and neural-network-based methods, our method delivers substantial improvements over the baselines on the MeanRank metric of link prediction, e.g., an improvement of 35.6% over the attention-based graph embedding method KBGAT and an improvement of 64% over the translational method TransMS on the WN18 dataset, with comparable performance on the Hits@10 metric.
KW - Attention mechanism
KW - Knowledge graph embedding
KW - Link prediction
KW - Translation-based methods
UR - https://www.scopus.com/pages/publications/85111037779
U2 - 10.1007/978-3-030-75768-7_12
DO - 10.1007/978-3-030-75768-7_12
M3 - Conference contribution
AN - SCOPUS:85111037779
SN - 9783030757670
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 141
EP - 154
BT - Advances in Knowledge Discovery and Data Mining - 25th Pacific-Asia Conference, PAKDD 2021, Proceedings
A2 - Karlapalem, Kamal
A2 - Cheng, Hong
A2 - Ramakrishnan, Naren
A2 - Agrawal, R. K.
A2 - Reddy, P. Krishna
A2 - Srivastava, Jaideep
A2 - Chakraborty, Tanmoy
PB - Springer Science and Business Media Deutschland GmbH
T2 - 25th Pacific-Asia Conference on Knowledge Discovery and Data Mining, PAKDD 2021
Y2 - 11 May 2021 through 14 May 2021
ER -