TY - GEN
T1 - TiRGN: Time-Guided Recurrent Graph Network with Local-Global Historical Patterns for Temporal Knowledge Graph Reasoning
T2 - 31st International Joint Conference on Artificial Intelligence, IJCAI 2022
AU - Li, Yujia
AU - Sun, Shiliang
AU - Zhao, Jing
N1 - Publisher Copyright:
© 2022 International Joint Conferences on Artificial Intelligence. All rights reserved.
PY - 2022
Y1 - 2022
N2 - Temporal knowledge graphs (TKGs) have been widely used in various fields that model the dynamics of facts along the timeline. In the extrapolation setting of TKG reasoning, since facts happening in the future are entirely unknowable, insight into history is the key to predicting future facts. However, it is still a great challenge for existing models as they hardly learn the characteristics of historical events adequately. From the perspective of historical development laws, comprehensively considering the sequential, repetitive, and cyclical patterns of historical facts is conducive to predicting future facts. To this end, we propose a novel representation learning model for TKG reasoning, namely TiRGN, a time-guided recurrent graph network with local-global historical patterns. Specifically, TiRGN uses a local recurrent graph encoder network to model the historical dependency of events at adjacent timestamps and uses the global history encoder network to collect repeated historical facts. After the trade-off between the two encoders, the final inference is performed by a decoder with periodicity. We use six benchmark datasets to evaluate the proposed method. The experimental results show that TiRGN outperforms the state-of-the-art TKG reasoning methods in most cases.
UR - https://www.scopus.com/pages/publications/85137854207
U2 - 10.24963/ijcai.2022/299
DO - 10.24963/ijcai.2022/299
M3 - Conference contribution
AN - SCOPUS:85137854207
T3 - IJCAI International Joint Conference on Artificial Intelligence
SP - 2152
EP - 2158
BT - Proceedings of the 31st International Joint Conference on Artificial Intelligence, IJCAI 2022
A2 - De Raedt, Luc
PB - International Joint Conferences on Artificial Intelligence
Y2 - 23 July 2022 through 29 July 2022
ER -