PSP: Pre-training and Structure Prompt Tuning for Graph Neural Networks

  • Qingqing Ge
  • Zeyuan Zhao
  • Yiding Liu
  • Anfeng Cheng
  • Xiang Li*
  • Shuaiqiang Wang
  • Dawei Yin

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

8 Scopus citations

Abstract

Graph Neural Networks (GNNs) are powerful in learning the semantics of graph data. Recently, a new paradigm, "pre-train & prompt", has shown promising results in adapting GNNs to various tasks with less supervised data. The success of such a paradigm can be attributed to the more consistent objectives of pre-training and task-oriented prompt tuning, where the pre-trained knowledge can be effectively transferred to downstream tasks. Most existing methods are based on the class prototype vector framework. However, in few-shot scenarios with little labeled data, class prototype vectors are difficult to construct or learn accurately. Meanwhile, the structure information of the graph is usually exploited during pre-training for learning node representations, but neglected in the prompt tuning stage for learning more accurate prototype vectors. In addition, existing methods generally ignore the impact of heterophilous neighborhoods on node representations and are not suitable for heterophilous graphs. To bridge these gaps, we propose a novel pre-training and structure prompt tuning framework for GNNs, namely PSP, which consistently exploits structure information in both the pre-training and prompt tuning stages. In particular, PSP 1) employs dual-view contrastive learning to align the latent semantic spaces of node attributes and graph structure, and 2) incorporates structure information in the prompted graph to construct more accurate prototype vectors and elicit more pre-trained knowledge in prompt tuning. We conduct extensive experiments on node classification and graph classification tasks to evaluate the effectiveness of PSP. We show that PSP can lead to superior performance in few-shot scenarios on both homophilous and heterophilous graphs. The implemented code is available at https://github.com/gqq1210/PSP.
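To make the first component of the abstract concrete, the following is a minimal, illustrative sketch of a dual-view contrastive objective that aligns attribute-view and structure-view node embeddings with an InfoNCE-style loss. The function name dual_view_contrastive_loss, the tensors z_attr and z_struct, and the temperature value are hypothetical placeholders chosen for illustration; they are not the authors' implementation, which is available in the linked repository.

    import torch
    import torch.nn.functional as F

    def dual_view_contrastive_loss(z_attr, z_struct, temperature=0.5):
        """Illustrative InfoNCE-style loss aligning two views of the same nodes:
        each node's attribute-view and structure-view embeddings form a positive
        pair, and all other nodes in the batch serve as negatives."""
        z_attr = F.normalize(z_attr, dim=1)      # (N, d) attribute-view embeddings
        z_struct = F.normalize(z_struct, dim=1)  # (N, d) structure-view embeddings
        logits = z_attr @ z_struct.t() / temperature       # (N, N) cosine similarities
        labels = torch.arange(z_attr.size(0), device=z_attr.device)
        # symmetric cross-entropy over both matching directions
        return 0.5 * (F.cross_entropy(logits, labels) +
                      F.cross_entropy(logits.t(), labels))

    # Toy usage: 8 nodes with 16-dimensional embeddings from two hypothetical encoders,
    # e.g. an MLP over node attributes and a GNN over the graph structure.
    z_attr = torch.randn(8, 16)
    z_struct = torch.randn(8, 16)
    loss = dual_view_contrastive_loss(z_attr, z_struct)

Treating the two views of each node as a positive pair and all other nodes as negatives is one common way to pull the latent semantic spaces of node attributes and graph structure into alignment.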

Original language: English
Title of host publication: Machine Learning and Knowledge Discovery in Databases. Research Track - European Conference, ECML PKDD 2024, Proceedings
Editors: Albert Bifet, Jesse Davis, Tomas Krilavicius, Meelis Kull, Eirini Ntoutsi, Indre Žliobaite
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 423-439
Number of pages: 17
ISBN (Print): 9783031703614
DOIs
State: Published - 2024
Event: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2024 - Vilnius, Lithuania
Duration: 9 Sep 2024 - 13 Sep 2024

Publication series

Name: Lecture Notes in Computer Science
Volume: 14945 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, ECML PKDD 2024
Country/Territory: Lithuania
City: Vilnius
Period: 9/09/24 - 13/09/24

Keywords

  • Few-shot
  • Graph Neural Networks
  • Pre-training
  • Prompt
