TY - JOUR
T1 - Graph Curvature Flow-Based Masked Attention
AU - Chen, Yili
AU - Wan, Zheng
AU - Li, Yangyang
AU - He, Xiao
AU - Wei, Xian
AU - Han, Jun
N1 - Publisher Copyright:
© 2024 American Chemical Society.
PY - 2024/11/11
Y1 - 2024/11/11
N2 - Graph neural networks (GNNs) have revolutionized drug discovery in chemistry and biology, enhancing efficiency and reducing resource demands. However, classical GNNs often struggle to capture long-range dependencies due to challenges like oversmoothing and oversquashing. Graph Transformers address these issues by employing global self-attention mechanisms that allow direct information exchange between any pair of nodes, enabling the modeling of long-range interactions. Despite this, Graph Transformers often face difficulties in capturing the nuanced structural information on graphs. To overcome these challenges, we introduce the CurvFlow-Transformer, a novel graph Transformer model incorporating a curvature flow-based masked attention mechanism. By leveraging a topologically enhanced mask matrix, the attention layer can effectively detect subtle structural differences within graphs, balancing the focus between global mutual information and local structural details of molecules. The CurvFlow-Transformer demonstrates superior performance on the MoleculeNet data set, surpassing several state-of-the-art models across various tasks. Moreover, the model provides unique insights into the relationship between molecular structure and chemical properties by analyzing the attention heat coefficients of individual atoms.
UR - https://www.scopus.com/pages/publications/85207286891
U2 - 10.1021/acs.jcim.4c01616
DO - 10.1021/acs.jcim.4c01616
M3 - Article
C2 - 39443864
AN - SCOPUS:85207286891
SN - 1549-9596
VL - 64
SP - 8153
EP - 8163
JO - Journal of Chemical Information and Modeling
JF - Journal of Chemical Information and Modeling
IS - 21
ER -