Self-Supervised Heterogeneous Graph Neural Network with Multi-Scale Meta-Path Contrastive Learning
Abstract
Heterogeneous graph neural networks (HGNNs) exhibit remarkable capabilities in modeling complex structures and multi-semantic information. However, existing methods mainly focus on capturing high-order association patterns between heterogeneous nodes when constructing meta-paths, while they often lack sufficient expressive power for local neighborhood information. This limitation hinders their ability to effectively model both global and local structural relationships. To address this issue, we propose a self-supervised heterogeneous graph neural network (HMMC) based on multi-scale meta-path contrastive learning. The proposed approach introduces a multi-scale meta-path embedding mechanism that jointly captures both local and global structural information. Additionally, we design a cross-view self-supervised contrastive learning framework to optimize representations across multiple views, thereby enhancing the model's capacity to represent heterogeneous graph topological structures. To effectively mitigate the negative sample noise that often interferes with model optimization in traditional contrastive learning methods, we propose a novel star-shaped contrastive loss. This loss function ensures the representational consistency of positive sample pairs by constructing a multi-level optimization strategy involving center nodes, positive samples, and negative samples. Experimental results show that the proposed method outperforms existing state-of-the-art approaches across multiple datasets, achieving performance improvements of 0.5–4.1%, thus fully validating its representational capacity, robustness, and generalizability in heterogeneous graph learning tasks.
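To make the cross-view contrastive objective and the role of center nodes, positive samples, and negative samples more concrete, the following is a minimal illustrative sketch of an InfoNCE-style loss anchored on a center node. It is not the paper's actual star-shaped loss or multi-scale meta-path encoder; all function names, shapes, and the temperature value are assumptions introduced only for illustration.

```python
# Minimal sketch (PyTorch, illustrative only): a cross-view, InfoNCE-style
# contrastive loss anchored on a center node, with explicit positive and
# negative sample sets. The paper's exact star-shaped formulation is not
# specified in the abstract; names and defaults below are assumptions.
import torch
import torch.nn.functional as F

def star_contrastive_loss(center, positives, negatives, temperature=0.5):
    """center:    (d,)  embedding of the anchor node from one view.
    positives: (P, d) embeddings of positive samples, e.g. the same node
               or its meta-path neighbors seen from another view.
    negatives: (N, d) embeddings of negative samples.
    Returns a scalar loss that pulls the center toward every positive
    and pushes it away from the negative set."""
    center = F.normalize(center, dim=-1)
    positives = F.normalize(positives, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    pos_sim = torch.exp(positives @ center / temperature)  # (P,)
    neg_sim = torch.exp(negatives @ center / temperature)  # (N,)

    # Each positive is contrasted against the whole negative set; averaging
    # over positives keeps all of them consistent with the center node.
    loss = -torch.log(pos_sim / (pos_sim + neg_sim.sum())).mean()
    return loss
```

In this sketch the multi-level structure is only hinted at by averaging over several positives per center; the paper's multi-scale meta-path embeddings would supply the actual positive and negative views.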