HeGCL: Advance Self-Supervised Learning in Heterogeneous Graph-Level Representation

  • Gen Shi
  • Yifan Zhu
  • Jian K. Liu
  • Xuesong Li*
  • *Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Representation learning in heterogeneous graphs with massive unlabeled data has attracted great interest. The heterogeneity of graphs not only contains rich information but also raises difficult barriers to designing unsupervised or self-supervised learning (SSL) strategies. Existing methods such as random walk-based approaches depend mainly on the proximity information of neighbors and lack the ability to integrate node features into a higher-level representation. Furthermore, previous self-supervised or unsupervised frameworks are usually designed for node-level tasks; they commonly fail to capture global graph properties and may not perform well in graph-level tasks. Therefore, a label-free framework that can better capture the global properties of heterogeneous graphs is urgently required. In this article, we propose a self-supervised heterogeneous graph neural network (GNN) based on cross-view contrastive learning (HeGCL). The HeGCL presents two views for encoding heterogeneous graphs: the meta-path view and the outline view. Compared with the meta-path view, which provides semantic information, the outline view encodes the complex edge relations and captures graph-level properties by using a nonlocal block. Thus, the HeGCL learns node embeddings by maximizing mutual information (MI) between global and semantic representations coming from the outline and meta-path views, respectively. Experiments on both node-level and graph-level tasks show the superiority of the proposed model over other methods, and further exploration studies also show that the introduction of the nonlocal block brings a significant contribution to graph-level tasks.
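The cross-view MI maximization described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a DGI-style binary cross-entropy estimator of MI, a bilinear discriminator `W`, and row-shuffled corrupted embeddings as negatives; the names `cross_view_mi_loss`, `H`, and `s` are hypothetical placeholders for the meta-path-view node embeddings and the outline-view global summary.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cross_view_mi_loss(H_pos, H_neg, s, W):
    """Binary cross-entropy form of an MI estimator (DGI-style sketch).

    Positive pairs couple a node embedding from the meta-path view with
    the global summary from the outline view; negatives use corrupted
    (here, row-shuffled) node embeddings.
    """
    pos_scores = sigmoid(H_pos @ W @ s)   # shape: (n_nodes,)
    neg_scores = sigmoid(H_neg @ W @ s)
    eps = 1e-12                           # numerical stability in the logs
    return -(np.log(pos_scores + eps).mean()
             + np.log(1.0 - neg_scores + eps).mean())

rng = np.random.default_rng(0)
n, d = 8, 4
H = rng.normal(size=(n, d))            # semantic (meta-path view) embeddings
H_corrupt = H[rng.permutation(n)]      # row-shuffled negatives
s = H.mean(axis=0)                     # stand-in for the global (outline) summary
W = np.eye(d)                          # bilinear discriminator weights
loss = cross_view_mi_loss(H, H_corrupt, s, W)
print(float(loss))
```

In a full model, `H` would come from a GNN encoder over meta-paths, `s` from the nonlocal-block readout of the outline view, and `W` would be learned jointly with the encoders by minimizing this loss.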
Original language: English
Journal: IEEE Transactions on Neural Networks and Learning Systems
Early online date: 25 May 2023
DOIs
Publication status: E-pub ahead of print - 25 May 2023

Bibliographical note

Funding:
This work was supported in part by the National Natural Science Foundation of China under Grant 62071049 and Grant 61801026, in part by the Beijing Natural Science Foundation Project under Grant 4222018, and in part by the China Postdoctoral Science Foundation under Grant 2022M711814.

Keywords

  • Graph neural networks (GNNs)
  • heterogeneous graphs
  • self-supervised learning (SSL)
