arXiv Analytics

arXiv:2502.08918 [cs.LG]

CLEAR: Cluster-based Prompt Learning on Heterogeneous Graphs

Feiyang Wang, Zhongbao Zhang, Junda Ye, Li Sun, Jianzhong Qi

Published 2025-02-13 (Version 1)

Prompt learning has attracted increasing attention in the graph domain as a means to bridge the gap between pretext and downstream tasks. Existing studies on heterogeneous graph prompting typically use feature prompts to modify node features for specific downstream tasks, but they take no account of the structure of heterogeneous graphs. Such designs also overlook information from meta-paths, which are core to learning the high-order semantics of heterogeneous graphs. To address these issues, we propose CLEAR, a Cluster-based prompt LEARning model on heterogeneous graphs. We present cluster prompts that reformulate downstream tasks as heterogeneous graph reconstruction. In this way, we align the pretext and downstream tasks so that they share the same training objective. Additionally, our cluster prompts are injected into the meta-paths, so that the prompt learning process incorporates the high-order semantic information entailed by the meta-paths. Extensive experiments on downstream tasks confirm the superiority of CLEAR: it consistently outperforms state-of-the-art models, achieving up to 5% improvement in F1 score on node classification.
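The abstract describes two ideas: deriving meta-path based connectivity in a heterogeneous graph, and injecting learnable cluster prompts into node features before aggregation so the downstream task can be cast as graph reconstruction. The toy sketch below illustrates these mechanics only; the graph, the feature sizes, the cluster assignment, and the reconstruction score are all hypothetical stand-ins and not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy heterogeneous graph: 4 authors, 3 papers,
# with a random author-paper incidence matrix.
A_ap = (rng.random((4, 3)) > 0.5).astype(float)

# Meta-path A-P-A: authors linked through co-authored papers.
A_apa = A_ap @ A_ap.T
np.fill_diagonal(A_apa, 0)

# Node features and cluster prompts (in a real model the prompts
# would be learnable parameters; here they are fixed for illustration).
X = rng.standard_normal((4, 8))
n_clusters = 2
prompts = rng.standard_normal((n_clusters, 8))
assign = rng.integers(0, n_clusters, size=4)  # hard cluster assignment

# Inject the cluster prompt into each node's features.
X_prompted = X + prompts[assign]

# Aggregate over meta-path neighbours (simple mean aggregation).
deg = A_apa.sum(axis=1, keepdims=True)
deg[deg == 0] = 1.0
H = (A_apa @ X_prompted) / deg

# Reconstruction-style objective sketch: score candidate links
# between authors from the inner product of their embeddings.
scores = H @ H.T
print(scores.shape)
```

The inner-product link scores stand in for the graph-reconstruction objective; in the paper this objective is shared between the pretext and downstream tasks, which is the alignment the abstract refers to.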

Related articles:
arXiv:2211.03782 [cs.LG] (Published 2022-11-07)
On minimal variations for unsupervised representation learning
arXiv:2309.17002 [cs.LG] (Published 2023-09-29)
Understanding and Mitigating the Label Noise in Pre-training on Downstream Tasks
Hao Chen et al.
arXiv:2005.10039 [cs.LG] (Published 2020-05-20)
The Effects of Randomness on the Stability of Node Embeddings