arXiv Analytics

arXiv:1909.03211 [cs.LG]

Measuring and Relieving the Over-smoothing Problem for Graph Neural Networks from the Topological View

Deli Chen, Yankai Lin, Wei Li, Peng Li, Jie Zhou, Xu Sun

Published 2019-09-07Version 1

Graph Neural Networks (GNNs) have achieved promising performance on a wide range of graph-based tasks. Despite their success, one severe limitation of GNNs is the over-smoothing issue (indistinguishable representations of nodes in different classes). In this work, we present a systematic and quantitative study of the over-smoothing issue in GNNs. First, we introduce two quantitative metrics, MAD and MADGap, to measure the smoothness and over-smoothness of graph node representations, respectively. Then, we verify that smoothing is intrinsic to GNNs, and that the critical factor leading to over-smoothness is the low information-to-noise ratio of the messages received by the nodes, which is partially determined by the graph topology. Finally, we propose two methods to alleviate the over-smoothing issue from the topological view: (1) MADReg, which adds a MADGap-based regularizer to the training objective; and (2) AdaGraph, which optimizes the graph topology based on the model predictions. Extensive experiments on 7 widely used graph datasets with 10 typical GNN models show that the two proposed methods are effective at relieving the over-smoothing issue, thus improving the performance of various GNN models.
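The MAD metric described above can be sketched numerically. Below is a minimal numpy illustration, assuming (consistently with the metric's name, Mean Average Distance) that MAD averages the pairwise cosine distances between node representations over a set of target pairs given by a mask, and that MADGap contrasts remote pairs against neighboring pairs; the function names and the exact averaging order are illustrative, not the authors' reference implementation.

```python
import numpy as np

def mad(H, mask):
    """Mean Average Distance over masked node pairs.

    H    : (n, d) array of node representations.
    mask : (n, n) 0/1 array selecting which pairs to average over.
    """
    # Pairwise cosine distance: D[i, j] = 1 - cos(h_i, h_j).
    norms = np.linalg.norm(H, axis=1, keepdims=True)
    cos = (H @ H.T) / (norms @ norms.T)
    D = (1.0 - cos) * mask
    # Average the distances per node (over its masked pairs),
    # then average over nodes that have at least one masked pair.
    row_sum = D.sum(axis=1)
    row_cnt = mask.sum(axis=1)
    valid = row_cnt > 0
    return (row_sum[valid] / row_cnt[valid]).mean()

def mad_gap(H, remote_mask, neighbor_mask):
    """MADGap: distance among remote pairs minus distance among
    neighboring pairs. A small (or negative) gap indicates that
    nodes far apart in the graph look as similar as close ones,
    i.e. over-smoothing."""
    return mad(H, remote_mask) - mad(H, neighbor_mask)

# Two orthogonal representations are maximally distant (MAD = 1);
# identical representations collapse to MAD = 0.
H = np.array([[1.0, 0.0], [0.0, 1.0]])
off_diag = np.array([[0, 1], [1, 0]])
print(mad(H, off_diag))                      # 1.0
print(mad(np.ones((3, 2)), 1 - np.eye(3)))   # 0.0
```

Under this reading, a hidden layer whose MADGap shrinks toward zero as depth grows is a quantitative signature of over-smoothing, which is what the MADReg regularizer would then penalize.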

Comments: 8 pages, 7 Figures and 5 Tables
Categories: cs.LG, cs.SI, stat.ML