{
  "id": "1909.03211",
  "version": "v1",
  "published": "2019-09-07T08:14:41.000Z",
  "updated": "2019-09-07T08:14:41.000Z",
  "title": "Measuring and Relieving the Over-smoothing Problem for Graph Neural Networks from the Topological View",
  "authors": [
    "Deli Chen",
    "Yankai Lin",
    "Wei Li",
    "Peng Li",
    "Jie Zhou",
    "Xu Sun"
  ],
  "comment": "8 pages, 7 Figures and 5 Tables",
  "categories": [
    "cs.LG",
    "cs.SI",
    "stat.ML"
  ],
  "abstract": "Graph Neural Networks (GNNs) have achieved promising performance on a wide range of graph-based tasks. Despite their success, one severe limitation of GNNs is the over-smoothing issue (indistinguishable representations of nodes in different classes). In this work, we present a systematic and quantitative study of the over-smoothing issue of GNNs. First, we introduce two quantitative metrics, MAD and MADGap, to measure the smoothness and over-smoothness of graph node representations, respectively. Then, we verify that smoothing is intrinsic to GNNs and that the critical factor leading to over-smoothness is the low information-to-noise ratio of the messages received by the nodes, which is partially determined by the graph topology. Finally, we propose two methods to alleviate the over-smoothing issue from the topological view: (1) MADReg, which adds a MADGap-based regularizer to the training objective; (2) AdaGraph, which optimizes the graph topology based on the model predictions. Extensive experiments on 7 widely-used graph datasets with 10 typical GNN models show that the two proposed methods are effective in relieving the over-smoothing issue, thus improving the performance of various GNN models.",
  "revisions": [
    {
      "version": "v1",
      "updated": "2019-09-07T08:14:41.000Z"
    }
  ],
  "analyses": {
    "keywords": [
      "graph neural networks",
      "topological view",
      "over-smoothing problem",
      "over-smoothing issue",
      "graph topology"
    ],
    "note": {
      "typesetting": "TeX",
      "pages": 8,
      "language": "en",
      "license": "arXiv",
      "status": "editable"
    }
  }
}