arXiv Analytics

arXiv:1906.03707 [cs.LG]

Redundancy-Free Computation Graphs for Graph Neural Networks

Zhihao Jia, Sina Lin, Rex Ying, Jiaxuan You, Jure Leskovec, Alex Aiken

Published 2019-06-09 (Version 1)

Graph Neural Networks (GNNs) are based on repeated aggregations of information across nodes' neighbors in a graph. However, because different nodes often share common neighbors, the same aggregations are computed repeatedly, making this process inefficient. We propose Hierarchically Aggregated computation Graphs (HAGs), a new GNN graph representation that explicitly avoids redundancy by managing intermediate aggregation results hierarchically, eliminating repeated computations and unnecessary data transfers in GNN training and inference. We introduce an accurate cost function to quantitatively evaluate the runtime performance of different HAGs and use a novel HAG search algorithm to find optimized HAGs. Experiments show that the HAG representation significantly outperforms the standard GNN graph representation, increasing end-to-end training throughput by up to 2.8x and reducing the aggregations and data transfers in GNN training by up to 6.3x and 5.6x, respectively, while maintaining the original model accuracy.
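The core idea can be illustrated with a small sketch (this is an assumption-laden toy, not the paper's implementation): when several nodes share a neighbor set, a HAG-style scheme aggregates that set once into an intermediate result and reuses it, whereas the standard representation re-aggregates it per node. Sum aggregation and plain Python lists as feature vectors are assumed for simplicity.

```python
# Toy sketch of HAG-style redundancy elimination (hypothetical helper names,
# not the paper's API). Sum aggregation over raw feature vectors is assumed.

def vec_add(a, b):
    return [x + y for x, y in zip(a, b)]

def aggregate_naive(neighbors, feats):
    """Standard GNN aggregation: every node sums all of its neighbors."""
    dim = len(next(iter(feats.values())))
    out = {}
    for v, nbrs in neighbors.items():
        acc = [0.0] * dim
        for u in nbrs:
            acc = vec_add(acc, feats[u])
        out[v] = acc
    return out

def aggregate_hag(neighbors, feats, shared):
    """HAG-style aggregation: precompute the aggregate of a shared neighbor
    set once (an intermediate 'aggregation node'), then reuse it for every
    node whose neighborhood contains that set."""
    dim = len(next(iter(feats.values())))
    shared_agg = [0.0] * dim
    for u in shared:                      # computed once, not per node
        shared_agg = vec_add(shared_agg, feats[u])
    out = {}
    for v, nbrs in neighbors.items():
        if shared <= set(nbrs):
            acc = list(shared_agg)        # reuse the intermediate result
            rest = [u for u in nbrs if u not in shared]
        else:
            acc = [0.0] * dim
            rest = nbrs
        for u in rest:
            acc = vec_add(acc, feats[u])
        out[v] = acc
    return out
```

For example, with `neighbors = {'u': ['a','b','c'], 'v': ['a','b','d']}` and `shared = {'a','b'}`, both functions produce identical aggregates, but the HAG version performs fewer vector additions because the `{a, b}` sum is computed once instead of twice; the paper's search algorithm can be thought of as choosing which such shared sets to materialize.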

Related articles:
arXiv:2006.15437 [cs.LG] (Published 2020-06-27)
GPT-GNN: Generative Pre-Training of Graph Neural Networks
arXiv:1910.10682 [cs.LG] (Published 2019-10-23)
Feature Selection and Extraction for Graph Neural Networks
arXiv:2005.06649 [cs.LG] (Published 2020-05-13)
How hard is graph isomorphism for graph neural networks?