arXiv Analytics

arXiv:2501.06444 [cs.LG]

On the Computational Capability of Graph Neural Networks: A Circuit Complexity Bound Perspective

Xiaoyu Li, Yingyu Liang, Zhenmei Shi, Zhao Song, Wei Wang, Jiahao Zhang

Published 2025-01-11, Version 1

Graph Neural Networks (GNNs) have become the standard approach for learning and reasoning over relational data, leveraging the message-passing mechanism that iteratively propagates node embeddings through graph structures. While GNNs have achieved significant empirical success, their theoretical limitations remain an active area of research. Existing studies primarily focus on characterizing GNN expressiveness through Weisfeiler-Lehman (WL) graph isomorphism tests. In this paper, we take a fundamentally different approach by exploring the computational limitations of GNNs through the lens of circuit complexity. Specifically, we analyze the circuit complexity of common GNN architectures and prove that under constraints of constant-depth layers, linear or sublinear embedding sizes, and polynomial precision, GNNs cannot solve key problems such as graph connectivity and graph isomorphism unless $\mathsf{TC}^0 = \mathsf{NC}^1$. These results reveal the intrinsic expressivity limitations of GNNs behind their empirical success and introduce a novel framework for analyzing GNN expressiveness that can be extended to a broader range of GNN models and graph decision problems.
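The message-passing mechanism the abstract refers to can be illustrated concretely. Below is a minimal NumPy sketch of a generic sum-aggregation GNN layer applied for a constant number of rounds — the constant-depth regime the paper analyses. The function name `gnn_layer` and the specific aggregation/update choices are illustrative assumptions, not the exact architectures studied in the paper.

```python
import numpy as np

def gnn_layer(A, H, W, b):
    """One message-passing round: each node sums its neighbours'
    embeddings, then applies a shared linear map followed by ReLU.

    A: (n, n) adjacency matrix (self-loops added so a node keeps its own state)
    H: (n, d) node embeddings
    W: (d, d) weight matrix, b: (d,) bias
    """
    messages = A @ H                          # sum-aggregate over neighbours
    return np.maximum(messages @ W + b, 0.0)  # shared update + ReLU

# Toy path graph 0-1-2-3, with self-loops on the diagonal
A = np.eye(4)
for u, v in [(0, 1), (1, 2), (2, 3)]:
    A[u, v] = A[v, u] = 1.0

rng = np.random.default_rng(0)
H = rng.standard_normal((4, 8))        # initial embeddings, dimension d = 8
W = rng.standard_normal((8, 8)) * 0.1  # shared layer weights
b = np.zeros(8)

# A constant number of layers, independent of graph size
for _ in range(3):
    H = gnn_layer(A, H, W, b)

print(H.shape)  # (4, 8)
```

After k rounds, each node's embedding depends only on its k-hop neighbourhood; this locality, combined with constant depth and polynomial precision, is what places such computations inside $\mathsf{TC}^0$ in the paper's analysis.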

Related articles:
arXiv:2005.06649 [cs.LG] (Published 2020-05-13)
How hard is graph isomorphism for graph neural networks?
arXiv:2307.00134 [cs.LG] (Published 2023-06-30)
Generalization Limits of Graph Neural Networks in Identity Effects Learning
arXiv:2002.02046 [cs.LG] (Published 2020-02-06)
Supervised Learning on Relational Databases with Graph Neural Networks