arXiv Analytics


arXiv:2202.10156 [cs.LG]

1-WL Expressiveness Is (Almost) All You Need

Markus Zopf

Published 2022-02-21, Version 1

It has been shown that message passing neural networks (MPNNs), a popular family of neural networks for graph-structured data, are at most as expressive as the first-order Weisfeiler-Leman (1-WL) graph isomorphism test, which has motivated the development of more expressive architectures. In this work, we analyze whether this limited expressiveness is actually a limiting factor for MPNNs and other WL-based models on standard graph datasets. Interestingly, we find that the expressiveness of 1-WL is sufficient to identify almost all graphs in most datasets. Moreover, we find that the resulting classification accuracy upper bounds are often close to 100%. Furthermore, we find that simple WL-based neural networks and several MPNNs can fit several of these datasets. In sum, we conclude that the performance of WL-based models and MPNNs is not limited by their expressiveness in practice.
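To make the expressiveness bound concrete, the following is a minimal sketch of 1-WL color refinement, the iterative relabeling procedure the abstract refers to. Two graphs whose final color histograms match are 1-WL-indistinguishable, and hence cannot be told apart by any standard MPNN either. The function names (`wl_colors`, `wl_indistinguishable`) and the adjacency-dict representation are illustrative choices, not from the paper.

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """Run 1-WL color refinement on a graph given as {node: [neighbors]}."""
    # Start with a uniform coloring (no node features assumed).
    colors = {v: 0 for v in adj}
    for _ in range(rounds):
        # A node's new color is determined by its current color together
        # with the multiset of its neighbors' colors.
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures back into small integer color labels.
        palette = {}
        colors = {v: palette.setdefault(sig, len(palette))
                  for v, sig in signatures.items()}
    # The color histogram is the graph's 1-WL summary.
    return Counter(colors.values())

def wl_indistinguishable(adj1, adj2, rounds=3):
    """True if 1-WL cannot distinguish the two graphs."""
    return wl_colors(adj1, rounds) == wl_colors(adj2, rounds)
```

For example, two disjoint triangles and a single 6-cycle are the classic 1-WL-indistinguishable pair: every node has degree 2, so refinement never separates them, even though the graphs are not isomorphic.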

Related articles: Most relevant | Search more
arXiv:2203.16995 [cs.LG] (Published 2022-03-31)
Message Passing Neural Networks for Hypergraphs
arXiv:2206.01003 [cs.LG] (Published 2022-06-02)
Shortest Path Networks for Graph Property Prediction
arXiv:2302.02941 [cs.LG] (Published 2023-02-06)
On Over-Squashing in Message Passing Neural Networks: The Impact of Width, Depth, and Topology