arXiv:2405.17163 [cs.LG]

Port-Hamiltonian Architectural Bias for Long-Range Propagation in Deep Graph Networks

Simon Heilig, Alessio Gravina, Alessandro Trenta, Claudio Gallicchio, Davide Bacciu

Published 2024-05-27, updated 2025-02-13 (version 2)

The dynamics of information diffusion within graphs is a critical open issue that heavily influences graph representation learning, especially when considering long-range propagation. This calls for principled approaches that control and regulate the degree of propagation and dissipation of information throughout the neural flow. Motivated by this, we introduce (port-)Hamiltonian Deep Graph Networks, a novel framework that models neural information flow in graphs by building on the conservation laws of Hamiltonian dynamical systems. We reconcile, under a single theoretical and practical framework, both non-dissipative long-range propagation and non-conservative behaviors, introducing tools from mechanical systems to gauge the equilibrium between the two components. Our approach can be applied to general message-passing architectures, and it provides theoretical guarantees on information conservation over time. Empirical results demonstrate the effectiveness of our port-Hamiltonian scheme in pushing simple graph convolutional architectures to state-of-the-art performance on long-range benchmarks.
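To give a feel for the idea, the following is a minimal sketch (not the paper's implementation): node states are split into a "position" q and a "momentum" p and evolved under a quadratic graph Hamiltonian H(q, p) = ½‖p‖² + ½ qᵀLq, where L is the graph Laplacian. A symplectic (leapfrog) integrator keeps the flow conservative, while an optional damping term stands in for a dissipative "port". The choice of H, the function names, and the damping form are all illustrative assumptions.

```python
import numpy as np

def graph_laplacian(adj):
    """Combinatorial Laplacian L = D - A of an undirected graph."""
    deg = np.diag(adj.sum(axis=1))
    return deg - adj

def hamiltonian(q, p, L):
    """Total energy; (near-)conserved under the undamped, conservative flow."""
    return 0.5 * p @ p + 0.5 * q @ L @ q

def leapfrog_step(q, p, L, dt, damping=0.0):
    """One leapfrog step of dq/dt = p, dp/dt = -Lq - damping*p.
    With damping > 0 a simple dissipative 'port' term is added,
    breaking energy conservation (illustrative assumption)."""
    p = p - 0.5 * dt * (L @ q + damping * p)
    q = q + dt * p
    p = p - 0.5 * dt * (L @ q + damping * p)
    return q, p

# Tiny path graph on 3 nodes with features initialized far from equilibrium.
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
L = graph_laplacian(A)
q = np.array([1.0, 0.0, -1.0])
p = np.zeros(3)

E0 = hamiltonian(q, p, L)
for _ in range(1000):
    q, p = leapfrog_step(q, p, L, dt=0.01)
E1 = hamiltonian(q, p, L)
# With damping=0 the energy drift stays small (information is preserved
# over many steps); with damping > 0 the energy decays monotonically.
```

The conservative case is what enables long-range propagation (node information is not dissipated as depth grows), while the port/damping term recovers non-conservative behavior when the task calls for it.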

Related articles:
arXiv:2104.08060 [cs.LG] (Published 2021-04-16)
MEG: Generating Molecular Counterfactual Explanations for Deep Graph Networks
arXiv:2410.10464 [cs.LG] (Published 2024-10-14, updated 2024-10-15)
Information propagation dynamics in Deep Graph Networks
arXiv:2312.16560 [cs.LG] (Published 2023-12-27)
Adaptive Message Passing: A General Framework to Mitigate Oversmoothing, Oversquashing, and Underreaching