{ "id": "2405.17163", "version": "v2", "published": "2024-05-27T13:36:50.000Z", "updated": "2025-02-13T16:32:55.000Z", "title": "Port-Hamiltonian Architectural Bias for Long-Range Propagation in Deep Graph Networks", "authors": [ "Simon Heilig", "Alessio Gravina", "Alessandro Trenta", "Claudio Gallicchio", "Davide Bacciu" ], "comment": "Accepted at ICLR 2025 (https://openreview.net/forum?id=03EkqSCKuO)", "categories": [ "cs.LG", "cs.SY", "eess.SY" ], "abstract": "The dynamics of information diffusion within graphs is a critical open issue that heavily influences graph representation learning, especially when considering long-range propagation. This calls for principled approaches that control and regulate the degree of propagation and dissipation of information throughout the neural flow. Motivated by this, we introduce (port-)Hamiltonian Deep Graph Networks, a novel framework that models neural information flow in graphs by building on the laws of conservation of Hamiltonian dynamical systems. We reconcile under a single theoretical and practical framework both non-dissipative long-range propagation and non-conservative behaviors, introducing tools from mechanical systems to gauge the equilibrium between the two components. Our approach can be applied to general message-passing architectures, and it provides theoretical guarantees on information conservation in time. Empirical results prove the effectiveness of our port-Hamiltonian scheme in pushing simple graph convolutional architectures to state-of-the-art performance in long-range benchmarks.", "revisions": [ { "version": "v2", "updated": "2025-02-13T16:32:55.000Z" } ], "analyses": { "keywords": [ "deep graph networks", "long-range propagation", "port-hamiltonian architectural bias", "simple graph convolutional architectures", "influences graph representation learning" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }