arXiv Analytics

arXiv:2305.09420 [math.OC]

Optimizing over trained GNNs via symmetry breaking

Shiqiang Zhang, Juan S Campos Salazar, Christian Feldmann, David Walz, Frederik Sandfort, Miriam Mathea, Calvin Tsay, Ruth Misener

Published 2023-05-16 (Version 1)

Optimization over trained machine learning models has applications including verification, minimizing neural acquisition functions, and integrating a trained surrogate into a larger decision-making problem. This paper formulates and solves optimization problems constrained by trained graph neural networks (GNNs). To circumvent the symmetry issue caused by graph isomorphism, we propose two types of symmetry-breaking constraints: one indexing a node 0 and one indexing the remaining nodes by lexicographically ordering their neighbor sets. To guarantee that adding these constraints will not remove all symmetric solutions, we construct a graph indexing algorithm and prove that the resulting graph indexing satisfies the proposed symmetry-breaking constraints. For the classical GNN architectures considered in this paper, optimizing over a GNN with a fixed graph is equivalent to optimizing over a dense neural network. Thus, we study the case where the input graph is not fixed, implying that each edge is a decision variable, and develop two mixed-integer optimization formulations. To test our symmetry-breaking strategies and optimization formulations, we consider an application in molecular design.
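
The idea of indexing one node as node 0 and ordering the remaining nodes by their neighbor sets can be illustrated with a small sketch. This is only an illustrative greedy indexing under our own assumptions (the function name `greedy_index` and the tie-breaking rule are not from the paper): starting from a designated node 0, it repeatedly appends the node whose set of already-indexed neighbors is lexicographically smallest, so the resulting order is invariant under relabelings of the not-yet-indexed nodes.

```python
def neighbor_key(adj, order, v):
    """Positions (within the partial order) of v's already-indexed neighbors."""
    pos = {u: i for i, u in enumerate(order)}
    return sorted(pos[u] for u in adj[v] if u in pos)

def greedy_index(adj, start=0):
    """Greedy illustrative indexing: fix `start` as node 0, then repeatedly
    append the node whose indexed-neighbor set is lexicographically smallest,
    preferring nodes that already have an indexed neighbor (ties broken by
    node label)."""
    order = [start]
    remaining = sorted(v for v in adj if v != start)
    while remaining:
        best = min(
            remaining,
            key=lambda v: (not neighbor_key(adj, order, v),  # prefer connected
                           neighbor_key(adj, order, v),       # lexicographic order
                           v),                                # deterministic tie-break
        )
        order.append(best)
        remaining.remove(best)
    return order

# A 4-node path 0-1-2-3: the greedy indexing follows the path.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(greedy_index(adj))  # [0, 1, 2, 3]
```

Because any two isomorphic labelings of the same graph yield the same sequence of neighbor-set keys, constraints requiring the chosen indexing to be of this canonical form exclude symmetric duplicates without excluding every representative of an isomorphism class, which is the role the paper's symmetry-breaking constraints play in the mixed-integer formulations.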

Comments: 10 main pages, 27 with appendix, 9 figures, 7 tables
Categories: math.OC