arXiv Analytics

arXiv:2304.04234 [cs.LG]

Variational operator learning: A unified paradigm for training neural operators and solving partial differential equations

Tengfei Xu, Dachuan Liu, Peng Hao, Bo Wang

Published 2023-04-09, Version 1

Based on the variational method, we propose a novel paradigm that provides a unified framework for training neural operators and solving partial differential equations (PDEs) in variational form, which we refer to as variational operator learning (VOL). We first derive the functional approximation of the system from the nodal solution predicted by the neural operator, then apply the variational operation via automatic differentiation, constructing a forward-backward propagation loop that yields the residual of the linear system. In every iteration, one or several update steps of the steepest descent method (SD) or the conjugate gradient method (CG) serve as a cheap yet effective update for training the neural operator. Experimental results show that the proposed VOL can learn a variety of solution operators for PDEs in steady heat transfer and variable-stiffness elasticity with satisfactory accuracy and small error. VOL achieves nearly label-free training: only five to ten labels are used for the output distribution-shift session in all experiments. Generalization benefits of VOL are investigated and discussed.
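The core training signal described above can be sketched in NumPy. This is an illustration under simplifying assumptions, not the authors' implementation: here the stiffness matrix `K` and load vector `f` are assumed to be available explicitly (in the paper the residual is obtained by automatic differentiation of the variational form rather than from an assembled matrix). Starting from the operator's prediction, a few cheap SD or CG steps produce an improved target, which could then serve as a pseudo-label for the network.

```python
import numpy as np

def vol_update(u_pred, K, f, n_steps=5, method="sd"):
    """Sketch of a VOL-style target construction: refine the neural
    operator's prediction u_pred with a few iterative-solver steps on
    the linear system K u = f (K assumed symmetric positive definite)."""
    u = u_pred.copy()
    r = f - K @ u                       # residual of the linear system
    if method == "sd":                  # steepest descent updates
        for _ in range(n_steps):
            alpha = (r @ r) / (r @ (K @ r))   # exact line search
            u = u + alpha * r
            r = f - K @ u
    else:                               # conjugate gradient updates
        p = r.copy()
        for _ in range(n_steps):
            Kp = K @ p
            alpha = (r @ r) / (p @ Kp)
            u = u + alpha * p
            r_new = r - alpha * Kp
            beta = (r_new @ r_new) / (r @ r)
            p = r_new + beta * p
            r = r_new
    return u
```

In a training loop, the refined solution would be treated as a constant (no gradient flows through the solver steps), and the operator is pulled toward it, e.g. `loss = mean((u_pred - vol_update(u_pred, K, f))**2)`; this is the "cheap yet effective update" the abstract refers to.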

Related articles: Most relevant | Search more
arXiv:2401.03492 [cs.LG] (Published 2024-01-07)
Neural Networks with Kernel-Weighted Corrective Residuals for Solving Partial Differential Equations
arXiv:2301.01104 [cs.LG] (Published 2023-01-03)
KoopmanLab: A PyTorch module of Koopman neural operator family for solving partial differential equations
arXiv:2210.12177 [cs.LG] (Published 2022-10-21)
An unsupervised latent/output physics-informed convolutional-LSTM network for solving partial differential equations using peridynamic differential operator