arXiv Analytics

arXiv:2305.14703 [math.NA]

Generative diffusion learning for parametric partial differential equations

Ting Wang, Petr Plechac, Jaroslaw Knap

Published 2023-05-24 (version 1)

We develop a class of data-driven generative models that approximate the solution operator for parameter-dependent partial differential equations (PDEs). We propose a novel probabilistic formulation of the operator learning problem, based on recently developed denoising diffusion probabilistic models (DDPM), in order to learn the input-to-output mapping between problem parameters and solutions of the PDE. To achieve this goal, we adapt DDPM to the supervised-learning setting, in which the solution operator of the PDE is represented by a class of conditional distributions. The probabilistic formulation combined with DDPM enables automatic quantification of confidence intervals for the learned solutions. Furthermore, the framework is directly applicable to learning from noisy data sets. We compare the computational performance of the developed method with that of the Fourier Neural Operator (FNO). Our results show that our method achieves comparable accuracy and recovers the noise magnitude when applied to data sets with outputs corrupted by additive noise.
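The DDPM machinery the abstract builds on rests on a closed-form forward-noising process and a noise-prediction loss; in the conditional (supervised) setting, the denoiser additionally receives the PDE parameters. The sketch below illustrates only this forward process and loss with a toy 1-D "solution"; the linear noise schedule, the variable names, and the toy data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

# Standard DDPM forward process (assumption: schedule values are
# illustrative, not taken from the paper).
T = 1000
betas = np.linspace(1e-4, 0.02, T)   # linear noise schedule beta_t
alphas = 1.0 - betas
alpha_bars = np.cumprod(alphas)      # \bar{alpha}_t = prod_{s<=t} (1 - beta_s)

def q_sample(x0, t, rng):
    """Draw x_t ~ q(x_t | x_0) in closed form:
    x_t = sqrt(abar_t) * x_0 + sqrt(1 - abar_t) * eps, eps ~ N(0, I)."""
    eps = rng.standard_normal(x0.shape)
    xt = np.sqrt(alpha_bars[t]) * x0 + np.sqrt(1.0 - alpha_bars[t]) * eps
    return xt, eps

def ddpm_loss(eps_pred, eps):
    """Simplified DDPM training objective: mean-squared error between the
    injected noise and the model's prediction. In the conditional setting,
    eps_pred would come from a network eps_theta(x_t, t, mu), where mu is
    the PDE parameter (a hypothetical network, not shown here)."""
    return np.mean((eps_pred - eps) ** 2)

rng = np.random.default_rng(0)
x0 = np.sin(np.linspace(0.0, np.pi, 64))   # toy 1-D "PDE solution"
xt, eps = q_sample(x0, T - 1, rng)
# At t = T-1 the signal coefficient sqrt(abar_t) is tiny, so x_T is
# essentially pure Gaussian noise -- the starting point for sampling.
print(alpha_bars[-1])
```

Sampling then runs the learned reverse chain from pure noise, conditioned on the parameters; repeating it yields an ensemble of solutions from which the confidence intervals mentioned in the abstract can be read off empirically.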

Related articles: Most relevant | Search more
arXiv:2207.06145 [math.NA] (Published 2022-07-13)
On the matching of eigensolutions to parametric partial differential equations
arXiv:2405.07139 [math.NA] (Published 2024-05-12)
Reduced Krylov Basis Methods for Parametric Partial Differential Equations
arXiv:1911.08954 [math.NA] (Published 2019-11-20)
Basic Ideas and Tools for Projection-Based Model Reduction of Parametric Partial Differential Equations