arXiv:2307.09738 [math.NA]

A discretization-invariant extension and analysis of some deep operator networks

Zecheng Zhang, Wing Tat Leung, Hayden Schaeffer

Published 2023-07-19 (Version 1)

We present a generalized version of the discretization-invariant neural operator and prove that the network is a universal approximator in the operator sense. Moreover, by incorporating additional terms into the architecture, we establish a connection between this discretization-invariant neural operator network and previously proposed operator networks. The discretization-invariance property implies that, during both training and testing, different input functions can be sampled at different sensor locations. Additionally, since the network learns a "basis" for the input and output function spaces, our approach enables the evaluation of input functions on different discretizations. To assess the performance of the proposed discretization-invariant neural operator, we focus on challenging examples from multiscale partial differential equations. Our experimental results indicate that the method achieves lower prediction errors than previous networks and benefits from its discretization-invariance property.
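
As a rough illustration of the discretization-invariance described above, the sketch below projects input-function samples onto learned basis networks via a sensor-averaged quadrature, so the number and placement of sensor locations may change between calls. This is a minimal PyTorch sketch of the general idea, not the authors' architecture; all names (MLP, DiscretizationInvariantOperator, phi, trunk, n_basis) are hypothetical.

# Minimal sketch (hypothetical, not the paper's exact construction) of a
# discretization-invariant operator layer: values u(x_i) at arbitrary
# sensor locations x_i are projected onto learned input "basis" networks,
# so training and testing may use different discretizations.
import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self, d_in, d_out, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_in, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, d_out),
        )

    def forward(self, x):
        return self.net(x)

class DiscretizationInvariantOperator(nn.Module):
    """G(u)(y) ~ sum_k c_k(u) * t_k(y), where c_k is a quadrature-style
    projection of u onto learned input basis functions phi_k."""
    def __init__(self, n_basis=16):
        super().__init__()
        self.phi = MLP(1, n_basis)    # learned input-space basis, evaluated at sensors
        self.trunk = MLP(1, n_basis)  # learned output-space basis, evaluated at queries

    def forward(self, x_sensors, u_vals, y_query):
        # x_sensors: (batch, m, 1) sensor locations; m may differ per call
        # u_vals:    (batch, m)    input function values at those sensors
        # y_query:   (q, 1)        output evaluation points
        basis = self.phi(x_sensors)                          # (batch, m, n_basis)
        # Averaging over sensors makes the coefficients insensitive to the
        # number and placement of sensor locations (Monte Carlo quadrature).
        coeffs = (u_vals.unsqueeze(-1) * basis).mean(dim=1)  # (batch, n_basis)
        out_basis = self.trunk(y_query)                      # (q, n_basis)
        return coeffs @ out_basis.T                          # (batch, q)

# Usage: the same model accepts inputs sampled on different discretizations.
model = DiscretizationInvariantOperator()
x1 = torch.rand(4, 100, 1); u1 = torch.sin(2 * torch.pi * x1.squeeze(-1))
x2 = torch.rand(4, 37, 1);  u2 = torch.sin(2 * torch.pi * x2.squeeze(-1))
y = torch.linspace(0, 1, 50).unsqueeze(-1)
print(model(x1, u1, y).shape, model(x2, u2, y).shape)  # both (4, 50)

The sensor-mean projection is one simple way to realize the "learned basis for the input space" idea; the paper's additional architectural terms connecting this network to earlier operator networks are not reproduced here.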

Related articles:
arXiv:2304.06902 [math.NA] (Published 2023-04-14)
Quantum Algorithms for Multiscale Partial Differential Equations
arXiv:2309.01020 [math.NA] (Published 2023-09-02)
On the training and generalization of deep operator networks
arXiv:2202.04537 [math.NA] (Published 2022-02-09)
Time complexity analysis of quantum difference methods for linear high dimensional and multiscale partial differential equations