arXiv Analytics

arXiv:2103.13330 [math.NA]

Convergence Rate Analysis for Deep Ritz Method

Chenguang Duan, Yuling Jiao, Yanming Lai, Xiliang Lu, Zhijian Yang

Published 2021-03-24 (Version 1)

Using deep neural networks to solve PDEs has attracted a lot of attention recently. However, the theoretical understanding of why deep learning methods work lags far behind their empirical success. In this paper, we provide a rigorous numerical analysis of the deep Ritz method (DRM) \cite{wan11} for second-order elliptic equations with Neumann boundary conditions. We establish the first nonasymptotic convergence rate in the $H^1$ norm for DRM using deep networks with $\mathrm{ReLU}^2$ activation functions. In addition to providing a theoretical justification of DRM, our study also sheds light on how to set the hyper-parameters of depth and width to achieve the desired convergence rate in terms of the number of training samples. Technically, we derive bounds on the approximation error of deep $\mathrm{ReLU}^2$ networks in the $H^1$ norm and on the Rademacher complexity of the non-Lipschitz composition of the gradient norm and a $\mathrm{ReLU}^2$ network, both of which are of independent interest.
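For readers unfamiliar with DRM, the following is a minimal illustrative sketch (not the authors' code) of the method on a model problem: $-\Delta u + u = f$ in $\Omega = (0,1)^2$ with homogeneous Neumann data, where a $\mathrm{ReLU}^2$-activated network is trained to minimize a Monte Carlo estimate of the Ritz energy $E(u) = \int_\Omega (\tfrac{1}{2}|\nabla u|^2 + \tfrac{1}{2}u^2 - fu)\,dx$. The width, depth, sample size, and optimizer settings below are arbitrary placeholders, not the values prescribed by the paper's analysis.

import math
import torch

torch.manual_seed(0)
dim, width, depth = 2, 32, 3  # illustrative sizes, not the paper's prescription

class ReLU2(torch.nn.Module):
    # ReLU^2 activation: x -> max(0, x)^2
    def forward(self, x):
        return torch.relu(x) ** 2

def make_net(dim, width, depth):
    layers = [torch.nn.Linear(dim, width), ReLU2()]
    for _ in range(depth - 1):
        layers += [torch.nn.Linear(width, width), ReLU2()]
    layers.append(torch.nn.Linear(width, 1))
    return torch.nn.Sequential(*layers)

def f(x):
    # Manufactured source term; the exact solution u(x) = cos(pi x1) cos(pi x2)
    # satisfies the homogeneous Neumann condition on the unit square.
    return (2 * math.pi ** 2 + 1) * torch.cos(math.pi * x[:, :1]) * torch.cos(math.pi * x[:, 1:2])

def ritz_loss(net, n_samples=1024):
    # Monte Carlo estimate of the Ritz energy over uniform samples in (0,1)^2.
    x = torch.rand(n_samples, dim, requires_grad=True)
    u = net(x)
    grad_u = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    energy = 0.5 * (grad_u ** 2).sum(dim=1, keepdim=True) + 0.5 * u ** 2 - f(x) * u
    return energy.mean()

net = make_net(dim, width, depth)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    loss = ritz_loss(net)
    loss.backward()
    opt.step()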

Related articles:
arXiv:2107.14478 [math.NA] (Published 2021-07-30)
Error Analysis of Deep Ritz Methods for Elliptic Equations
arXiv:2312.06980 [math.NA] (Published 2023-12-12)
SPFNO: Spectral operator learning for PDEs with Dirichlet and Neumann boundary conditions
arXiv:2208.09969 [math.NA] (Published 2022-08-21)
Interior over-stabilized enriched Galerkin methods for second order elliptic equations