{ "id": "2103.13330", "version": "v1", "published": "2021-03-24T16:30:23.000Z", "updated": "2021-03-24T16:30:23.000Z", "title": "Convergence Rate Analysis for Deep Ritz Method", "authors": [ "Chenguang Duan", "Yuling Jiao", "Yanming Lai", "Xiliang Lu", "Zhijian Yang" ], "categories": [ "math.NA", "cs.NA" ], "abstract": "Using deep neural networks to solve PDEs has attracted a lot of attention recently. However, the theoretical understanding of why deep learning methods work lags far behind their empirical success. In this paper, we provide a rigorous numerical analysis of the deep Ritz method (DRM) \\cite{wan11} for second order elliptic equations with Neumann boundary conditions. We establish the first nonasymptotic convergence rate in the $H^1$ norm for DRM using deep networks with $\\mathrm{ReLU}^2$ activation functions. In addition to providing a theoretical justification of DRM, our study also sheds light on how to set the hyper-parameters of depth and width to achieve the desired convergence rate in terms of the number of training samples. Technically, we derive bounds on the approximation error of deep $\\mathrm{ReLU}^2$ networks in the $H^1$ norm and on the Rademacher complexity of the non-Lipschitz composition of the gradient norm and a $\\mathrm{ReLU}^2$ network, both of which are of independent interest.", "revisions": [ { "version": "v1", "updated": "2021-03-24T16:30:23.000Z" } ], "analyses": { "keywords": [ "deep ritz method", "convergence rate analysis", "first nonasymptotic convergence rate", "second order elliptic equations", "neumann boundary conditions" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }