arXiv:2409.04991 [math.NA]

Error estimates of the Euler's method for stochastic differential equations with multiplicative noise via relative entropy

Lei Li, Mengchao Wang, Yuliang Wang

Published 2024-09-08Version 1

We investigate sharp error estimates of the density under relative entropy (Kullback-Leibler divergence) for the classical Euler-Maruyama discretization of stochastic differential equations (SDEs) with multiplicative noise. The foundation of the proof is estimates of the derivatives of the logarithmic numerical density. The key technique is to use Malliavin calculus to derive expressions for the derivatives of the logarithmic Green's function and to obtain an estimate of the inverse Malliavin matrix. The relative entropy estimate then naturally yields sharp error bounds in total variation and Wasserstein distances. Compared with the usual weak error estimate for SDEs, such an estimate gives an error bound uniformly over a family of test functions rather than for a single test function.
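The passage from a relative entropy bound to a total variation bound mentioned in the abstract typically proceeds through Pinsker's inequality (the paper's precise route may differ):

\[
\|\mu - \nu\|_{\mathrm{TV}} \;\le\; \sqrt{\tfrac{1}{2}\, D_{\mathrm{KL}}(\mu \,\|\, \nu)},
\]

so that a relative entropy estimate of order $O(h^2)$ for step size $h$ immediately gives a total variation bound of order $O(h)$, simultaneously controlling $|\int f\, d\mu - \int f\, d\nu|$ for all test functions $f$ with $\|f\|_\infty \le 1$.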
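For context, the Euler-Maruyama scheme analyzed in the paper can be sketched as below. The concrete drift and diffusion (geometric Brownian motion, so the noise is multiplicative, sigma(x) = s*x), the step count, and the path count are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def euler_maruyama(b, sigma, x0, T, n_steps, n_paths, rng):
    """Euler-Maruyama discretization of dX_t = b(X_t) dt + sigma(X_t) dW_t."""
    dt = T / n_steps
    x = np.full(n_paths, x0, dtype=float)
    for _ in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt), size=n_paths)  # Brownian increments
        x = x + b(x) * dt + sigma(x) * dw                # explicit Euler update
    return x

# Geometric Brownian motion: drift b(x) = mu*x, multiplicative noise sigma(x) = s*x
mu, s, x0, T = 0.05, 0.2, 1.0, 1.0
rng = np.random.default_rng(0)
xT = euler_maruyama(lambda x: mu * x, lambda x: s * x, x0, T, 200, 100_000, rng)
# Sanity check: E[X_T] = x0 * exp(mu * T) for GBM, so the sample mean
# should be close to exp(0.05) ~ 1.051.
print(xT.mean())
```

The paper's result concerns the law of `xT` (its density) versus the law of the true solution, rather than the moment error checked above.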

Related articles:
arXiv:1608.07096 [math.NA] (Published 2016-08-25)
A New Class of Exponential Integrators for Stochastic Differential Equations With Multiplicative Noise
arXiv:2008.03148 [math.NA] (Published 2020-08-06)
A note on the asymptotic stability of the Semi-Discrete method for Stochastic Differential Equations
arXiv:2004.05687 [math.NA] (Published 2020-04-12)
Sampling of Stochastic Differential Equations using the Karhunen-Loève Expansion and Matrix Functions