arXiv:2409.04991 [math.NA]
Error estimates of Euler's method for stochastic differential equations with multiplicative noise via relative entropy
Lei Li, Mengchao Wang, Yuliang Wang
Published 2024-09-08 (Version 1)
We investigate sharp error estimates for the density, measured in relative entropy (Kullback-Leibler divergence), for the traditional Euler-Maruyama discretization of stochastic differential equations (SDEs) with multiplicative noise. The foundation of the proof is a set of estimates of the derivatives of the logarithm of the numerical density. The key technique is to use Malliavin calculus to derive expressions for the derivatives of the logarithmic Green's function and to estimate the inverse Malliavin matrix. The relative entropy estimate then naturally yields sharp error bounds in the total variation distance and in Wasserstein distances. Compared with the usual weak error estimate for SDEs, such an estimate gives an error bound that holds uniformly over a family of test functions rather than for a single test function.
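For concreteness, here is a brief sketch of the objects the abstract refers to; the notation $b$, $\sigma$, $h$, $W$, $\mu_k$, $\nu_{t_k}$ is illustrative and not necessarily the paper's own. For an SDE with multiplicative noise,
\[
dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t ,
\]
the Euler-Maruyama scheme with step size $h$ and grid $t_k = kh$ reads
\[
X_{k+1} = X_k + b(X_k)\,h + \sigma(X_k)\,\bigl(W_{t_{k+1}} - W_{t_k}\bigr),
\]
and the relative entropy between the law $\mu_k$ of the numerical solution $X_k$ and the law $\nu_{t_k}$ of the exact solution $X_{t_k}$ is
\[
H(\mu_k \,\|\, \nu_{t_k}) = \int \log \frac{d\mu_k}{d\nu_{t_k}}\, d\mu_k .
\]
By Pinsker's inequality, $\|\mu_k - \nu_{t_k}\|_{TV} \le \sqrt{H(\mu_k \,\|\, \nu_{t_k})/2}$, so a bound on the relative entropy immediately controls $|\mathbb{E} f(X_k) - \mathbb{E} f(X_{t_k})|$ for all test functions $f$ with $|f| \le 1$ simultaneously, which is the sense in which such an estimate is stronger than a weak error bound for a single test function.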