arXiv:2112.09639 [math.PR]

Necessary and Sufficient Conditions for Optimal Control of Semilinear Stochastic Partial Differential Equations

Wilhelm Stannat, Lukas Wessels

Published 2021-12-17, updated 2022-01-21 (version 2)

Using a recently introduced representation of the second order adjoint state as the solution of a function-valued backward stochastic partial differential equation (SPDE), we calculate the viscosity super- and subdifferential of the value function evaluated along an optimal trajectory for controlled semilinear SPDEs. This establishes the well-known connection between Pontryagin's maximum principle and dynamic programming within the framework of viscosity solutions. As a corollary, we derive that the correction term in the stochastic Hamiltonian arising in non-smooth stochastic control problems is non-positive. These results directly lead us to a stochastic verification theorem for fully nonlinear Hamilton-Jacobi-Bellman equations in the framework of viscosity solutions.
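For orientation, the connection described above takes the following shape in the classical finite-dimensional theory; this is only a sketch of the typical statement, and the signs, spaces, and notation are assumptions rather than the paper's own conventions:

\[
  \{\,p(t)\,\}\times[\,P(t),\infty)\ \subseteq\ D^{2,+}_{x}V\bigl(t,\bar X(t)\bigr)
  \qquad \text{for a.e. } t\in[0,T],\ \mathbb{P}\text{-a.s.},
\]

where \((\bar X,\bar u)\) is an optimal pair, \(p\) and \(P\) denote the first and second order adjoint states solving the associated backward equations, \(V\) is the value function, and \(D^{2,+}_{x}V\) is its second order viscosity superdifferential in the state variable. The paper's contribution is to establish a relation of this type when the state equation is a semilinear SPDE and the second order adjoint is a function-valued backward SPDE.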

Comments: 29 pages; added a necessary assumption to Theorem 4.1 and improved the presentation; added Remark 4.3 regarding more general differential operators
Categories: math.PR, math.OC
Related articles:
arXiv:2311.03241 [math.PR] (Published 2023-11-06)
On optimal control of reflected diffusions
arXiv:2105.05194 [math.PR] (Published 2021-05-11, updated 2021-08-10)
Peng's Maximum Principle for Stochastic Partial Differential Equations
arXiv:math/0512118 [math.PR] (Published 2005-12-06, updated 2006-11-23)
Optimal control of a large dam