arXiv:1910.01098 [math.OC]
Optimal Impulse Control of Dynamical Systems with Functional Constraints
Published 2019-10-02 (Version 1)
This paper considers a constrained optimal impulse control problem for dynamical systems generated by a flow. Under quite general and natural conditions, we prove the existence of an optimal stationary policy. This is done using tools from the theory of Markov decision processes. Two linear programming approaches are established and justified. In the absence of constraints, we show that these two linear programming approaches are dual to the dynamic programming method, with the optimality equations in integral and differential form, respectively.
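To illustrate the linear programming viewpoint mentioned in the abstract, the following is a minimal sketch of the classical occupation-measure LP for a small discounted Markov decision process, solved with scipy.optimize.linprog. This is not the paper's constrained impulse-control formulation; the transition kernel P, costs c, discount factor gamma, and initial distribution mu0 below are hypothetical numbers chosen only to show how a stationary policy is recovered from an LP solution.

```python
# A minimal sketch (assumed generic MDP example, not the paper's model):
# the occupation-measure linear program for a discounted MDP.
import numpy as np
from scipy.optimize import linprog

n_states, n_actions = 2, 2
gamma = 0.9

# P[a][s, s'] = probability of moving s -> s' under action a (hypothetical).
P = np.array([
    [[0.8, 0.2], [0.3, 0.7]],   # action 0
    [[0.5, 0.5], [0.9, 0.1]],   # action 1
])
# c[s, a] = one-step cost (hypothetical).
c = np.array([[1.0, 2.0],
              [0.5, 3.0]])
mu0 = np.array([0.5, 0.5])      # initial state distribution (hypothetical)

# Decision variable x[s, a] >= 0: discounted occupation measure.
# Flow-balance constraint for every state s':
#   sum_a x[s', a] - gamma * sum_{s, a} P[a][s, s'] * x[s, a] = mu0[s'].
A_eq = np.zeros((n_states, n_states * n_actions))
for sp in range(n_states):
    for s in range(n_states):
        for a in range(n_actions):
            col = s * n_actions + a
            A_eq[sp, col] = (1.0 if s == sp else 0.0) - gamma * P[a][s, sp]
b_eq = mu0

# Minimize the expected discounted cost subject to the flow-balance equalities.
res = linprog(c.ravel(), A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * (n_states * n_actions),
              method="highs")
x = res.x.reshape(n_states, n_actions)

# A stationary policy is recovered by normalizing x over actions in each state.
policy = x / x.sum(axis=1, keepdims=True)
print("optimal stationary policy:\n", policy)
```

Functional constraints would enter such a formulation as additional linear inequalities on the occupation measure; the duality with dynamic programming referred to in the abstract corresponds to the LP dual of this program, whose variables play the role of a value function.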
Categories: math.OC
Related articles:
arXiv:2007.01602 [math.OC] (Published 2020-07-03)
On the existence of optimal stationary policies for average Markov decision processes with countable states
arXiv:1901.02825 [math.OC] (Published 2019-01-09)
Stochastic stabilization of dynamical systems over communication channels
arXiv:2010.02282 [math.OC] (Published 2020-10-05)
First-order methods for problems with O(1) functional constraints can have almost the same convergence rate as for unconstrained problems