arXiv:0810.1386 [math.OC]

Discrete Mechanics and Optimal Control: an Analysis

S. Ober-Bloebaum, O. Junge, J. E. Marsden

Published 2008-10-08, Version 1

The optimal control of a mechanical system is of crucial importance in many application areas. Typical examples are the determination of a time-minimal path in vehicle dynamics, a minimal-energy trajectory in space mission design, or optimal motion sequences in robotics and biomechanics. In most cases, some sort of discretization of the original, infinite-dimensional optimization problem has to be performed in order to make the problem amenable to computation. The approach proposed in this paper is to directly discretize the variational description of the system's motion. The resulting optimization algorithm lets the discrete solution directly inherit characteristic structural properties from the continuous one, such as symmetries and integrals of the motion. We show that the DMOC (Discrete Mechanics and Optimal Control) approach is equivalent to a finite difference discretization of Hamilton's equations by a symplectic partitioned Runge-Kutta scheme, and we employ this fact to give a proof of convergence. The numerical performance of DMOC and its relationship to other existing optimal control methods are investigated.
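The core idea of discretizing the variational description (rather than the equations of motion) can be illustrated with a minimal sketch, not taken from the paper: a midpoint discrete Lagrangian for a harmonic oscillator, stepped forward via the discrete Euler-Lagrange equations D2 L_d(q_{k-1}, q_k) + D1 L_d(q_k, q_{k+1}) = 0. The function names, the choice of quadrature, and the model problem are illustrative assumptions; the resulting recursion coincides with a symplectic midpoint-type scheme, so the trajectory stays bounded over long times, reflecting the structure preservation the abstract refers to.

```python
def variational_step(q_prev, q_curr, h, m=1.0, k=1.0):
    """One discrete Euler-Lagrange step for a harmonic oscillator.

    Illustrative midpoint discrete Lagrangian (an assumption, not the
    paper's general construction):
        L_d(q0, q1) = h * [ m/2 * ((q1-q0)/h)**2 - k/2 * ((q0+q1)/2)**2 ]
    The DEL equation D2 L_d(q_{k-1}, q_k) + D1 L_d(q_k, q_{k+1}) = 0
    is linear in q_{k+1} here, so it solves in closed form.
    """
    a = m / h + h * k / 4.0          # coefficient of q_{k+1}
    b = 2.0 * m / h - h * k / 2.0    # coefficient of q_k
    return (b * q_curr - a * q_prev) / a

def simulate(q0, v0, h, steps, m=1.0, k=1.0):
    # Seed the two-point recursion with a first-order guess q1 = q0 + h*v0.
    qs = [q0, q0 + h * v0]
    for _ in range(steps - 1):
        qs.append(variational_step(qs[-2], qs[-1], h, m, k))
    return qs

qs = simulate(q0=1.0, v0=0.0, h=0.01, steps=10000)
# Structure preservation in action: the amplitude stays near 1 over
# many periods, instead of drifting as a naive Euler scheme would.
print(max(abs(q) for q in qs))
```

In the DMOC setting, discrete trajectories of this kind become the decision variables of a finite-dimensional optimization problem, with the discrete Euler-Lagrange equations imposed as equality constraints.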

Related articles:
arXiv:1203.0580 [math.OC] (Published 2012-03-02)
Discrete Variational Optimal Control
arXiv:1107.3944 [math.OC] (Published 2011-07-20, updated 2011-11-23)
Optimal control with stochastic PDE constraints and uncertain controls
arXiv:1602.08618 [math.OC] (Published 2016-02-27)
Riccati equations and optimal control of well-posed linear systems