{ "id": "1904.10642", "version": "v1", "published": "2019-04-24T05:07:49.000Z", "updated": "2019-04-24T05:07:49.000Z", "title": "Towards Combining On-Off-Policy Methods for Real-World Applications", "authors": [ "Kai-Chun Hu", "Chen-Huan Pi", "Ting Han Wei", "I-Chen Wu", "Stone Cheng", "Yi-Wei Dai", "Wei-Yuan Ye" ], "categories": [ "cs.LG", "stat.ML" ], "abstract": "In this paper, we point out a fundamental property of the objective in reinforcement learning, with which we can reformulate the policy gradient objective into a perceptron-like loss function, removing the need to distinguish between on and off policy training. Namely, we posit that it is sufficient to only update a policy $\\pi$ for cases that satisfy the condition $A(\\frac{\\pi}{\\mu}-1)\\leq0$, where $A$ is the advantage, and $\\mu$ is another policy. Furthermore, we show via theoretic derivation that a perceptron-like loss function matches the clipped surrogate objective for PPO. With our new formulation, the policies $\\pi$ and $\\mu$ can be arbitrarily apart in theory, effectively enabling off-policy training. To examine our derivations, we can combine the on-policy PPO clipped surrogate (which we show to be equivalent with one instance of the new reformation) with the off-policy IMPALA method. We first verify the combined method on the OpenAI Gym pendulum toy problem. Next, we use our method to train a quadrotor position controller in a simulator. Our trained policy is efficient and lightweight enough to perform in a low cost micro-controller at a minimum update rate of 500 Hz. For the quadrotor, we show two experiments to verify our method and demonstrate performance: 1) hovering at a fixed position, and 2) tracking along a specific trajectory. In preliminary trials, we are also able to apply the method to a real-world quadrotor.", "revisions": [ { "version": "v1", "updated": "2019-04-24T05:07:49.000Z" } ], "analyses": { "keywords": [ "combining on-off-policy methods", "real-world applications", "openai gym pendulum toy problem", "clipped surrogate", "minimum update rate" ], "note": { "typesetting": "TeX", "pages": 0, "language": "en", "license": "arXiv", "status": "editable" } } }