arXiv Analytics

arXiv:2202.10994 [math.OC]

An accelerated proximal gradient method for multiobjective optimization

Hiroki Tanabe, Ellen H. Fukuda, Nobuo Yamashita

Published 2022-02-22 (Version 1)

Many descent methods for multiobjective optimization problems have been developed in recent years. In 2000, the steepest descent method was proposed for differentiable multiobjective optimization problems, and the proximal gradient method, which can also handle composite problems, was considered afterward. However, accelerated versions of these methods have not been sufficiently studied. In this paper, we propose a multiobjective accelerated proximal gradient algorithm, in which we solve subproblems containing terms that appear only in the multiobjective case. We also establish the proposed method's global convergence rate of $O(1/k^2)$ under reasonable assumptions, using a merit function to measure the complexity. Moreover, we present an efficient way to solve the subproblem via its dual, and we confirm the validity of the proposed method through preliminary numerical experiments.
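To illustrate the kind of iteration the abstract describes, the following is a minimal sketch of a FISTA-like accelerated step for two objectives in the smooth special case $g_i = 0$. The objectives `f1`, `f2`, the Lipschitz constant `L`, and the closed-form one-dimensional dual solve are illustrative assumptions, not the paper's exact algorithm; the subproblem form (linearized objectives shifted by $f_i(y_k) - F_i(x_k)$, plus a proximal quadratic) follows the multiobjective structure mentioned in the abstract, and solving it through its dual mirrors the dual approach the abstract alludes to.

```python
import numpy as np

# Hedged sketch: accelerated (FISTA-like) proximal gradient for a
# two-objective problem, smooth special case g_i = 0.  f1, f2, L, and
# the closed-form dual solve are illustrative choices.

def f1(x): return 0.5 * np.sum((x - 1.0) ** 2)   # minimized at (1, ..., 1)
def f2(x): return 0.5 * np.sum((x + 1.0) ** 2)   # minimized at (-1, ..., -1)
def g1(x): return x - 1.0                         # gradient of f1
def g2(x): return x + 1.0                         # gradient of f2

def subproblem(y, x, L=1.0):
    """min_z max_i [f_i(y) + <grad f_i(y), z - y> - f_i(x)] + (L/2)||z - y||^2,
    solved via its dual: for two objectives the dual variable is a scalar
    lam in [0, 1], and the dual objective is a 1-D concave quadratic."""
    v1, v2 = f1(y) - f1(x), f2(y) - f2(x)
    a, b = g1(y), g2(y)
    h = a - b
    hh = h @ h
    lam = 0.5 if hh < 1e-12 else np.clip((L * (v1 - v2) - b @ h) / hh, 0.0, 1.0)
    # primal recovery: z = y - (1/L) * (lam * grad f1 + (1 - lam) * grad f2)
    return y - (lam * a + (1.0 - lam) * b) / L

x_prev = x = np.array([2.0, 0.0])
t = 1.0
for _ in range(50):
    t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
    y = x + ((t - 1.0) / t_next) * (x - x_prev)   # Nesterov momentum step
    x_prev, x = x, subproblem(y, x)
    t = t_next
# x converges to a Pareto optimal point of (f1, f2)
```

For this pair of quadratics the iterates settle at $(1, 1)$, the minimizer of `f1`, which is Pareto optimal; the dual solve makes each subproblem a single clipped scalar computation rather than a generic quadratic program.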

Related articles:
arXiv:2107.12122 [math.OC] (Published 2021-07-26)
A Steepest Descent Method for Set Optimization Problems with Set-Valued Mappings of Finite Cardinality
arXiv:2309.06929 [math.OC] (Published 2023-09-13)
Barzilai-Borwein Descent Methods for Multiobjective Optimization Problems with Variable Trade-off Metrics
arXiv:2311.08109 [math.OC] (Published 2023-11-14)
Improvement of steepest descent method for multiobjective optimization