arXiv Analytics

arXiv:2106.07703 [math.OC]

Distributed Optimization with Global Constraints Using Noisy Measurements

Van Sy Mai, Richard J. La, Tao Zhang, Abdella Battou

Published 2021-06-14 (Version 1)

We propose a new distributed optimization algorithm for solving a class of constrained optimization problems in which (a) the objective function is separable (i.e., the sum of local objective functions of the agents), (b) the optimization variables of the distributed agents, which are subject to nontrivial local constraints, are coupled by global constraints, and (c) only noisy observations are available for estimating (the gradients of) the local objective functions. In many practical scenarios, agents may not be willing to share their optimization variables with others. For this reason, we propose a distributed algorithm that does not require the agents to share their optimization variables with each other; instead, each agent maintains a local estimate of the global constraint functions and shares this estimate only with its neighbors. These local estimates of the constraint functions are updated using a consensus-type algorithm, while the local optimization variables of each agent are updated using a first-order method based on noisy gradient estimates. We prove that, when the agents adopt the proposed algorithm, their optimization variables converge with probability 1 to an optimal point of an approximate problem obtained via the penalty method.
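The abstract describes the algorithm's structure only at a high level. The following is a minimal, hypothetical sketch of that structure, not the authors' method: each agent holds a scalar local variable with a simple local (box) constraint, the agents are coupled through a single global inequality constraint that is handled with a penalty term, only noisy gradients of the local objectives are observed, and each agent tracks the global constraint value through a consensus-type exchange with its neighbors. All problem data (f_i, g_i), the network weights W, the step-size rule, the penalty weight rho, and the noise level are illustrative assumptions and need not match the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    n = 4                                    # number of agents
    W = np.full((n, n), 1.0 / n)             # doubly stochastic consensus weights (complete graph, assumed)

    # Hypothetical local data: f_i(x_i) = 0.5*(x_i - a_i)^2, g_i(x_i) = x_i - b_i,
    # local constraint x_i in [-5, 5], global constraint sum_i g_i(x_i) <= 0.
    a = rng.normal(size=n)
    b = rng.normal(size=n) + 1.0

    rho = 10.0                               # penalty weight for the global constraint (assumed)
    noise_std = 0.1                          # std. dev. of the gradient noise

    x = np.zeros(n)                          # local optimization variables
    s = x - b                                # local estimates of (1/n) * sum_i g_i(x_i)

    for k in range(1, 5001):
        alpha = 1.0 / k                      # diminishing step size
        g_old = x - b                        # current local constraint contributions

        # Local first-order update using a noisy gradient of the penalized objective.
        grad_f = (x - a) + noise_std * rng.normal(size=n)
        grad_pen = rho * (n * s > 0).astype(float)      # subgradient of rho*max(0, sum_i g_i) w.r.t. x_i,
                                                        # evaluated with the agent's own estimate s_i
        x = np.clip(x - alpha * (grad_f + grad_pen), -5.0, 5.0)   # enforce the local constraint

        # Consensus-type tracking of the global constraint value using only neighbor information.
        s = W @ s + (x - b) - g_old

    print("x =", np.round(x, 3), "  estimated sum_i g_i(x_i) =", round(float(n * s.mean()), 3))

Because s is initialized to the agents' own constraint values and updated with a dynamic-average-tracking step, n*s_i tracks the global constraint value without the agents ever exchanging their optimization variables; the paper's actual update rule and convergence analysis may of course differ from this sketch.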

Related articles:
arXiv:1803.07143 [math.OC] (Published 2018-03-19)
Communication reduction in distributed optimization via estimation of the proximal operator
arXiv:2102.12989 [math.OC] (Published 2021-02-25)
Distributed Optimization with Coupling Constraints
arXiv:1609.03961 [math.OC] (Published 2016-09-13)
Fast Algorithms for Distributed Optimization and Hypothesis Testing: A Tutorial