arXiv Analytics

arXiv:2410.08934 [stat.ML]

The Effect of Personalization in FedProx: A Fine-grained Analysis on Statistical Accuracy and Communication Efficiency

Xin Yu, Zelin He, Ying Sun, Lingzhou Xue, Runze Li

Published 2024-10-11 (Version 1)

FedProx is a simple yet effective federated learning method that enables model personalization via regularization. Despite remarkable success in practice, a rigorous analysis of how such regularization provably improves the statistical accuracy of each client's local model has not been fully established. Setting the regularization strength heuristically presents a risk, as an inappropriate choice may even degrade accuracy. This work fills this gap by analyzing the effect of regularization on statistical accuracy, thereby providing a theoretical guideline for setting the regularization strength to achieve personalization. We prove that by adaptively choosing the regularization strength under different levels of statistical heterogeneity, FedProx can consistently outperform pure local training and achieve a nearly minimax-optimal statistical rate. In addition, to shed light on resource allocation, we design an algorithm and provably show that stronger personalization reduces communication complexity without increasing the computation overhead. Finally, our theory is validated on both synthetic and real-world datasets, and its generalizability is verified in a non-convex setting.
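To make the role of the regularization strength concrete, the following is a minimal sketch (not the authors' experimental setup) of FedProx's proximal local objective on a toy quadratic client loss. Each client k minimizes F_k(w) + (mu/2)·||w − w_global||², where mu controls personalization: mu = 0 recovers pure local training, while a large mu pins the local model to the global one. The function name, step sizes, and toy loss are illustrative assumptions.

```python
import numpy as np

def fedprox_local_update(grad_fk, w_global, mu, steps=200):
    """Gradient descent on the FedProx proximal objective for one client:
    minimize F_k(w) + (mu / 2) * ||w - w_global||^2.
    The step size is scaled down as mu grows so the iteration stays stable.
    """
    lr = 0.5 / (1.0 + mu)  # safe step size for the mu-strongly-convex proximal term
    w = w_global.copy()
    for _ in range(steps):
        g = grad_fk(w) + mu * (w - w_global)  # local-loss gradient + proximal pull
        w -= lr * g
    return w

# Toy client loss F_k(w) = 0.5 * ||w - theta_k||^2 with local optimum theta_k.
theta_k = np.array([2.0, -1.0])
grad_fk = lambda w: w - theta_k
w_global = np.zeros(2)

w_personal = fedprox_local_update(grad_fk, w_global, mu=0.0)    # -> close to theta_k
w_shared = fedprox_local_update(grad_fk, w_global, mu=100.0)    # -> close to w_global
```

With mu = 0 the update converges to the purely local optimum theta_k; with mu = 100 the minimizer is theta_k / (1 + mu), i.e. pulled almost all the way to the global model. The paper's point is that the right mu between these extremes, chosen adaptively to the statistical heterogeneity, beats both endpoints.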

Related articles:
arXiv:2411.12068 [stat.ML] (Published 2024-11-18)
The Statistical Accuracy of Neural Posterior and Likelihood Estimation
arXiv:2112.09746 [stat.ML] (Published 2021-12-17, updated 2022-02-09)
Supervised Multivariate Learning with Simultaneous Feature Auto-grouping and Dimension Reduction
arXiv:1805.10559 [stat.ML] (Published 2018-05-27)
cpSGD: Communication-efficient and differentially-private distributed SGD