arXiv Analytics

arXiv:2109.01326 [stat.ML]

Statistical Estimation and Inference via Local SGD in Federated Learning

Xiang Li, Jiadong Liang, Xiangyu Chang, Zhihua Zhang

Published 2021-09-03 (Version 1)

Federated Learning (FL) enables a large number of edge computing devices (e.g., mobile phones) to jointly learn a global model without sharing data. In FL, data are generated in a decentralized manner with high heterogeneity. This paper studies how to perform statistical estimation and inference in the federated setting. We analyze the so-called Local SGD, a multi-round estimation procedure that uses intermittent communication to improve communication efficiency. We first establish a {\it functional central limit theorem} showing that the averaged iterates of Local SGD converge weakly to a rescaled Brownian motion. We then provide two iterative inference methods: the {\it plug-in} and the {\it random scaling}. Random scaling constructs an asymptotically pivotal statistic for inference by using the information along the whole Local SGD path. Both methods are communication-efficient and applicable to online data. Our theoretical and empirical results show that Local SGD simultaneously achieves both statistical efficiency and communication efficiency.
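The procedure described above can be pictured concretely: each of K clients runs E local SGD steps between communications, the server averages the local iterates, and the estimator is the running (Polyak) average of the global iterates along the path. The snippet below is a minimal illustrative sketch on a synthetic least-squares problem; the problem setup, hyperparameters, and the random-scaling-style covariance proxy at the end are assumptions chosen for illustration, not the paper's exact construction.

```python
import numpy as np

# Illustrative sketch of Local SGD with intermittent communication.
# All settings here (least-squares loss, K, E, step size) are assumptions
# for demonstration, not the paper's experimental configuration.
rng = np.random.default_rng(0)
d, K, E, rounds, lr = 5, 4, 10, 200, 0.05   # dim, clients, local steps, rounds, step size
theta_star = rng.normal(size=d)             # ground-truth parameter

def local_grad(theta):
    # One-sample stochastic gradient of the squared loss 0.5 * (x @ theta - y)**2.
    x = rng.normal(size=d)
    y = x @ theta_star + rng.normal()
    return (x @ theta - y) * x

theta = np.zeros(d)
path = []                                   # global iterate after each communication round
for _ in range(rounds):
    local_models = []
    for _ in range(K):
        th = theta.copy()
        for _ in range(E):                  # E local SGD steps without communication
            th -= lr * local_grad(th)
        local_models.append(th)
    theta = np.mean(local_models, axis=0)   # intermittent communication: average clients
    path.append(theta.copy())

path = np.asarray(path)                     # shape (rounds, d)
running = np.cumsum(path, axis=0) / np.arange(1, rounds + 1)[:, None]
theta_bar = running[-1]                     # averaged iterate: the point estimator

# Random-scaling-style covariance proxy built from the partial averages along
# the whole path (a sketch of the idea; exact scaling and critical values
# follow the paper's construction, not this simplified form).
diffs = running - theta_bar
weights = np.arange(1, rounds + 1) ** 2 / rounds**2
V = (diffs * weights[:, None]).T @ diffs

t_stats = np.sqrt(rounds) * (theta_bar - theta_star) / np.sqrt(np.diag(V))
print("estimation error:", np.linalg.norm(theta_bar - theta_star))
print("random-scaling-style t statistics:", t_stats)
```

Note that under random scaling the studentized statistic is asymptotically pivotal but follows a nonstandard distribution, so Gaussian critical values do not apply; the sketch above only illustrates the shape of the construction, with statistics computed against the known ground truth for checking purposes.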

Related articles:
arXiv:2211.14115 [stat.ML] (Published 2022-11-25)
Inverse Solvability and Security with Applications to Federated Learning
arXiv:2107.03770 [stat.ML] (Published 2021-07-08)
Federated Learning as a Mean-Field Game
arXiv:2107.10663 [stat.ML] (Published 2021-07-21)
Fed-ensemble: Improving Generalization through Model Ensembling in Federated Learning