arXiv:2106.10796 [cs.LG]
CD-SGD: Distributed Stochastic Gradient Descent with Compression and Delay Compensation
Enda Yu, Dezun Dong, Yemao Xu, Shuo Ouyang, Xiangke Liao
Published 2021-06-21Version 1
Communication overhead is the key challenge in distributed training. Gradient compression is a widely used approach to reducing communication traffic. When combined with a parallel communication mechanism such as pipelining, gradient compression can greatly alleviate the impact of communication overhead. However, two problems with gradient compression remain to be solved. First, gradient compression introduces extra computation cost, which delays the next training iteration. Second, gradient compression usually decreases convergence accuracy.
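To illustrate the kind of gradient compression the abstract refers to, the sketch below shows generic top-k sparsification with a locally accumulated residual (error feedback), a common way to limit the accuracy loss mentioned above. This is a minimal, self-contained example under assumed settings (the function name, the 25% keep ratio, and the toy gradient shape are illustrative), not the CD-SGD algorithm from the paper.

```python
import numpy as np

def topk_compress(grad, ratio=0.01):
    """Keep only the largest-magnitude `ratio` fraction of gradient entries.

    Returns the sparse approximation and the residual that was dropped,
    which the worker folds into the next iteration's gradient (error
    feedback) so the discarded mass is not lost permanently.
    """
    flat = grad.ravel()
    k = max(1, int(flat.size * ratio))
    idx = np.argpartition(np.abs(flat), -k)[-k:]   # indices of top-k entries
    compressed = np.zeros_like(flat)
    compressed[idx] = flat[idx]
    residual = flat - compressed                   # dropped entries, kept locally
    return compressed.reshape(grad.shape), residual.reshape(grad.shape)

# Toy usage: add the residual to the next gradient before compressing again.
rng = np.random.default_rng(0)
residual = np.zeros((4, 4))
for step in range(3):
    grad = rng.normal(size=(4, 4)) + residual      # error feedback
    sparse_grad, residual = topk_compress(grad, ratio=0.25)
    # sparse_grad would be exchanged (e.g., all-reduced) across workers here
```

Note that the top-k selection itself is the extra computation cost the abstract mentions: it must finish before the compressed gradient can be sent, which is why overlapping it with communication (and compensating for any resulting staleness) matters.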
Comments: 12 pages
Related articles:
arXiv:1512.02970 [cs.LG] (Published 2015-12-09)
Scaling Up Distributed Stochastic Gradient Descent Using Variance Reduction
arXiv:1512.01708 [cs.LG] (Published 2015-12-05)
Variance Reduction for Distributed Stochastic Gradient Descent
arXiv:2212.02049 [cs.LG] (Published 2022-12-05)
Distributed Stochastic Gradient Descent with Cost-Sensitive and Strategic Agents