Stragglers Are Not Disaster: A Hybrid Federated Learning Algorithm with Delayed Gradients
FOS: Computer and information sciences
Computer Science - Machine Learning
Computer Science - Distributed, Parallel, and Cluster Computing
0202 electrical engineering, electronic engineering, information engineering
02 engineering and technology
Distributed, Parallel, and Cluster Computing (cs.DC)
Machine Learning (cs.LG)
DOI:
10.48550/arXiv.2102.06329
Publication Date:
2021-01-01
AUTHORS (4)
ABSTRACT
Federated learning (FL) is a new machine learning framework which trains a joint model across a large number of decentralized computing devices. Existing methods, e.g., Federated Averaging (FedAvg), are able to provide an optimization guarantee by synchronously training the joint model, but usually suffer from stragglers, i.e., IoT devices with low computing power or communication bandwidth, especially on heterogeneous optimization problems. To mitigate the influence of stragglers, this paper presents a novel FL algorithm, namely Hybrid Federated Learning (HFL), to achieve a balance between efficiency and effectiveness. It consists of two major components: a synchronous kernel and an asynchronous updater. Unlike traditional synchronous FL methods, our HFL introduces the asynchronous updater, which actively pulls unsynchronized and delayed local weights from stragglers. An adaptive approximation method, Adaptive Delayed-SGD (AD-SGD), is proposed to merge the delayed local updates into the joint model. The theoretical analysis shows that the convergence rate of the proposed algorithm is $\mathcal{O}(\frac{1}{t+\tau})$ for both convex and non-convex optimization problems.
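To make the two components concrete, below is a minimal NumPy sketch of the hybrid idea, not the authors' implementation: the function names fedavg_round and merge_delayed, the base_mix parameter, and the 1/(staleness+1) decay are illustrative assumptions. The paper's AD-SGD merges delayed updates through an adaptive gradient approximation rather than the simple staleness-weighted mixing shown here.

```python
import numpy as np

def fedavg_round(client_ws, client_sizes):
    """Synchronous kernel: FedAvg-style weighted average of the
    weights returned by the fast (synchronized) clients."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_ws, client_sizes))

def merge_delayed(global_w, delayed_w, staleness, base_mix=0.5):
    """Asynchronous updater (illustrative stand-in for AD-SGD):
    fold a straggler's delayed weights into the joint model with a
    mixing coefficient that decays as the update gets staler."""
    alpha = base_mix / (staleness + 1.0)
    return (1.0 - alpha) * global_w + alpha * delayed_w

# Toy usage: one synchronous round, then a delayed merge from a straggler.
rng = np.random.default_rng(0)
global_w = rng.normal(size=4)
fast_clients = [global_w + 0.1 * rng.normal(size=4) for _ in range(3)]
global_w = fedavg_round(fast_clients, client_sizes=[100, 50, 150])

stale_w = global_w + 0.3 * rng.normal(size=4)  # straggler trained on an old model
global_w = merge_delayed(global_w, stale_w, staleness=5)
print(global_w)
```

The point of the staleness-dependent coefficient is that a straggler's contribution is never discarded (as in purely synchronous FL) but is damped in proportion to how outdated it is, which is what allows the stated $\mathcal{O}(\frac{1}{t+\tau})$ rate to hold despite delayed gradients.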