Towards efficient communications in federated learning: A contemporary survey
FOS: Computer and information sciences
Computer Science - Distributed, Parallel, and Cluster Computing
0202 electrical engineering, electronic engineering, information engineering
Distributed, Parallel, and Cluster Computing (cs.DC)
02 engineering and technology
DOI:
10.1016/j.jfranklin.2022.12.053
Publication Date:
2023-01-07T01:08:57Z
AUTHORS (7)
ABSTRACT
In traditional distributed machine learning, users' private data are transmitted between clients and a central server, creating significant privacy risks. To balance data privacy with joint model training, federated learning (FL) was proposed as a distributed machine learning procedure with privacy-protection mechanisms, enabling multi-party collaborative computation without exposing the original data. In practice, however, FL faces a variety of challenging communication problems. This review seeks to elucidate the relationships among these communication issues by methodically assessing the development of FL communication research from three perspectives: communication efficiency, communication environment, and communication resource allocation. First, we survey the current challenges in FL communications. Second, we collate FL communication-related papers and describe the overall development trend of the field based on their logical relationships. Finally, we discuss future research directions for communications in FL.
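The collaborative training the abstract describes is commonly realized by federated averaging (FedAvg): each client trains on its own data and shares only model parameters, which the server averages weighted by local dataset size. The following is a minimal illustrative sketch of that aggregation step, not code from the survey; the toy one-parameter model and all function names are assumptions for demonstration.

```python
# Illustrative FedAvg round for a 1-D least-squares model y = w * x.
# Raw client data never leaves local_update; only weights are shared.

def local_update(w, data, lr=0.1):
    """One local gradient-descent step on a client's private data."""
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fedavg(global_w, client_datasets):
    """Server averages client models, weighted by local dataset size."""
    updates = [(local_update(global_w, d), len(d)) for d in client_datasets]
    total = sum(n for _, n in updates)
    return sum(w * n for w, n in updates) / total

# Two clients whose data both follow y = 2x; only weights cross the network.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(3.0, 6.0)]]
w = 0.0
for _ in range(50):
    w = fedavg(w, clients)
print(round(w, 3))  # converges toward 2.0
```

Note that every communication round here transmits one float per client; the survey's three perspectives (efficiency, environment, resource allocation) all concern reducing or scheduling exactly this per-round exchange as models grow large.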
REFERENCES (166)
CITATIONS (41)