Balanced coarse-to-fine federated learning for noisy heterogeneous clients
Subjects: Robust federated learning; Heterogeneous clients; Self-paced learning; Noisy data; Electronic computers. Computer science (QA75.5-76.95); Information technology (T58.5-58.64)
DOI: 10.1007/s40747-024-01694-8
Publication Date: 2025-01-07T08:24:20Z
AUTHORS (6)
ABSTRACT
In heterogeneous federated learning, individual clients cannot guarantee the reliability of their data because of uncertainty in data collection, and different types of noise are inevitably introduced across heterogeneous clients. Existing methods rely on specific assumptions about the distribution of noisy data to select clean samples or eliminate noisy ones. However, heterogeneous clients use different deep neural network architectures, and these models have different sensitivities to various noise types, so fixed noise-detection methods may not be effective for every client. To overcome these challenges, we propose a balanced coarse-to-fine federated learning method for noisy heterogeneous clients. By introducing a coarse-to-fine two-stage strategy, each client can adaptively eliminate noisy data. Meanwhile, we propose a balanced progressive learning framework that leverages self-paced learning to sort the training samples from simple to difficult, so that each client model is constructed progressively in an easy-to-hard paradigm. Experimental results show that the proposed method achieves higher accuracy and robustness when processing noisy data from heterogeneous clients, and that it is suitable for both heterogeneous and homogeneous federated learning scenarios. The code is available at https://github.com/drafly/bcffl.
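The simple-to-difficult ordering mentioned in the abstract can be illustrated with a minimal Python/NumPy sketch of the generic self-paced learning rule: admit only samples whose per-sample loss falls below a pace threshold that grows over training rounds. This is not the authors' implementation; the names select_self_paced, losses, and lam are hypothetical, and the details of the paper's balanced, coarse-to-fine procedure are not reproduced here.

import numpy as np

def select_self_paced(losses: np.ndarray, lam: float) -> np.ndarray:
    """Return indices of 'easy' samples whose per-sample loss is below the pace threshold lam."""
    order = np.argsort(losses)          # sort samples from simple (low loss) to difficult
    return order[losses[order] < lam]   # keep only the samples the current pace admits

# Toy usage: as the pace threshold grows each round, harder samples gradually enter training.
rng = np.random.default_rng(0)
losses = rng.exponential(scale=1.0, size=10)   # stand-in for per-sample training losses
for t, lam in enumerate([0.5, 1.0, 2.0], start=1):
    idx = select_self_paced(losses, lam)
    print(f"round {t}: lam={lam:.1f}, {len(idx)} samples selected")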