An Efficient Federated Learning Framework for Training Semantic Communication System
Federated Learning
DOI:
10.48550/arXiv.2310.13236
Publication Date:
2023-10-20
AUTHORS (7)
ABSTRACT
Semantic communication has emerged as a pillar for the next generation of communication systems due to its capabilities in alleviating data redundancy. Most semantic communication systems are built upon advanced deep learning models whose training performance heavily relies on data availability. Existing studies often make unrealistic assumptions of a readily accessible data source, whereas in practice, data is mainly created on the client side. Due to privacy and security concerns, the transmission of data is restricted, which is necessary for conventional centralized training schemes. To address this challenge, we explore semantic communication in a federated learning (FL) setting that utilizes the data available at clients without leaking privacy. Additionally, we design our system to tackle the communication overhead by reducing the quantity of information delivered in each global round. In this way, we can save significant bandwidth for resource-limited devices and reduce overall network traffic. Finally, we introduce a mechanism to aggregate the global model from the clients, called FedLol. Extensive simulation results demonstrate the effectiveness of the proposed technique compared to baseline methods.
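The abstract names the aggregation mechanism FedLol but does not spell out its rule, so below is a minimal sketch of one plausible loss-based aggregation in that spirit: each client's model is weighted by how little local loss it reports, so better-fitting clients contribute more to the global model. The function name aggregate_fedlol, the normalization, and all variable names are illustrative assumptions, not the paper's definition.

import numpy as np

# Hypothetical loss-weighted federated aggregation in the spirit of
# FedLol (assumed form; the paper defines the actual rule).
def aggregate_fedlol(client_params, client_losses):
    """client_params: list of dicts mapping layer name -> np.ndarray.
    client_losses: one local loss per client (lower is better)."""
    losses = np.asarray(client_losses, dtype=np.float64)
    total = losses.sum()
    n = len(losses)
    # Weight each client by the share of total loss it did NOT incur;
    # the weights sum to 1 by construction (requires n >= 2).
    weights = (total - losses) / ((n - 1) * total)
    return {
        name: sum(w * p[name] for w, p in zip(weights, client_params))
        for name in client_params[0]
    }

# Toy usage: three clients, one 2x2 weight matrix each.
clients = [{"w": np.full((2, 2), v)} for v in (1.0, 2.0, 3.0)]
print(aggregate_fedlol(clients, [0.2, 0.5, 0.8])["w"])  # ~1.8 everywhere

In a full FL loop this step would replace FedAvg's sample-count weighting; the overhead reduction the abstract describes (delivering less information per global round) would act on what each client transmits before aggregation, not on this step.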