Towards communication-efficient vertical federated learning training via cache-enabled local updates

Federated Learning
DOI: 10.14778/3547305.3547316 Publication Date: 2022-09-07
ABSTRACT
Vertical federated learning (VFL) is an emerging paradigm that allows different parties (e.g., organizations or enterprises) to collaboratively build machine learning models with privacy protection. In the training phase, VFL only exchanges intermediate statistics, i.e., forward activations and backward derivatives, across parties to compute model gradients. Nevertheless, due to its geo-distributed nature, VFL training usually suffers from low WAN bandwidth. In this paper, we introduce CELU-VFL, a novel and efficient VFL training framework that exploits the local update technique to reduce cross-party communication rounds. CELU-VFL caches stale statistics and reuses them to estimate model gradients without exchanging ad hoc statistics. Significant techniques are proposed to improve the convergence performance. First, to handle the stochastic variance problem, we propose a uniform sampling strategy to fairly choose the stale statistics for local updates. Second, to harness the errors brought by staleness, we devise an instance weighting mechanism that measures the reliability of the estimated gradients. Theoretical analysis proves that CELU-VFL achieves a similar sub-linear convergence rate as vanilla VFL training but requires much fewer communication rounds. Empirical results on both public and real-world workloads validate that CELU-VFL can be up to six times faster than existing works.
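The abstract outlines the core mechanism: each party caches the partner's stale intermediate statistics, runs several local update steps per communication round by sampling cached entries uniformly, and weights the resulting gradient estimates by their reliability. Below is a minimal, illustrative NumPy sketch of that loop for a toy vertically partitioned logistic regression. All names and hyperparameters are hypothetical, and the exponential staleness decay is a simplified stand-in for the paper's instance weighting mechanism, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy vertically partitioned data: party A holds x_a, party B holds x_b.
n, d_a, d_b = 512, 4, 4
x_a, x_b = rng.normal(size=(n, d_a)), rng.normal(size=(n, d_b))
w_true = rng.normal(size=d_a + d_b)
y = (np.concatenate([x_a, x_b], axis=1) @ w_true > 0).astype(float)

w_a, w_b = np.zeros(d_a), np.zeros(d_b)
cache = []  # cached (round_cached, batch_indices, partner_logits) triples
CAPACITY, LOCAL_STEPS, LR, DECAY = 8, 4, 0.1, 0.8  # hypothetical settings

for round_ in range(50):
    # Communication step: parties exchange fresh forward activations
    # (partial logits) for a sampled mini-batch.
    idx = rng.choice(n, size=64, replace=False)
    z_a, z_b = x_a[idx] @ w_a, x_b[idx] @ w_b
    if len(cache) >= CAPACITY:
        cache.pop(0)                      # evict the oldest cached entry
    cache.append((round_, idx, z_b.copy()))  # party A caches B's statistics

    # Local updates on party A: reuse stale cached statistics instead of
    # exchanging ad hoc ones, saving communication rounds.
    for _ in range(LOCAL_STEPS):
        # Uniform sampling over the cache keeps gradient variance fair.
        round_cached, c_idx, z_b_stale = cache[rng.integers(len(cache))]
        z = x_a[c_idx] @ w_a + z_b_stale  # estimated full logits
        p = 1.0 / (1.0 + np.exp(-z))      # sigmoid
        grad = x_a[c_idx].T @ (p - y[c_idx]) / len(c_idx)
        # Staleness-based decay as a simplified reliability weight.
        weight = DECAY ** (round_ - round_cached)
        w_a -= LR * weight * grad

    # Party B updates with the freshly exchanged statistics
    # (the paper treats the parties symmetrically).
    p = 1.0 / (1.0 + np.exp(-(z_a + z_b)))
    w_b -= LR * x_b[idx].T @ (p - y[idx]) / len(idx)

logits = x_a @ w_a + x_b @ w_b
print("train accuracy:", ((logits > 0) == y.astype(bool)).mean())
```

In this sketch, the ratio of LOCAL_STEPS to communication steps is what trades staleness error against communication cost; the paper's uniform sampling and instance weighting are the techniques that keep that trade-off convergent.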