RSA: Byzantine-Robust Stochastic Aggregation Methods for Distributed Learning from Heterogeneous Datasets

Subgradient method · Stochastic gradient descent · Regularization · Quantum Byzantine agreement
DOI: 10.1609/aaai.v33i01.33011544 Publication Date: 2019-09-13T05:18:37Z
ABSTRACT
In this paper, we propose a class of robust stochastic subgradient methods for distributed learning from heterogeneous datasets in the presence of an unknown number of Byzantine workers. The Byzantine workers, during the learning process, may send arbitrary incorrect messages to the master due to data corruptions, communication failures, or malicious attacks, and consequently bias the learned model. The key to the proposed methods is a regularization term incorporated into the objective function so as to robustify the learning task and mitigate the negative effects of Byzantine attacks. The resultant subgradient-based algorithms are termed Byzantine-Robust Stochastic Aggregation methods, justifying the acronym RSA used henceforth. In contrast to most existing algorithms, RSA does not rely on the assumption that the data are independent and identically distributed (i.i.d.) across the workers, and hence fits a wider class of applications. Theoretically, we show that: i) RSA converges to a near-optimal solution, with the learning error dependent on the number of Byzantine workers; ii) the convergence rate of RSA under Byzantine attacks is the same as that of the stochastic gradient descent method, which is free of Byzantine attacks. Numerically, experiments on a real dataset corroborate the competitive performance of RSA and its complexity reduction compared with state-of-the-art alternatives.
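To make the abstract's construction concrete, the following is a minimal sketch of a regularized objective and its stochastic subgradient updates of the kind described above; the symbols (local loss F, master loss f_0, penalty weight \lambda, worker variables x_i, master variable x_0, and the choice of an \ell_1-norm penalty) are illustrative assumptions rather than quotations from the paper.

\min_{x_0,\,\{x_i\}} \;\; f_0(x_0) \;+\; \sum_{i=1}^{m} \Big( \mathbb{E}_{\xi_i}\big[ F(x_i,\xi_i) \big] \;+\; \lambda \,\lVert x_i - x_0 \rVert_1 \Big)

Per-iteration stochastic subgradient updates with step size \alpha^k, where worker i draws sample \xi_i^k:

x_i^{k+1} \;=\; x_i^k \;-\; \alpha^k \Big( \nabla F(x_i^k,\xi_i^k) \;+\; \lambda\, \mathrm{sign}(x_i^k - x_0^k) \Big)

x_0^{k+1} \;=\; x_0^k \;-\; \alpha^k \Big( \nabla f_0(x_0^k) \;+\; \lambda \sum_{i=1}^{m} \mathrm{sign}(x_0^k - x_i^k) \Big)

Under this sketch, each worker's contribution to the master update is a bounded sign vector, so even an arbitrarily corrupted message can shift the aggregate by at most \lambda per coordinate; this bounded influence is the intuition behind converging to a near-optimal solution with a learning error that depends on the number of Byzantine workers.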