Stochastic linearized generalized alternating direction method of multipliers: Expected convergence rates and large deviation properties
Linearization
DOI:
10.1017/s096012952300004x
Publication Date:
2023-03-14T10:42:50Z
AUTHORS (3)
ABSTRACT
The alternating direction method of multipliers (ADMM) receives much attention in the fields of optimization, computer science, etc. The generalized ADMM (G-ADMM) proposed by Eckstein and Bertsekas incorporates an acceleration factor and is more efficient than the original ADMM. However, G-ADMM is not applicable to some models in which the objective function value (or its gradient) is computationally costly or even impossible to compute. In this paper, we consider the two-block separable convex optimization problem with linear constraints, where only noisy estimations of the gradient of the objective function are accessible. Under this setting, we propose a stochastic linearized generalized ADMM (called SLG-ADMM) in which the two subproblems are approximated by certain linearization strategies. In theory, we analyze the expected convergence rates and large deviation properties of SLG-ADMM. In particular, we show that the worst-case expected convergence rates of SLG-ADMM are $\mathcal{O}\left(N^{-1/2}\right)$ and $\mathcal{O}\left(\ln N \cdot N^{-1}\right)$ for solving general convex and strongly convex problems, respectively, where $N$ is the iteration number, similarly hereinafter, and, with high probability, SLG-ADMM has $\mathcal{O}\left(\ln N \cdot N^{-1/2}\right)$ and $\mathcal{O}\left(\left(\ln N\right)^{2} \cdot N^{-1}\right)$ constraint violation bounds and objective error bounds for general convex and strongly convex problems, respectively.
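For intuition about the kind of iteration the abstract describes, the following is a minimal NumPy sketch of a stochastic linearized ADMM-style method with an Eckstein-Bertsekas relaxation factor, applied to an assumed toy lasso instance (minimize (1/2)||Dx - b||^2 + lam*||z||_1 subject to x - z = 0, i.e. A = I, B = -I, c = 0). The mini-batch gradient estimator, the O(1/sqrt(k)) step-size rule, and the relaxation factor alpha are illustrative assumptions, not the paper's exact SLG-ADMM updates.

import numpy as np

rng = np.random.default_rng(0)

# Assumed toy instance (not from the paper): lasso
#   min_x (1/2)||D x - b||^2 + lam * ||z||_1   s.t.  x - z = 0,
# where only noisy (mini-batch) gradients of the smooth part are available.
m, n = 200, 50
D = rng.standard_normal((m, n))
x_true = np.zeros(n)
x_true[:5] = rng.standard_normal(5)
b = D @ x_true + 0.01 * rng.standard_normal(m)
lam, rho, alpha = 0.1, 1.0, 1.5   # alpha in (0, 2): G-ADMM relaxation factor

def noisy_grad(x, batch=20):
    """Unbiased mini-batch estimate of the gradient of (1/2)||Dx - b||^2."""
    idx = rng.choice(m, batch, replace=False)
    return (m / batch) * D[idx].T @ (D[idx] @ x - b[idx])

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

x = np.zeros(n)
z = np.zeros(n)
u = np.zeros(n)                               # scaled dual variable
N = 2000
for k in range(1, N + 1):
    eta = 1.0 / (rho * np.sqrt(k))            # diminishing step size, O(1/sqrt(k))
    # Linearized x-step: one stochastic-gradient step on the augmented Lagrangian
    x = x - eta * (noisy_grad(x) + rho * (x - z + u))
    # Relaxed combination in the spirit of Eckstein-Bertsekas G-ADMM
    x_hat = alpha * x + (1 - alpha) * z
    z = soft_threshold(x_hat + u, lam / rho)  # exact proximal z-step
    u = u + x_hat - z                         # multiplier (dual) update

print("constraint violation ||x - z|| =", np.linalg.norm(x - z))

The linearized x-step replaces an exact subproblem solve with a single stochastic-gradient step, which is what makes the scheme usable when the objective's gradient can only be estimated noisily, matching the setting the abstract assumes.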