Fully stochastic distributed convex optimization on time-varying graph with compression
This paper develops a fully stochastic proximal primal-dual (FSPPD) algorithm for distributed convex optimization. At each iteration, agents communicate over a randomly drawn graph and apply random sparsification to the transmitted messages, while having access only to a stochastic gradient oracle. To the best of our knowledge, this is the first compression-enabled distributed stochastic gradient algorithm on random graphs that utilizes the primal-dual framework. With a diminishing step size, we show that the FSPPD algorithm converges almost surely to the optimal solution of the strongly convex optimization problem. Numerical experiments are provided to verify our results.
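The abstract names two randomized ingredients: random sparsification of transmitted messages and communication over a randomly drawn graph. The paper's exact compressor and mixing rule are not reproduced here; the sketch below shows one standard realization of each under stated assumptions, an unbiased rand-k sparsifier and Metropolis weights on an Erdős–Rényi graph draw. The function names and parameters are hypothetical and not taken from the paper.

```python
import numpy as np

def rand_k_sparsify(x, k, rng):
    """Unbiased rand-k compressor (hypothetical name): keep k random
    coordinates of x and scale them by d/k so that E[C(x)] = x.
    One standard instance of random sparsification; the paper's
    compressor may differ."""
    d = x.size
    out = np.zeros_like(x)
    idx = rng.choice(d, size=k, replace=False)
    out[idx] = x[idx] * (d / k)
    return out

def metropolis_weights(n, p, rng):
    """Draw an Erdos-Renyi graph G(n, p) and return a symmetric,
    doubly stochastic Metropolis mixing matrix: one assumed way to
    model communication on a randomly drawn graph."""
    upper = np.triu(rng.random((n, n)) < p, 1)
    adj = upper | upper.T
    deg = adj.sum(axis=1)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if adj[i, j]:
                W[i, j] = 1.0 / (1 + max(deg[i], deg[j]))
        W[i, i] = 1.0 - W[i].sum()  # self-weight absorbs the remainder
    return W

# Sanity checks: the compressor is unbiased and W is doubly stochastic.
rng = np.random.default_rng(0)
x = rng.normal(size=8)
est = np.mean([rand_k_sparsify(x, 3, rng) for _ in range(20000)], axis=0)
print(np.abs(est - x).max())            # close to 0: E[C(x)] = x
W = metropolis_weights(10, 0.5, rng)
print(np.allclose(W.sum(axis=0), 1.0),  # columns sum to 1
      np.allclose(W.sum(axis=1), 1.0))  # rows sum to 1
```

The d/k scaling is what makes the sparsifier unbiased, a property that analyses of compressed stochastic-gradient methods typically rely on; the doubly stochastic mixing matrix preserves the network-wide average at each communication round.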