An Edge-based Stochastic Proximal Gradient Algorithm for Decentralized Composite Optimization
Ling Zhang, Yu Yan*, Zheng Wang, and Huaqing Li*
International Journal of Control, Automation, and Systems, vol. 19, no. 11, pp. 3598-3610, 2021
Abstract : This paper investigates decentralized composite optimization problems involving a common non-smooth regularization term over an undirected and connected network. Many gradient-based distributed proximal methods exist for this setting, but most of them converge only sublinearly, and proving linear convergence for this class of algorithms is extremely difficult. To set up the problem, we assume that all networked agents share the same non-smooth regularization term, which is the typical circumstance when machine learning tasks are formulated as centralized optimization problems. In this scenario, most existing proximal-gradient algorithms tend to ignore the cost of gradient evaluations, which results in degraded performance. To tackle this problem, we further model each local cost function as the average of a moderate number of local cost subfunctions and develop an edge-based stochastic proximal gradient algorithm (SPG-Edge) by employing a local unbiased stochastic averaging gradient method. When the non-smooth term does not exist, the proposed algorithm reduces to some notable primal-dual algorithms, such as EXTRA and DIGing. Finally, we provide a simplified proof of linear convergence and conduct numerical experiments to illustrate the validity of the theoretical results.
"Decentralized composite optimization, linear convergence, machine learning, proximal-gradient method, stochastic averaging gradient. "