A Distributed Nesterov-Like Gradient Tracking Algorithm for Composite Constrained Optimization

By
Lifeng Zheng; Huaqing Li; Jun Li; Zheng Wang; Qingguo Lü; Yawei Shi; Huiwei Wang; Tao Dong; Lianghao Ji; Dawen Xia

This paper focuses on a constrained optimization problem whose objective function is composed of a smooth (possibly nonconvex) part and a nonsmooth part. The proposed algorithm integrates the successive convex approximation (SCA) technique with a gradient tracking mechanism, which aims at achieving a linear convergence rate, and employs a momentum term to regulate the update direction at each time instant. It is proved that the proposed algorithm converges provided that the constant step size and the momentum parameter are below given upper bounds. When the smooth part is strongly convex, the proposed algorithm converges linearly to the global optimal solution, whereas it converges to a local stationary solution at a sub-linear rate when the smooth part is nonconvex. Numerical simulations demonstrate the validity of the proposed algorithm and the theoretical analysis.
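The paper's exact update rules are given in the full text; as a rough illustration of the ingredients named in the abstract (consensus mixing, gradient tracking, a Nesterov-like momentum term, and a proximal step for the nonsmooth part), the following minimal Python sketch combines them for a composite problem. All symbols here (the mixing matrix `W`, step size `alpha`, momentum parameter `beta`, and the L1 nonsmooth part handled by `soft_threshold`) are illustrative assumptions, not the paper's notation or its precise algorithm.

```python
import numpy as np

def soft_threshold(v, lam):
    """Prox of lam * ||.||_1, used here as an example nonsmooth part."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def distributed_tracking_momentum(grads, x0, W, alpha=0.01, beta=0.5,
                                  lam=0.1, iters=500):
    """Illustrative sketch: each agent i holds a gradient callable of its
    smooth local function f_i; W is a doubly stochastic mixing matrix.
    Combines consensus mixing, gradient tracking (y tracks the average
    gradient), Nesterov-like extrapolation, and a prox step."""
    n = len(grads)
    x = np.tile(x0, (n, 1))          # local copies of the decision variable
    x_prev = x.copy()
    y = np.array([grads[i](x[i]) for i in range(n)])  # gradient trackers
    g_old = y.copy()

    for _ in range(iters):
        # Nesterov-like extrapolation of the local iterates
        v = x + beta * (x - x_prev)
        x_prev = x.copy()
        # consensus mixing + descent along tracked gradients + prox step
        x = soft_threshold(W @ v - alpha * y, alpha * lam)
        # gradient tracking: y_i accumulates changes in the local gradients
        g_new = np.array([grads[i](x[i]) for i in range(n)])
        y = W @ y + g_new - g_old
        g_old = g_new
    return x.mean(axis=0)

# Toy usage: 3 agents, each with f_i(x) = 0.5 * ||A_i x - b_i||^2
rng = np.random.default_rng(0)
A = [rng.standard_normal((5, 3)) for _ in range(3)]
b = [rng.standard_normal(5) for _ in range(3)]
grads = [lambda x, A=A[i], b=b[i]: A.T @ (A @ x - b) for i in range(3)]
W = np.full((3, 3), 1.0 / 3)         # fully connected, uniform weights
x_star = distributed_tracking_momentum(grads, np.zeros(3), W)
```

In the sketch the nonsmooth part is an L1 penalty for concreteness; a constraint set would instead be handled by replacing the soft-thresholding step with a projection (the prox of the set's indicator function).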