A Distributed Nesterov-Like Gradient Tracking Algorithm for Composite Constrained Optimization


By: Lifeng Zheng; Huaqing Li; Jun Li; Zheng Wang; Qingguo Lü; Yawei Shi; Huiwei Wang; Tao Dong; Lianghao Ji; Dawen Xia

This paper addresses constrained optimization problems whose objective function is the sum of a smooth (possibly nonconvex) part and a nonsmooth part. The proposed algorithm integrates the successive convex approximation (SCA) technique with a gradient tracking mechanism, aiming at a linear convergence rate, and employs a momentum term to regulate the update direction at each time instant. Convergence is proved provided that the constant step size and the momentum parameter are below given upper bounds. When the smooth part is strongly convex, the algorithm converges linearly to the global optimal solution; when the smooth part is nonconvex, it converges to a local stationary solution at a sub-linear rate. Numerical simulations demonstrate the validity of the proposed algorithm and the theoretical analysis.
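To make the gradient tracking and momentum ideas concrete, the sketch below shows a generic distributed gradient tracking loop with a Nesterov-like extrapolation step on a toy strongly convex problem. This is an illustrative sketch only, not the paper's exact update (in particular, it omits the SCA surrogate and the nonsmooth/constrained parts); the mixing matrix, step size `alpha`, and momentum `beta` are all assumed values chosen for the toy setup.

```python
import numpy as np

# Illustrative sketch (NOT the paper's exact algorithm): distributed gradient
# tracking with a Nesterov-like momentum term. Each of n agents holds a local
# smooth cost f_i(x) = 0.5*a_i*x^2 + b_i*x, and they jointly minimize
# sum_i f_i(x) by exchanging values through a doubly stochastic matrix W.

n = 3
a = np.array([1.0, 2.0, 3.0])         # local quadratic curvatures (assumed)
b = np.array([1.0, -2.0, 0.5])        # local linear terms (assumed)
x_star = -b.sum() / a.sum()           # global minimizer of sum_i f_i

# Doubly stochastic mixing matrix for a fully connected 3-agent network.
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])

def grad(x):
    """Local gradients, evaluated component-wise per agent."""
    return a * x + b

alpha, beta = 0.1, 0.3                # constant step size and momentum (assumed)
x = np.zeros(n)                       # local estimates
x_prev = x.copy()
y = grad(x)                           # tracker initialized to local gradients

for _ in range(200):
    z = x + beta * (x - x_prev)       # Nesterov-like extrapolation (momentum)
    x_prev = x
    g_old = grad(x)
    x = W @ z - alpha * y             # consensus step along the tracked gradient
    y = W @ y + grad(x) - g_old       # gradient tracking: y_i tracks the
                                      # network-average gradient over time

print(x, x_star)                      # all agents end up near x_star
```

The tracking update preserves the invariant that the agents' trackers sum to the sum of current local gradients, which is what lets a constant step size achieve linear convergence in the strongly convex case; the momentum term only reshapes the direction of each update.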
