IEEE SPS SAM TC Webinar: 9 December 2021, by Yuejie Chi

IEEE SPS SAM Webinar
(Organized by the IEEE SPS Sensor Array and Multichannel Technical Committee (SAM TC))

Title: Scaling and Scalability: Accelerating Ill-conditioned Low-rank Estimation via Scaled Gradient Descent
Date: 9 December 2021
Time: 11:00 AM - 12:00 PM ET (New York time)
Duration: 1 Hour
Presenter: Yuejie Chi

Register for the webinar to receive the Zoom link.

About the topic:

Many problems encountered in sensing and imaging can be formulated as estimating a low-rank object from incomplete, and possibly corrupted, linear measurements; prominent examples include matrix completion and tensor completion. Through the lens of matrix and tensor factorization, one of the most popular approaches is to employ simple iterative algorithms such as gradient descent to recover the low-rank factors directly, which allows for small memory and computation footprints. However, the convergence rate of gradient descent depends linearly, and sometimes even quadratically, on the condition number of the low-rank object, and therefore slows down considerably when the problem is ill-conditioned. This talk introduces a new algorithmic approach, dubbed scaled gradient descent (ScaledGD), that provably converges linearly at a constant rate independent of the condition number of the low-rank object, while maintaining the low per-iteration cost of gradient descent. In addition, a nonsmooth variant of ScaledGD provides further robustness to corruptions by optimizing the least absolute deviation loss. In sum, ScaledGD highlights the power of appropriate preconditioning in accelerating nonconvex statistical estimation, where the iteration-varying preconditioners promote desirable invariance properties of the trajectory with respect to the symmetry in low-rank factorization.
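To illustrate the preconditioning idea, here is a minimal NumPy sketch of a ScaledGD-style update for low-rank matrix factorization. It assumes the simplest setting of a fully observed matrix and a squared loss (the talk covers the more general incomplete and corrupted measurements); the step size, initialization, and problem sizes below are illustrative choices, not the presenter's exact settings.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 3

# Ill-conditioned rank-r ground truth: singular values span 3 orders of magnitude
U, _ = np.linalg.qr(rng.standard_normal((n, r)))
V, _ = np.linalg.qr(rng.standard_normal((n, r)))
M = U @ np.diag([1000.0, 10.0, 1.0]) @ V.T

# Spectral initialization into balanced factors, slightly perturbed
Us, s, Vt = np.linalg.svd(M)
L = Us[:, :r] * np.sqrt(s[:r])
R = Vt[:r].T * np.sqrt(s[:r])
L = L + 0.01 * rng.standard_normal(L.shape)
R = R + 0.01 * rng.standard_normal(R.shape)

eta = 0.5  # constant step size, independent of the condition number
for _ in range(100):
    E = L @ R.T - M   # residual of the current factorization
    gL = E @ R        # gradient of 0.5*||L R^T - M||_F^2 w.r.t. L
    gR = E.T @ L      # gradient w.r.t. R
    # ScaledGD: right-precondition each gradient by the inverse Gram
    # matrix of the *other* factor, evaluated at the current iterate
    L_new = L - eta * gL @ np.linalg.inv(R.T @ R)
    R_new = R - eta * gR @ np.linalg.inv(L.T @ L)
    L, R = L_new, R_new

rel_err = np.linalg.norm(L @ R.T - M) / np.linalg.norm(M)
```

Plain gradient descent on the same problem would need a step size scaled to the largest singular value and would converge at a rate degraded by the condition number; the small r-by-r preconditioners above are cheap to form and remove that dependence.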


About the presenter:

Yuejie Chi

Dr. Yuejie Chi is a Professor in the Department of Electrical and Computer Engineering, and a faculty affiliate with the Machine Learning Department and CyLab, at Carnegie Mellon University. Her research interests lie in the theoretical and algorithmic foundations of data science, signal processing, machine learning, and inverse problems, with applications in sensing and societal systems, broadly defined. Among other honors, Dr. Chi received the inaugural IEEE Signal Processing Society Early Career Technical Achievement Award for contributions to high-dimensional structured signal processing.

For more information, please contact: Xiao Fu or Nuria Gonzalez-Prelcic
