"Scaling and Scalability: Accelerating Ill-conditioned Low-rank Estimation via Scaled Gradient Descent"
Speaker: Yuejie Chi
Time: 11:00 am – 12:00 pm EST (New York time), December 9, 2021
Registration link: https://oregonstate.zoom.us/meeting/register/tJYrcOyvrTwiGtfOISD1o72fHSnaGjLBBwrn
Abstract: Many problems encountered in sensing and imaging can be formulated as estimating a low-rank object from incomplete, and possibly corrupted, linear measurements; prominent examples include matrix completion and tensor completion. Through the lens of matrix and tensor factorization, one of the most popular approaches is to employ simple iterative algorithms such as gradient descent to recover the low-rank factors directly, which allows for small memory and computation footprints. However, the convergence rate of gradient descent depends linearly, and sometimes even quadratically, on the condition number of the low-rank object, and therefore slows down painfully when the problem is ill-conditioned. This talk introduces a new algorithmic approach, dubbed scaled gradient descent (ScaledGD), that provably converges linearly at a constant rate independent of the condition number of the low-rank object, while maintaining the low per-iteration cost of gradient descent. In addition, a nonsmooth variant of ScaledGD provides further robustness to corruptions by optimizing the least absolute deviation loss. In sum, ScaledGD highlights the power of appropriate preconditioning in accelerating nonconvex statistical estimation, where the iteration-varying preconditioners promote desirable invariance properties of the trajectory with respect to the symmetry in low-rank factorization.
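For readers curious about what the preconditioning in ScaledGD looks like in practice, below is a minimal NumPy sketch of a ScaledGD-style iteration for matrix completion. Each factor's gradient is rescaled by the inverse Gram matrix of the other factor, as described in the abstract; the specific loss (least squares over observed entries), the spectral initialization, the step size, and the function name scaled_gd_matrix_completion are illustrative assumptions, not details taken from the talk.

import numpy as np

def scaled_gd_matrix_completion(Y, mask, rank, eta=0.5, n_iters=500):
    # Illustrative sketch (not the speaker's code): ScaledGD-style updates
    # for low-rank matrix completion, where each factor's gradient is
    # preconditioned by the inverse Gram matrix of the other factor so the
    # step is insensitive to the condition number of the low-rank matrix.
    p = mask.mean()  # fraction of observed entries (assumed sampling rate)

    # Spectral initialization from the zero-filled, rescaled observations.
    U, s, Vt = np.linalg.svd((Y * mask) / p, full_matrices=False)
    L = U[:, :rank] * np.sqrt(s[:rank])
    R = Vt[:rank, :].T * np.sqrt(s[:rank])

    for _ in range(n_iters):
        # Gradient of the quadratic loss over observed entries, w.r.t. L @ R.T.
        residual = mask * (L @ R.T - Y) / p
        grad_L = residual @ R
        grad_R = residual.T @ L

        # Scaled (preconditioned) gradient steps.
        L_new = L - eta * grad_L @ np.linalg.inv(R.T @ R)
        R_new = R - eta * grad_R @ np.linalg.inv(L.T @ L)
        L, R = L_new, R_new

    return L @ R.T

In practice one would replace np.linalg.inv with a linear solve and add a stopping criterion; the nonsmooth variant mentioned in the abstract would instead use a subgradient of the least absolute deviation loss in place of the quadratic residual.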
Short Bio: Dr. Yuejie Chi is a Professor in the Department of Electrical and Computer Engineering, and a faculty affiliate of the Machine Learning Department and CyLab, at Carnegie Mellon University. Her research interests lie in the theoretical and algorithmic foundations of data science, signal processing, machine learning, and inverse problems, with applications in sensing and societal systems, broadly defined. Among other honors, Dr. Chi received the inaugural IEEE Signal Processing Society Early Career Technical Achievement Award for contributions to high-dimensional structured signal processing.
This SAM webinar series is organized by Nuria Gonzalez Prelcic and Xiao Fu on behalf of the SAM TC. If you are interested in giving a talk, please contact both of them at:
Dr. Nuria Gonzalez Prelcic, email: ngprelcic@ncsu.edu
Dr. Xiao Fu, email: xiao.fu@oregonstate.edu