1. IEEE Signal Processing Magazine
2. Signal Processing Digital Library
3. Inside Signal Processing Newsletter
4. SPS Resource Center
5. Career advancement & recognition
6. Discounts on conferences and publications
7. Professional networking
8. Communities for students, young professionals, and women
9. Volunteer opportunities
10. Coming soon! PDH/CEU credits
Date: 11-16 December 2023 (In-person)
Location: Coimbatore, Tamil Nadu, India
Date: 11-15 December 2023 (Hybrid)
Location: Bangalore, Karnataka, India
Date: 4-9 December 2023 (Virtual)
Location: Pune, Maharashtra, India
Date: 24-28 November 2023 (Hybrid)
Location: BTKIT Dwarahat, Uttarakhand, India
Date: 1-2 March 2024 (Hybrid)
Location: Rajkot, Gujarat, India
Date: 3-5 November 2023 (In-person)
Location: Colombo, Western Province, Sri Lanka
Date: 3-5 November 2023 (In-person)
Location: Ernakulam, Kerala, India
Date: 25-27 October 2023 (Hybrid)
Location: Chennai, Tamil Nadu, India
Date: 5-7 October 2023 (In-person)
Location: Kalamassery, Kochi, Kerala, India
Date: 1-3 October 2023
Location: Kozhikode, Kerala, India
Date: 18 October 2023
Chapter: North Jersey Chapter
Chapter Chair: Alfredo Tan
Topic: Synthetic Aperture Radar (SAR) Signal Processing Challenges and Data Sets for Associated Research
Date: 21 November 2023
Chapter: German Chapter
Chapter Chair: Wolfgang Utschick
Topic: Differentiable Tools for Digital Twin Networks
Date: 19 October 2023
Chapter: France Chapter
Chapter Chair: William Puech
Topic: Reinforcement Learning meets Federated Learning and Distributional Robustness
Date: 17 October 2023
Chapter: Spain Chapter
Chapter Chair: Javier Prieto
Topic: Nonconvex Optimization Meets Low-Rank Matrix Estimation
Date: 12-14 June 2024
Location: Taichung, Taiwan
Inference tasks in signal processing are often characterized by the availability of reliable statistical modeling with some missing instance-specific parameters. One conventional approach uses data to estimate these missing parameters and then infers based on the estimated model. Alternatively, the data can be used to learn the inference mapping directly, end to end. These approaches for combining partially known statistical models and data in inference are related to the notions of generative and discriminative models used in the machine learning literature [1], [2], which are typically considered in the context of classifiers.
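To make the contrast concrete, here is a minimal sketch (our illustration, not the article's code; the task and all names are hypothetical): a binary detection problem y = a·x + noise where the gain a is the missing instance-specific parameter. The model-based route estimates a and plugs it into the model-implied detector; the discriminative route learns the decision rule end to end from the same data.

```python
# Illustrative sketch: model-based vs. end-to-end inference (hypothetical task).
import numpy as np

rng = np.random.default_rng(0)
a_true, sigma, n_train = 1.5, 0.8, 200

# Binary symbols x in {-1, +1}; observations y = a*x + noise, with the
# gain 'a' playing the role of the missing instance-specific parameter.
x = rng.choice([-1.0, 1.0], size=n_train)
y = a_true * x + sigma * rng.normal(size=n_train)

# Model-based ("generative") route: estimate the missing parameter,
# then plug it into the model-implied detector sign(a_hat * y).
a_hat = np.dot(x, y) / np.dot(x, x)         # least-squares estimate of a

def model_based(y_new):
    return np.sign(a_hat * y_new)

# Data-driven ("discriminative") route: learn the decision rule end to
# end; here, logistic regression trained by plain gradient descent.
w = 0.0
targets = (x + 1) / 2                       # map {-1, +1} -> {0, 1}
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-w * y))        # P(x = +1 | y)
    w -= 0.5 * np.mean((p - targets) * y)   # logistic-loss gradient step

def discriminative(y_new):
    return np.sign(w * y_new)

# On data matching the assumed model the two detectors agree closely;
# they can differ once the model is misspecified.
x_test = rng.choice([-1.0, 1.0], size=1000)
y_test = a_true * x_test + sigma * rng.normal(size=1000)
print("model-based accuracy:   ", np.mean(model_based(y_test) == x_test))
print("discriminative accuracy:", np.mean(discriminative(y_test) == x_test))
```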
Quaternions are still largely misunderstood and are often considered an “exotic” signal representation of little practical utility, even though they have been present in the signal and image processing community for more than 30 years. The main aim of this article is to counter this misconception and to demystify the use of quaternion algebra for solving problems in signal and image processing. To this end, we present a comprehensive and objective overview of the key aspects of quaternion representations, models, and methods, and illustrate our journey through the literature with flagship applications. We conclude with an outlook on the remaining challenges and open problems in quaternion signal and image processing.
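As a minimal sketch of what quaternion algebra buys in this setting (our illustration, not the article's code), the snippet below implements the Hamilton product in plain numpy, shows its non-commutativity, and demonstrates a flagship image-processing idea: encoding an RGB pixel as a pure quaternion so that a single unit-quaternion rotation acts on all three color channels jointly.

```python
# Minimal quaternion sketch: quaternions as length-4 arrays [w, x, y, z].
import numpy as np

def qmul(p, q):
    """Hamilton product of two quaternions."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def qconj(q):
    """Quaternion conjugate (inverse of a unit quaternion)."""
    return q * np.array([1.0, -1.0, -1.0, -1.0])

# Non-commutativity: i * j = k, but j * i = -k.
i, j = np.array([0.0, 1, 0, 0]), np.array([0.0, 0, 1, 0])
print(qmul(i, j), qmul(j, i))

# Encode an RGB pixel as a *pure* quaternion (zero real part) and rotate
# the color vector about the gray axis via u q u^{-1} -- a holistic
# three-channel operation, with no per-channel bookkeeping.
pixel = np.array([0.0, 0.8, 0.2, 0.1])          # (r, g, b) in the imaginary part
axis = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)   # gray axis
theta = np.pi / 6
u = np.concatenate(([np.cos(theta / 2)], np.sin(theta / 2) * axis))
print(qmul(qmul(u, pixel), qconj(u)))           # rotated color, still pure
```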
Deep learning (DL) has been wildly successful in practice, and most of the state-of-the-art machine learning methods are based on neural networks (NNs). Lacking, however, is a rigorous mathematical theory that adequately explains the amazing performance of deep NNs (DNNs). In this article, we present a relatively new mathematical framework that provides the beginning of a deeper understanding of DL. This framework precisely characterizes the functional properties of NNs that are trained to fit data. The key mathematical tools that support this framework include transform-domain sparse regularization, the Radon transform of computed tomography, and approximation theory, which are all techniques deeply rooted in signal processing.
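To give a flavor of the framework's results, here is a representative variational problem from this line of work (a simplified statement with notation assumed by us, not taken verbatim from the article): data fitting with a sparsity-promoting regularizer defined in the Radon domain,

```latex
% Representative (simplified, assumed notation) variational problem:
% data fitting plus Radon-domain sparse regularization.
\min_{f}\ \sum_{i=1}^{N} \ell\bigl(y_i,\, f(x_i)\bigr)
    \;+\; \lambda\,\bigl\| \partial_t^{2}\, \Lambda^{d-1}\, \mathcal{R} f \bigr\|_{\mathcal{M}}
```

Here \(\mathcal{R}\) denotes the Radon transform, \(\Lambda^{d-1}\) a ramp filter, and \(\|\cdot\|_{\mathcal{M}}\) a total-variation-like norm on measures. Representer theorems in this framework show that such problems admit solutions that are single-hidden-layer ReLU networks with at most N neurons, which is one way the sparse-regularization view explains the structure of trained networks.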
Compression is essential for efficient storage and transmission of signals. One powerful method for compression is the application of orthogonal transforms, which convert a group of correlated signal samples into a set of largely decorrelated transform coefficients whose energy is concentrated in a few entries; the small coefficients can then be coarsely quantized or discarded with little loss.
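The following sketch (our illustration, not the article's code) makes this concrete: it builds an orthonormal DCT-II matrix, transforms a smooth test signal, keeps only the largest coefficients, and reconstructs with the transpose, which is the inverse for an orthonormal transform.

```python
# Transform-coding sketch: orthonormal DCT, keep-largest compression.
import numpy as np

def dct_matrix(n):
    """Orthonormal DCT-II matrix; rows are the basis vectors."""
    k = np.arange(n)[:, None]
    m = np.arange(n)[None, :]
    C = np.sqrt(2.0 / n) * np.cos(np.pi * (m + 0.5) * k / n)
    C[0, :] /= np.sqrt(2.0)                  # first row scaled for orthonormality
    return C

n, keep = 256, 16
t = np.linspace(0, 1, n)
signal = np.sin(2 * np.pi * 3 * t) + 0.5 * np.cos(2 * np.pi * 7 * t)

C = dct_matrix(n)
coeffs = C @ signal                          # forward transform
idx = np.argsort(np.abs(coeffs))[:-keep]     # indices of all but the largest
coeffs[idx] = 0.0                            # "compress": drop small coefficients
reconstruction = C.T @ coeffs                # orthonormal => inverse is C.T

err = np.linalg.norm(signal - reconstruction) / np.linalg.norm(signal)
print(f"kept {keep}/{n} coefficients, relative error {err:.2e}")
```

Because the basis is orthonormal, zeroing the smallest coefficients is the best k-term approximation in the least-squares sense, which is what makes energy-compacting transforms so effective for compression.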