
SPM Articles: September 2023

Discriminative and Generative Learning for the Linear Estimation of Random Signals

Inference tasks in signal processing are often characterized by the availability of reliable statistical modeling with some missing instance-specific parameters. One conventional approach uses data to estimate these missing parameters and then infers based on the estimated model. Alternatively, data can be leveraged to directly learn the inference mapping end to end. These approaches for combining partially known statistical models and data in inference are related to the notions of generative and discriminative models used in the machine learning literature [1], [2], typically considered in the context of classifiers.
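
To make the distinction concrete, here is a minimal sketch (our illustration, not the article's method) in the simplest linear setting: estimating a zero-mean scalar signal x from a noisy observation y = x + n, where the noise variance is assumed known and the signal variance is the missing instance-specific parameter. The generative route estimates that variance from training data and plugs it into the model-based linear minimum mean-square error (LMMSE) weight; the discriminative route learns the linear weight directly from (y, x) pairs by least squares. With enough data, the two recover the same mapping.

```python
# Generative vs. discriminative linear estimation (illustrative sketch):
# estimate a zero-mean signal x from y = x + n, noise variance sigma2 known.
import numpy as np

rng = np.random.default_rng(0)
n_train, sigma2 = 10_000, 0.5
x = rng.normal(0.0, 2.0, size=n_train)                  # signal: variance 4, treated as unknown
y = x + rng.normal(0.0, np.sqrt(sigma2), size=n_train)  # noisy observations

# Generative route: estimate the missing parameter (signal variance) from data,
# then plug it into the model-based LMMSE weight  w = var_x / (var_x + sigma2).
var_x_hat = np.var(x)
w_generative = var_x_hat / (var_x_hat + sigma2)

# Discriminative route: bypass the model and learn the mapping x_hat = w * y
# end to end, by least squares on the training pairs (y, x).
w_discriminative = (y @ x) / (y @ y)

print(w_generative, w_discriminative)  # both approach var_x / (var_x + sigma2) ~ 0.89
```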


Quaternions in Signal and Image Processing: A comprehensive and objective overview

Quaternions are still largely misunderstood and often considered an “exotic” signal representation without much practical utility, despite having been part of the signal and image processing community for more than 30 years. The main aim of this article is to counter this misconception and to demystify the use of quaternion algebra for solving problems in signal and image processing. To this end, we propose a comprehensive and objective overview of the key aspects of quaternion representations, models, and methods, and illustrate our journey through the literature with flagship applications. We conclude with an outlook on the remaining challenges and open problems in quaternion signal and image processing.
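
For readers who have not met the algebra, the snippet below is a self-contained sketch (ours, not taken from the article) of the Hamilton product on quaternions q = w + xi + yj + zk, represented here as NumPy arrays [w, x, y, z]. Its non-commutativity, verified at the end, is one of the features that distinguishes quaternion models from their complex-valued counterparts.

```python
# Hamilton product of quaternions, stored as arrays [w, x, y, z] (illustrative).
import numpy as np

def hamilton_product(q1, q2):
    """Quaternion product; note that it is NOT commutative."""
    w1, x1, y1, z1 = q1
    w2, x2, y2, z2 = q2
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

i = np.array([0.0, 1.0, 0.0, 0.0])
j = np.array([0.0, 0.0, 1.0, 0.0])
print(hamilton_product(i, j))  # [0, 0, 0,  1] =  k
print(hamilton_product(j, i))  # [0, 0, 0, -1] = -k, so i*j != j*i
```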


Deep Learning Meets Sparse Regularization: A signal processing perspective

Deep learning (DL) has been wildly successful in practice, and most of the state-of-the-art machine learning methods are based on neural networks (NNs). Lacking, however, is a rigorous mathematical theory that adequately explains the amazing performance of deep NNs (DNNs). In this article, we present a relatively new mathematical framework that provides the beginning of a deeper understanding of DL. This framework precisely characterizes the functional properties of NNs that are trained to fit data. The key mathematical tools that support this framework include transform-domain sparse regularization, the Radon transform of computed tomography, and approximation theory, all techniques deeply rooted in signal processing.
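
To ground the phrase “sparse regularization,” the snippet below shows its textbook instance (a simplified illustration on our part, not the framework of the article): an ℓ1-regularized least-squares problem solved by the iterative soft-thresholding algorithm (ISTA), a standard sparse recovery method from signal processing.

```python
# Sparse regularization in its simplest form (illustrative, not the article's
# framework): solve  minimize_x  0.5*||y - A x||^2 + lam*||x||_1  with ISTA.
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 50, 100, 5
A = rng.normal(size=(m, n)) / np.sqrt(m)     # random sensing matrix
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)  # k-sparse signal
y = A @ x_true + 0.01 * rng.normal(size=m)   # noisy measurements

lam = 0.05
step = 1.0 / np.linalg.norm(A, 2) ** 2       # 1/L, L = Lipschitz constant of the gradient
x = np.zeros(n)
for _ in range(500):
    z = x - step * (A.T @ (A @ x - y))       # gradient step on the data-fit term
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-thresholding

print(np.count_nonzero(x))  # most entries are exactly zero: a sparse estimate
```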


The Discrete Cosine Transform and Its Impact on Visual Compression: Fifty Years From Its Invention

Compression is essential for efficient storage and transmission of signals. One powerful method for compression is the application of orthogonal transforms, which convert a group of N data samples into a group of N transform coefficients. In transform coding, the N samples are first transformed, and the coefficients are then individually quantized and entropy coded into binary bits. The transform serves two purposes: one is to compact the energy of the original N samples into coefficients with increasingly smaller variances, so that removing the smaller coefficients causes negligible reconstruction error; the other is to decorrelate the original samples, so that the coefficients can be quantized and entropy coded individually without losing compression performance.
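
As a concrete miniature of this pipeline (a sketch under simplified assumptions: small coefficients are discarded outright rather than quantized and entropy coded), the snippet below builds the orthonormal DCT-II matrix, transforms a correlated block of N = 8 samples, keeps only the three largest-magnitude coefficients, and inverts the transform, illustrating both energy compaction and the resulting small reconstruction error.

```python
# Minimal transform-coding sketch (illustrative only): orthonormal DCT-II,
# coefficient truncation in place of quantization/entropy coding, inverse DCT.
import numpy as np

N = 8
n = np.arange(N)
# Orthonormal DCT-II matrix: C[k, m] = sqrt(2/N) * cos(pi * (2m + 1) * k / (2N)),
# with the k = 0 (DC) row scaled by 1/sqrt(2).
C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
C[0, :] /= np.sqrt(2.0)
assert np.allclose(C @ C.T, np.eye(N))       # the transform is orthogonal

x = np.cumsum(np.random.default_rng(0).normal(size=N))  # smooth, correlated block
c = C @ x                                    # N samples -> N transform coefficients

c_kept = c.copy()
c_kept[np.argsort(np.abs(c))[:-3]] = 0.0     # keep only the 3 largest coefficients
x_hat = C.T @ c_kept                         # inverse transform (C^-1 = C^T)

rel_err = np.linalg.norm(x - x_hat) / np.linalg.norm(x)
print(rel_err)  # small for correlated data: the DCT compacts energy into few terms
```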


Reflecting on the Success of ICASSP 2023

As we gear up for the International Conference on Acoustics, Speech, and Signal Processing (ICASSP) 2024, it is essential to take a moment to celebrate the achievements and highlights of ICASSP 2023, which took place on Rhodes Island, Greece, this past June. ICASSP 2023 was a momentous event, as it marked the first postpandemic ICASSP and the return to in-person meetings. With the theme “Signal Processing in the AI Era,” the conference underscored the strong connection between signal processing and machine learning, highlighting the pivotal role of signal processing in shaping the development of artificial intelligence (AI).
