As humans, we cannot be indifferent to the increasing number of dramatic events taking place in the world: fires, tornadoes, floods, and, recently, the collapse of a huge block of the Marmolada glacier in the Italian Alps. All are clear evidence of the global warming of the Earth. As scientists in signal and image processing and in the data sciences, domains marked by great economic growth and a high consumption of energy and Earth resources, we must take these events into account and ponder the utility to humans, and the impact on them, of the research work we are doing.
This is especially true for trending domains of research such as advanced telecommunications or artificial intelligence (AI) and machine learning (ML). In the latter, we observe more and more studies using these fashionable approaches, yet too often without any relevant justification. In fact, the first questions to ask should be [1]: Is ML or deep learning (DL) really necessary for solving this problem? Does a simpler and more economical method exist? However, many scientists do not consider such questions.
In addition, when benchmarking such approaches, which are typically very energy and memory consuming, against more classical ones, many scientists rely on a metric based only on performance, such as the mean square error. Such a comparison is unfair, and I think it should be mandatory to take into account at least the computational and memory complexity. This could be done by using criteria like the Akaike information criterion (AIC) or the Bayesian information criterion (BIC), or by using a cost function with a regularization term related to the complexity. Such a criterion selects a solution with a relevant balance between performance and complexity. This approach is used in the "Lecture Notes" column ("The Monte-Carlo Sampling Approach to Model Selection: A Primer") on page 85 of this issue of IEEE Signal Processing Magazine (SPM). Again, this principle is nothing but the idea of Occam's razor, which promotes the simplest (but still very efficient) solutions.
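To give a rough feel for how such complexity-aware scoring works in practice, the short sketch below fits polynomial models of increasing order to synthetic noisy data and scores each one with Gaussian-likelihood forms of the AIC and BIC. The data, model family, and parameter count are purely illustrative assumptions, not taken from the column cited above: the raw residual error keeps decreasing as the model grows, while both criteria penalize the extra parameters and point back toward the simplest adequate model.

```python
import numpy as np

# Illustrative sketch (assumed example, not from the cited column):
# compare polynomial fits of increasing order by AIC/BIC rather than
# by raw fit error alone.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 200)
# True generating model is a second-order polynomial plus noise.
y = 1.0 + 2.0 * x - 1.5 * x**2 + 0.1 * rng.standard_normal(x.size)

n = x.size
for order in range(1, 7):
    coeffs = np.polyfit(x, y, deg=order)       # least-squares fit
    residuals = y - np.polyval(coeffs, x)
    rss = np.sum(residuals**2)                 # residual sum of squares
    k = order + 1                              # number of fitted parameters
    # Gaussian-likelihood forms of the criteria (up to an additive constant):
    aic = n * np.log(rss / n) + 2 * k
    bic = n * np.log(rss / n) + k * np.log(n)
    print(f"order={order}  RSS={rss:.3f}  AIC={aic:.1f}  BIC={bic:.1f}")
```

Running this, the residual error alone would favor the highest-order model, whereas the AIC and BIC penalties flag the additional coefficients as wasted complexity, which is exactly the balance between performance and complexity argued for above.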