This year's ICASSP was one of the largest I have attended, with more than 3,000 participants. During the opening remarks, SPS president Ali H. Sayed announced that the student membership fee has been set to $1, that a new open-access journal for signal processing is on the way, that IEEE SPS has issued a formal policy statement on its commitment to diversity, and that an e-learning center is being launched, all great steps toward a more open society.
Among the tutorial sessions, three covered deep learning and its applications, and eight focused on other aspects of signal processing, including quantum computation. Among the IEEE SPS best paper awards, two papers came from speech and language processing (TASLP): one on speech enhancement (Y. Xu et al., "A Regression Approach to Speech Enhancement Based on Deep Neural Networks," IEEE TASLP 2015) and one on sentence embedding for information retrieval and search engines (H. Palangi et al., "Deep Sentence Embedding Using Long Short-Term Memory Networks: Analysis and Application to Information Retrieval," IEEE TASLP 2016).
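For readers curious about the first award paper's framing of enhancement as regression, here is a minimal sketch of that idea: a feed-forward network maps stacked noisy log-spectral frames to the clean spectrum of the center frame and is trained with mean-squared error. The layer sizes, feature dimension, and context width below are illustrative assumptions, not the paper's exact configuration.

```python
import torch
import torch.nn as nn

n_freq = 257    # STFT bins for a 512-point FFT (assumed, not from the paper)
context = 7     # number of noisy frames stacked as input context (assumed)

# Feed-forward regressor from stacked noisy frames to one clean frame.
net = nn.Sequential(
    nn.Linear(n_freq * context, 2048), nn.ReLU(),
    nn.Linear(2048, 2048), nn.ReLU(),
    nn.Linear(2048, n_freq),           # predicted clean log-spectrum
)
opt = torch.optim.Adam(net.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

# One training step on placeholder features standing in for real STFT data.
noisy = torch.randn(32, n_freq * context)
clean = torch.randn(32, n_freq)
opt.zero_grad()
loss = loss_fn(net(noisy), clean)
loss.backward()
opt.step()
```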
A few papers caught my attention. "Understanding Deep Neural Networks through Input Uncertainties" by J. J. Thiagarajan et al. addresses questions like `which input features helped the decision' of a neural network model. This is an important problem for the interpretability of deep neural networks, particularly because it opens the door to `teaching' the model which parts of the input data to rely on for specific decisions. Another paper, "Investigation of Sampling Techniques for Maximum Entropy Language Modeling Training" by Xie Chen et al., presents a thorough empirical study exploring various sampling techniques for language model training on several large-scale datasets.
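To make the `which input features helped the decision' question concrete, below is a minimal gradient-saliency sketch, one common way to probe a trained network for per-feature influence. The uncertainty-based method in the Thiagarajan et al. paper is more sophisticated than this; the model and input here are placeholders.

```python
import torch
import torch.nn as nn

# Placeholder classifier; any trained model could be probed the same way.
model = nn.Sequential(nn.Linear(20, 64), nn.ReLU(), nn.Linear(64, 5))

x = torch.randn(1, 20, requires_grad=True)   # one input example
logits = model(x)
score = logits[0, logits.argmax()]           # score of the predicted class
score.backward()

# Gradient magnitude as a crude per-feature influence estimate.
saliency = x.grad.abs().squeeze()
print("most influential inputs:", saliency.topk(5).indices.tolist())
```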
I also had the chance to look at a few interesting papers in the area of linear inverse problems and sparse decomposition. "Bayesian Neural Networks for Sparse Coding" by D. Kuzin et al. makes interesting use of Bayesian neural networks for reconstruction. "Fast Compressive Sensing Recovery Using Generative Models with Structured Latent Variables," from Justin Romberg's group at Georgia Tech, is a valuable study on using structured latent variables for faster reconstruction in compressed sensing. And "Deep Signal Recovery with One-bit Quantization," from Yonina Eldar, Mojtaba Soltanalian, and collaborators, is inspiring work that proposes DeepRec for reconstructing sparse signals from their one-bit noisy measurements; it reminded me of Yonina's great 2012 review paper on structured compressed sensing, which, with its many interesting observations, helped me learn more about compressive sensing.
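To make the one-bit setting concrete, here is a minimal sketch of the measurement model that paper tackles: only the sign of each noisy linear measurement is observed, and the goal is to recover the sparse signal from those bits. The dimensions and noise level below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 256, 512, 10   # signal length, measurements, sparsity (assumed)

# k-sparse ground-truth signal.
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)

# Only the sign of each noisy linear measurement is kept.
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = np.sign(A @ x + 0.05 * rng.standard_normal(m))

# DeepRec learns to invert this sign map with a deep network; a classical
# baseline would instead solve a convex program consistent with the bits y.
```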