The Video and Image Processing Cup (VIP Cup) competition encourages teams of students to work together to solve real-world problems using video and image processing methods. Three finalist teams are chosen to present their work during ICIP and compete for the US$5,000 grand prize!
Technical Committees interested in submitting a call for proposals for upcoming VIP Cup competitions should visit the Technical Committees page for more information.
[Sponsored by the IEEE Signal Processing Society]
Every person spends around one third of their life in bed. For an infant or a young toddler this fraction can be much higher, and for bed-bound patients it can approach 100% of their time. Automatic non-contact human pose estimation has received significant attention in the artificial intelligence (AI) community in recent years, thanks to the introduction of deep learning and its power in AI modeling. However, the state-of-the-art vision-based AI algorithms in this field can hardly cope with the challenges of in-bed human behavior monitoring, which include significant illumination changes (e.g., full darkness at night), heavy occlusion (e.g., the subject covered by a sheet or a blanket), and privacy concerns that hinder the large-scale data collection necessary for training any AI model.
Theoretically, transferring pose estimation from labeled uncovered cases to unlabeled covered ones can be framed as a domain adaptation problem. Although multiple datasets exist for human pose estimation, they consist mainly of RGB images of daily activities, which exhibit a large domain shift from our target setting. And although domain adaptation has been studied in machine learning for decades, mainstream algorithms focus on classification rather than regression tasks such as pose estimation. The 2021 VIP Cup challenge is, in effect, a domain adaptation problem for regression with a practical application to in-bed human pose estimation, which has not been addressed before.
In this 2021 VIP Cup challenge, we seek computer-vision-based solutions for in-bed pose estimation under the covers, where no annotations are available for covered cases during model training, while contestants have access to large amounts of labeled data for no-cover cases. The successful completion of this task would enable in-bed behavior monitoring technologies to work on novel subjects and environments where no prior training data is accessible. For more information about the competition, please visit: IEEE VIP Cup 2021
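The setup above (a regressor trained on a labeled source domain, then applied to an unlabeled, shifted target domain) can be illustrated with a toy sketch. Everything here is synthetic and hypothetical, not the VIP Cup dataset or any competition baseline: a 1-D "feature" stands in for image content, a 1-D label stands in for a keypoint coordinate, and the "cover" is modeled as a simple corruption of the observed feature. The sketch shows one of the simplest unsupervised adaptations, matching target feature statistics to source statistics, which needs no target labels.

```python
import random
import statistics as stats

random.seed(0)
n = 1000

# Source domain: labeled no-cover samples (synthetic stand-ins).
x_src = [random.gauss(0, 1) for _ in range(n)]
y_src = [3.0 * x for x in x_src]  # "keypoint" label depends linearly on feature

# Target domain: the cover corrupts the observed feature, labels are hidden.
x_tgt_true = [random.gauss(0, 1) for _ in range(n)]
y_tgt = [3.0 * x for x in x_tgt_true]       # held out, used only for evaluation
x_tgt_obs = [2.0 * x + 1.5 for x in x_tgt_true]  # what the model actually sees

# Fit a least-squares slope on labeled source data only.
slope = sum(x * y for x, y in zip(x_src, y_src)) / sum(x * x for x in x_src)

# Naive transfer: apply the source model to the raw covered observations.
err_naive = stats.mean(abs(slope * x - y) for x, y in zip(x_tgt_obs, y_tgt))

# Unsupervised adaptation: align target feature mean/std to source statistics.
mu_s, sd_s = stats.mean(x_src), stats.pstdev(x_src)
mu_t, sd_t = stats.mean(x_tgt_obs), stats.pstdev(x_tgt_obs)
x_adapted = [(x - mu_t) / sd_t * sd_s + mu_s for x in x_tgt_obs]
err_adapted = stats.mean(abs(slope * x - y) for x, y in zip(x_adapted, y_tgt))

print(f"naive error: {err_naive:.2f}, adapted error: {err_adapted:.2f}")
```

Statistics matching is far simpler than the deep-learning methods the challenge calls for, but it captures the core idea: the regressor itself never sees target labels, and all adaptation happens on the unlabeled target inputs.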
Each team must be composed of: (i) one faculty member (the Supervisor); (ii) at most one graduate student (the Tutor); and (iii) at least 3 but no more than 10 undergraduates. At least three of the undergraduate team members must be either IEEE Signal Processing Society (SPS) members or SPS student members.
Augmented Cognition Lab at Northeastern University.
Grand Prize - Team Name: Samaritan
University: Bangladesh University of Engineering and Technology
Supervisor: Mohammad Ariful Haque
Sawradip Saha, Sanjay Acharjee, Aurick Das, Shahruk Hossain, Shahriar Kabir
First Runner-Up - Team Name: PolyUTS
University: The Hong Kong Polytechnic University & University of Technology Sydney
Supervisor: Kin-Man Lam
Tutor: Tianshan Liu
Zi Heng Chi, Shao Zhi Wang, Chun Tzu Chang, Xin Yue Li, Akshay Holkar, Samantha Pronger, Md Islam
Second Runner-Up - Team Name: NFPUndercover
University: University of Moratuwa
Supervisor: Chamira Edussooriya
Tutor: Ashwin De Silva
Jathurshan Pradeepkumar, Udith Haputhanthri, Mohamed Afham Mohamed Aflal, Mithunjha Anandakumar