

VIP Cup 2024 at ICIP 2024

IEEE VIP Cup 2024 at IEEE ICIP 2024

IEEE SPS Video and Image Processing Cup at IEEE ICIP 2024
SS-OCT Image Analysis

IEEE ICIP 2024 Website | 27-30 October 2024 | 2024 VIP Cup Official Document
[Sponsored by the IEEE Signal Processing Society]
 

Optical Coherence Tomography (OCT) is a non-invasive retinal imaging technique widely used for the diagnosis and treatment of many eye-related diseases. Anomalies such as Age-related Macular Degeneration (AMD), Diabetic Retinopathy (DR), and Diabetic Macular Edema (DME) can be diagnosed from OCT images. Because early and accurate diagnosis of eye-related diseases is critical, obtaining high-resolution, clear OCT images is of great importance. The analysis and processing of OCT images has therefore become one of the most important and widely applicable areas of biomedical image processing.

Many processing tasks have been applied to OCT images, including image super-resolution, de-noising, reconstruction, classification, and segmentation. Despite the many algorithms devoted to OCT image analysis, there is still a need to improve both the quality of the resulting images and the accuracy of classification. This challenge is therefore dedicated to the problem of OCT image enhancement and classification.

Task Description

The challenge contains the following three tasks:

  1. De-noising of noisy OCT images. Since many captured OCT images are noisy, which can significantly reduce the accuracy of diagnosing eye-related diseases, de-noising is one of the most important steps in OCT image analysis. This task is therefore dedicated to OCT image de-noising: participants must de-noise the provided noisy OCT B-scans and produce the best possible results. A sample noisy B-scan is shown in Figure 1, and a minimal baseline sketch for this task is given after the task list.

     

  2. Super-resolution. To prevent motion artifacts, OCT images are often captured at rates lower than the nominal sampling rate, which results in low-resolution images. Using super-resolution methods, high-resolution images can be reconstructed from the low-resolution ones. This task is dedicated to the super-resolution problem: the aim is to obtain high-resolution OCT B-scans from low-resolution OCT B-scans.
  3. Volume-based classification of the OCT dataset into several sub-classes. The aim of this task is to classify observed cases (each comprising several B-scans) into three classes: healthy (0), diabetic patients with DME (1), and non-diabetic patients with other ocular diseases (2).
    Keywords: Optical Coherence Tomography (OCT), De-noising, Super-resolution, Classification.
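As a rough illustration of the Task 1 setup, the following is a minimal Python sketch of a naive de-noising baseline and a PSNR check, assuming B-scans are available as 2D arrays with values in [0, 255]; the speckle-like noise model, the median filter, and the metric are illustrative assumptions only, not the official baseline or evaluation protocol.

```python
# Minimal sketch: naive OCT B-scan de-noising baseline plus a PSNR check.
# Assumptions: B-scans as 2D NumPy arrays in [0, 255]; not the official metric.
import numpy as np
from scipy.ndimage import median_filter

def psnr(reference: np.ndarray, estimate: np.ndarray, peak: float = 255.0) -> float:
    """Peak signal-to-noise ratio (dB) between a clean reference and an estimate."""
    mse = np.mean((reference.astype(np.float64) - estimate.astype(np.float64)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(peak ** 2 / mse)

# Toy "B-scan": smooth structure plus speckle-like multiplicative noise.
rng = np.random.default_rng(0)
clean = np.clip(np.cumsum(rng.standard_normal((256, 256)), axis=1) * 5 + 128, 0, 255)
noisy = np.clip(clean * rng.gamma(shape=4.0, scale=0.25, size=clean.shape), 0, 255)

denoised = median_filter(noisy, size=3)  # stand-in for a learned de-noiser
print(f"noisy    PSNR: {psnr(clean, noisy):.2f} dB")
print(f"denoised PSNR: {psnr(clean, denoised):.2f} dB")
```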

Full technical details, dataset(s), evaluation metrics, and all other pertinent information about the competition are located in the 2024 VIP Cup Official Document.

Important Dates

  • Challenge announcement:  January 2024
  • Release of the training dataset: 31 January 2024
  • Team Registration Deadline: 30 March 2024 (Register and submit team's work)
  • Release of the test dataset 1: 30 April 2024 (Download dataset)
  • Final Submission of Team’s Work Deadline: 15 June 2024
  • Announcement of 3 finalist teams: 15 July 2024
  • Final competition at ICIP 2024: October 27-30, 2024

IMPORTANT! The final report, containing the group information, a comprehensive description of the methodology, sample outputs, results of the training and test phases, evaluation criteria with selected ROCs, any other information about the algorithms, and a link for downloading the results of the algorithm, should be sent by email to misp@mui.ac.ir.

Registration and Important Resources

Official VIP Cup Team Registration

  • All teams MUST be registered through the official competition registration system before the deadline in order to be considered as a participating team. Teams MUST also acknowledge and agree to the SPS Student Terms and Conditions, and meet all eligibility requirements at the time of team registration as well as throughout the competition. The Agreement Form can be found in the Official Terms & Conditions document linked in the top section of this page.
  • Registration Link: Register your team for the 2024 VIP Cup before 30 March 2024 and submit work before 15 June 2024! 
  • Download the training dataset!

Competition Organizers

Medical Image and Signal Processing Research Center, Isfahan University of Medical Sciences, Isfahan, Iran.

  • Professor Hossein Rabbani, Medical Image and Signal Processing Research Center, Isfahan University of Medical Sciences, Isfahan, Iran
  • Dr. Azhar Zam, Tandon School of Engineering, New York University, Brooklyn, NY, 11201, USA  and Division of Engineering, New York University Abu Dhabi (NYUAD), Abu Dhabi, United Arab Emirates
  • Dr. Farnaz Sedighin, Medical Image and Signal Processing Research Center, Isfahan University of Medical Sciences, Isfahan, Iran
  • Dr. Parisa Ghaderi-Daneshmand, Medical Image and Signal Processing Research Center, Isfahan University of Medical Sciences, Isfahan, Iran
  • Dr. Mahnoosh Tajmirriahi, School of Advanced Technologies in Medicine, Isfahan University of Medical Sciences, Isfahan, Iran
  • Dr. Alireza Dehghani, Department of Ophthalmology, School of Medicine, Isfahan University of Medical Sciences, Isfahan, Iran  and Didavaran Eye Clinic, Isfahan, Iran
  • Mohammadreza Ommani, Didavaran Eye Clinic, Isfahan, Iran
  • Arsham Hamidi, Biomedical Laser and Optics Group (BLOG), Department of Biomedical Engineering, University of Basel, Basel, Switzerland

Finalist Teams

Grand Prize Recipient:
Team Name: IITRPR-OCT-Diagnose-2024
University: Indian Institute of Technology Ropar
Supervisor: Dr. Puneet Goyal
Tutor: Joy Dhar
Undergraduate Students:
Ankush Naskar, Ashish Gupta, Hemlata Gautam, Satvik Srivastava, Utkarsh Patel

First Runner Up:
Team Name: Ultrabot
University: Vietnamese German University
Supervisor: Cuong Nguyen Tuan
Tutor: Chau Truong Vinh Hoang
Undergraduate Students: 
Duong Tran Hai, Huy Nguyen Minh Nhat, Phuc Nguyen Song Thien, Triet Dao Hoang Minh

Second Runner Up:
Team Name: The Classifiers
University: Isfahan University of Technology
Supervisor: Mohsen Pourazizi
Tutor: Sajed Rakhshani
Undergraduate Students: 
Amirali Arbab, Amirhossein Arbab, Aref Habibi

Contacts

  • Competition Organizers (technical, competition-specific inquiries): 
    Medical Image and Signal Processing Research Center, Isfahan University of Medical Sciences, Isfahan, Iran. Email: misp@mui.ac.ir
  • SPS Staff (Terms & Conditions, Travel Grants, Prizes): Jaqueline Rash, SPS Membership Program and Events Administrator
  • SPS Student Services Committee: Angshul Majumdar, Chair

 



VIP Cup 2023 at ICIP 2023

IEEE VIP Cup 2023 at IEEE ICIP 2023

IEEE SPS VIP Cup at ICIP 2023
IEEE SPS VIP Cup 2023: Ophthalmic Biomarker Detection

IEEE ICIP 2023 Website | Sunday, 8 October 2023 | VIP Cup 2023 Website
[Sponsored by the IEEE Signal Processing Society]
 

The IEEE SPS Video and Image Processing Cup (VIP Cup) student competition, presented by the IEEE Signal Processing Society, gives students the opportunity to work together to solve real-life problems using video and image processing methods. After students submit their work, three final teams are selected to present their work and compete for the grand prize at ICIP 2023. 

Interested in competing? The submission deadline is August 27, 2023. For full competition details, eligibility requirements, and team registration, download the 2023 VIP Cup Official Document.

  • Supported by Georgia Institute of Technology (Georgia Tech)
  • ICIP Committee
  • IVMSP (Image, Video, and Multidimensional Signal Processing) Technical Committee

Ophthalmic Biomarker Detection

Ophthalmic clinical trials that study treatment efficacy of eye diseases are performed with a specific purpose and a set of procedures that are predetermined before trial initiation. Hence, they result in a controlled data collection process with gradual changes in the state of a diseased eye. In general, these data include 1D clinical measurements and 3D optical coherence tomography (OCT) imagery. Physicians interpret structural biomarkers from the 3D OCT images and clinical measurements to make personalized decisions for every patient.

Two main challenges in medical image processing have been generalization and personalization. Teams are asked to predict the presence or absence of six different biomarkers simultaneously on every OCT scan in the held-out test set. There will be two phases to the competition leading to the final competition at ICIP 2023.

PHASE 1: Teams register for the competition challenge through the official 2023 VIP Cup Registration System and by filling out this form. Both steps need to be completed in order to register your team and compete.

The leaderboard for PHASE 1 will be hosted on Codalab: HERE.

PHASE 2: The TOP 10 TEAMS from the PHASE 1 leaderboard will be invited to re-train their models and submit the same biomarker prediction csv files for each image in the test set to assess how well the model is able to personalize.

Results for PHASE 2 will be published HERE.
Note: In the event of a tie, the performance in phase 1 will determine the winner.

FINAL COMPETITION: The teams will present their work at ICIP 2023.

Full details of this competition, including team eligibility and how to submit your work, can be found in the 2023 VIP Cup Official Document.

Each team member needs to read the full document.

Data Set

To download the dataset, visit Zenodo.

Challenge Evaluation Criteria

To measure the performance of the biomarker detection task, we will make use of the macro averaged F1-score.


In its alternative form, the per-class F1 score is F1 = 2·TP / (2·TP + FP + FN).

The macro-averaged F1 score (or macro F1 score) is computed using the arithmetic mean (aka unweighted mean) of all the per-class F1 scores.

This method treats all classes equally regardless of their support values.
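As an illustration only (not the official evaluation script), the following minimal Python sketch computes the macro-averaged F1 score for multi-label predictions with scikit-learn; the 6-column arrays are toy stand-ins for the per-scan biomarker presence/absence labels described above.

```python
# Minimal sketch: macro-averaged F1 for multi-label biomarker predictions.
# Toy labels only; not the official evaluation code.
import numpy as np
from sklearn.metrics import f1_score

y_true = np.array([[1, 0, 1, 0, 0, 1],
                   [0, 1, 0, 0, 1, 0],
                   [1, 1, 0, 1, 0, 0]])
y_pred = np.array([[1, 0, 0, 0, 0, 1],
                   [0, 1, 0, 1, 1, 0],
                   [1, 0, 0, 1, 0, 0]])

# average="macro" takes the unweighted mean of the per-class F1 scores,
# so every biomarker counts equally regardless of its support.
print(f1_score(y_true, y_pred, average="macro"))
```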

Team Formation Eligibility

Each participating team should be composed of one faculty member or someone with a PhD degree employed by the university (the Supervisor), at most one graduate student (the Tutor), and at least three, but no more than ten, undergraduate students. At least three of the undergraduate team members must hold either regular or student memberships of the IEEE Signal Processing Society. Undergraduate students who are in the first two years of their college studies, as well as high school students who are capable of contributing, are welcome to participate in a team. A participant cannot be on more than one team.

Prize for Finalists

The three teams with the highest performance in the open competition, based on the above criteria, will be selected as finalists and invited to participate in the final competition at ICIP 2023. The champion team will receive a grand prize of $5,000. The first and the second runner-up will receive a prize of $2,500 and $1,500, respectively, in addition to travel grants and complimentary conference registrations.

  • Up to three student members from each finalist team will be provided travel support to attend the conference in-person. In-person attendance of the physical conference is required for reimbursement.
  • Complimentary conference registration for the three finalist team members from each team who present at ICIP. These complimentary conference registrations cannot be used to cover any papers accepted by the conference. If you are one of the three finalist team members from each team and wish to receive complimentary registration and/or conference banquet access, you must email Jaqueline Rash, with this information once your team has been selected as a finalist.
  • The three finalist team members from each team will also be invited to join the Conference Banquet and the SPS Student Job Fair, so that they can meet and talk to SPS leaders and global experts. Please note registration to the Conference Banquet and Student Job Fair is limited and based on availability.

Timeline

  • 1 July 2023: Registration opens (remains open until the last day of submission); Training, Validation & Testing (hidden ground truth) data and starter code become available
  • 20 August 2023: Phase 1 submission ends (top 10 teams announced)
  • 21 August 2023: Phase 2 starts (submission form available)
  • 1 September 2023: Phase 2 submission ends (both CSV and report)
  • 3 September 2023: Three finalist teams announced
  • 8 October 2023: VIP Cup at ICIP 2023

Organizing Committee

School of Electrical and Computer Engineering at the Georgia Institute of Technology (Georgia Tech).

Finalist Teams

Team Name: Synapse (Grand Prize Winner)
University: Bangladesh University of Engineering and Technology (BUET)
Supervisor: M. Sohel Rahman
Tutor: Sheikh Saifur Rahman Jony
Students:
H.A.Z. Sameen Shahgir, Khondker Salman Sayeed, Tanjeem Azwad Zaman, Md. Asif Haider

Team Name: Neurons (First Runner-Up Prize)
University: Bangladesh University of Engineering and Technology (BUET)
Supervisor: Dr. Lutfa Akter
Tutor: Tianshan Liu
Students:
Md. Abtahi Majeed Chowdhury, Asif Quadir, Md. Touhidul Islam, Mahmudul Hasan

Team Name: IITH (Second Runner-Up Prize)
University: Indian Institute of Technology Hyderabad
Supervisor: Soumya Jana
Students:
Aaseesh Rallapalli, Utkarsh Doshi, Lokesh Venkata Siva Maruthi Badisa

 



VIP Cup 2022 at ICIP 2022

IEEE VIP Cup 2022 at IEEE ICIP 2022

IEEE SPS VIP Cup at ICIP 2022
IEEE SPS VIP Cup 2022: Synthetic Image Detection

The IEEE SPS Video and Image Processing Cup (VIP Cup) student competition, presented by the IEEE Signal Processing Society, gives students the opportunity to work together to solve real-life problems using video and image processing methods. After students submit their work, three final teams are selected to present their work and compete for the grand prize at ICIP 2022.

Interested in competing? The submission deadline is August 8, 2022. For full competition details, eligibility requirements, and team registration, visit the IEEE Signal Processing Society website.

Synthetic Image Detection

The topic is “Synthetic Image Detection,” in which the aim is to distinguish real from AI-generated content in images. Teams are requested to design a strategy for synthetic image detection by relying on image processing and machine learning techniques.

Challenge Organization

Competitors are asked to work in the challenging scenario where the method used to generate the synthetic data is not known. More specifically, the test data comprises:

  • both fully synthetic images and partially manipulated ones,
  • images produced by generative models that include not only GANs, but also more recent diffusion-based models.

Being able to discriminate synthetic images, whether fully or partially synthetic, from pristine ones represents a step forward in the advancement of forensic tools. The challenge will consist of two stages: an open competition that any eligible team can participate in, and an invitation-only final competition. Eligible teams must submit their entries by September 5, 2022. The three teams with the highest performance will be selected by September 10, 2022 and invited to join the final competition. The final competition will be judged at ICIP 2022, which will be held on October 16-19, 2022.

Open Competition - Part I

Part 1 of the open competition is designed to give teams a simplified version of the problem at hand so that they can become familiar with the task. Participants will be provided with a labeled training dataset of real and synthetic images; synthetic images can be fully or partially synthetic. Images will undergo JPEG compression at different quality levels and resizing prior to compression, and teams will be provided with Python scripts to apply these operations to the training dataset (a minimal sketch of this kind of preprocessing is shown below). Teams are requested to provide the executable code to the organizers in order to test the algorithms on the evaluation dataset (Test-set 1).
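The following is a minimal Python sketch, assuming Pillow is available, of the kind of degradation described above (resizing followed by JPEG compression at different quality levels); the file names, scale range, and quality values are illustrative assumptions, not the parameters of the official scripts.

```python
# Minimal sketch: resize then JPEG-compress an image, as described above.
# File names, scale range, and quality levels are illustrative assumptions.
import random
from PIL import Image

def degrade(in_path: str, out_path: str) -> None:
    img = Image.open(in_path).convert("RGB")
    # Resize prior to compression, as stated in the task description.
    scale = random.uniform(0.5, 1.0)
    new_size = (max(1, int(img.width * scale)), max(1, int(img.height * scale)))
    img = img.resize(new_size, Image.BICUBIC)
    # JPEG compression at a randomly chosen quality level.
    quality = random.choice([90, 70, 50, 30])
    img.save(out_path, format="JPEG", quality=quality)

# Example (assumes the input file exists):
# degrade("real_0001.png", "real_0001_degraded.jpg")
```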

Open Competition - Part 2

Part 2 of the competition is designed to address a more challenging task: synthetic image detection on unseen models, i.e. synthetic data generated using architectures not present in training. The task remains the same as for Part 1. Teams are requested to provide the executable code to the organizers in order to test the algorithms on the evaluation dataset (Test-set 2). 

Final Competition

The three highest scoring teams from the open competition will be selected and they can provide an additional submission.

More information on the training set and test sets will be available on 25 July on the web page and the Piazza class.

Challenge Evaluation Criteria

The finalist teams will be selected based on the results achieved during the open competition. Results will be judged for Part 1 and Part 2 by means of balanced accuracy for the detection task.

The final competition score is the weighted average of the balanced accuracies obtained in Part 1 and Part 2, computed as Score = 0.7 × Accuracy(Part 1) + 0.3 × Accuracy(Part 2).
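As a quick illustration of this scoring rule (not the official evaluation code), the following Python sketch computes the balanced accuracies with scikit-learn and combines them with the stated 0.7/0.3 weights; the label arrays are toy stand-ins.

```python
# Minimal sketch of the stated scoring rule; toy labels (1 = synthetic, 0 = pristine).
from sklearn.metrics import balanced_accuracy_score

y_true_p1, y_pred_p1 = [0, 0, 1, 1, 1, 0], [0, 1, 1, 1, 0, 0]   # Test-set 1
y_true_p2, y_pred_p2 = [1, 0, 1, 0, 1, 1], [1, 0, 0, 0, 1, 0]   # Test-set 2 (unseen models)

acc1 = balanced_accuracy_score(y_true_p1, y_pred_p1)
acc2 = balanced_accuracy_score(y_true_p2, y_pred_p2)
score = 0.7 * acc1 + 0.3 * acc2
print(f"Part 1: {acc1:.3f}  Part 2: {acc2:.3f}  Final score: {score:.3f}")
```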

Team Formation Eligibility

Each participating team should be composed of one faculty member or someone with a PhD degree employed by the university (the Supervisor), at most one graduate student (the Tutor), and at least three, but no more than ten, undergraduate students. At least three of the undergraduate team members must hold either regular or student memberships of the IEEE Signal Processing Society. Undergraduate students who are in the first two years of their college studies, as well as high school students who are capable of contributing, are welcome to participate in a team. A participant cannot be on more than one team.

Prize for Finalists

The three teams with the highest performance in the open competition will be selected as finalists and invited to participate in the final competition at ICIP 2022. The champion team will receive a grand prize of $5,000. The first and the second runner-up will receive a prize of $2,500 and $1,500, respectively, in addition to travel grants and complimentary conference registrations.

  • Up to three student members from each finalist team will be provided travel support to attend the conference in-person. In-person attendance of the physical conference is required for reimbursement. 
  • Complimentary conference registration for all team members.
  • The finalist teams will also be invited to join the Conference Banquet and the SPS Student Job Fair, so that they can meet and talk to SPS leaders and global experts. Please note registration to the Conference Banquet and Student Job Fair is limited and based on availability. *

* These complimentary conference registrations cannot be used to cover any papers accepted by the conference. If you wish to receive complimentary registration and/or conference banquet access, you must email Jaqueline Rash, Jaqueline.rash@ieee.org, with this information once your team has been selected as a finalist.

Timeline

  • 25 July, 2022: Release of data and submission information
  • 8 August, 2022: First submission deadline
  • 13 August, 2022: Ranking publication of the first submission on Test-set 1
  • 22 August, 2022: Second submission deadline
  • 27 August, 2022: Ranking publication of the first and second submission on both Test-sets
  • 1 September, 2022: Team registration deadline
  • 5 September, 2022: Third submission deadline
  • 10 September, 2022: Finalists announced 

Register

Official VIP Cup document

Additional Information

General information and resources are available on the web page and the Piazza class. To set up a free account, use the access code "vipcup2022" to join the "VIPCUP 2022: IEEE Video and Image Processing Cup" class as a student.

Organizers

The challenge is organized as a joint effort between the Image Processing Research Group (GRIP) of the University Federico II of Naples (Italy) and NVIDIA (USA). The GRIP team is represented by Prof. Luisa Verdoliva (Associate Professor), Dr. Davide Cozzolino (Assistant Professor), Fabrizio Guillaro (Ph.D. Student) and Riccardo Corvi (Research Fellow). NVIDIA is represented by Dr. Koki Nagano.

This competition is supported by the IEEE Signal Processing Society and SPS Information Forensics and Security Committee.

Finalist Teams

Grand Prize - Team Name: FAU Erlangen-Nürnberg
University: Friedrich-Alexander-Universität Erlangen-Nürnberg
Supervisor: Christian Riess
Tutor: Anatol Maier
Students: Vinzenz Dewor, Luca Beetz, ChangGeng Drewes, Tobias Gessler

First Runner-Up - Team Name: Megatron
University: Bangladesh University of Engineering and Technology
Supervisor: Dr. Shaikh Anowarul Fattah,
Students: Bishmoy Paul, Md Awsafur Rahman, Najibul Haque Sarker, Zaber Ibn Abdul Hakim

Second Runner-Up - Team Name: Sherlock
University: Bangladesh University of Engineering and Technology
Supervisor: Mohammad Ariful Haque
Students: Fazle Rabbi, Asif Quadir, Indrojit Sarkar, Shahriar Kabir Nahin, Sawradip Saha, Sanjay Acharjee

 



VIP Cup 2021 at ICIP 2021

IEEE VIP Cup 2021 at ICIP 2021

IEEE VIP Cup 2021
Privacy-Preserving In-Bed Human Pose Estimation

[Sponsored by the IEEE Signal Processing Society]

ICIP 2021 Website | Sunday, September 19, 2021 | VIP Cup 2021 Website

Supported by

  • Northeastern University
  • ICIP committee
  • IVMSP (Image, Video, and Multidimensional Signal Processing) Technical Committee

Introduction


Every person spends around one third of their life in bed; for an infant or a young toddler this fraction can be much higher, and for bed-bound patients it can approach 100% of their time. Automatic, non-contact human pose estimation has received a great deal of attention and success in the artificial intelligence (AI) community in recent years, thanks to the introduction of deep learning and its power in AI modeling. However, state-of-the-art vision-based algorithms in this field can hardly cope with the challenges associated with in-bed human behavior monitoring, which include significant illumination changes (e.g., full darkness at night), heavy occlusion (e.g., the subject covered by a sheet or a blanket), and privacy concerns that hinder the large-scale data collection necessary for training any AI model.

Theoretically, transferring pose estimation from labeled uncovered cases to unlabeled covered ones can be framed as a domain adaptation problem. Although multiple datasets exist for human pose estimation, they mainly contain RGB images of daily activities, which exhibit a huge domain shift with respect to our target problem. And although domain adaptation has been studied in machine learning for decades, mainstream algorithms focus mainly on classification rather than on regression tasks such as pose estimation. The 2021 VIP Cup challenge is thus a domain adaptation problem for regression, with a practical application in in-bed human pose estimation that has not been addressed before.

In this 2021 VIP Cup challenge, we seek computer-vision-based solutions for in-bed pose estimation under the covers, where no annotations are available for the covered cases during model training, while contestants have access to large amounts of labeled data for the no-cover cases. Successful completion of this task would enable in-bed behavior monitoring technologies to work on novel subjects and environments for which no prior training data are accessible. A minimal sketch of one possible adaptation approach is given below. For more information about the competition please visit: IEEE VIP Cup 2021
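As a rough, hedged sketch of what such a domain-adaptation-for-regression setup could look like (this is not the official baseline), the following Python/PyTorch snippet trains a tiny keypoint regressor on labeled uncovered images while aligning the feature statistics of unlabeled covered images with a simple MMD-style penalty; the network, image size, number of joints, and loss weight are all illustrative assumptions.

```python
# Hedged sketch: unsupervised domain adaptation for keypoint regression.
# Labeled "uncovered" source batch, unlabeled "covered" target batch.
# All shapes, names, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn

class PoseNet(nn.Module):
    """Tiny feature extractor + keypoint regressor (14 joints, (x, y) each)."""
    def __init__(self, num_joints: int = 14):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.regressor = nn.Linear(32, num_joints * 2)

    def forward(self, x):
        f = self.features(x)
        return f, self.regressor(f)

def mmd(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Simple linear-kernel MMD between two feature batches."""
    return (a.mean(0) - b.mean(0)).pow(2).sum()

model = PoseNet()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Dummy batches standing in for real loaders (assumption: 64x64 single-channel images).
src_img, src_kpts = torch.randn(8, 1, 64, 64), torch.randn(8, 28)  # labeled, uncovered
tgt_img = torch.randn(8, 1, 64, 64)                                # unlabeled, covered

src_feat, src_pred = model(src_img)
tgt_feat, _ = model(tgt_img)
loss = nn.functional.mse_loss(src_pred, src_kpts) + 0.1 * mmd(src_feat, tgt_feat)
opt.zero_grad()
loss.backward()
opt.step()
```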

Prize

  • The Champion: $5,000
  • The 1st Runner-up: $2,500
  • The 2nd Runner-up: $1,500

Eligibility Criteria

Each team must be composed of: (i) One faculty member (the Supervisor); (ii) At most one graduate student (the Tutor), and; (iii) At least 3 but no more than 10 undergraduates. At least three of the undergraduate team members must be either IEEE Signal Processing Society (SPS) members or SPS student members.

Important Dates

  • May 17, 2021 - Competition guidelines and Training + Validation Sets released
  • July 15, 2021 - Test Set 1 released and the submission guidelines for the evaluation
  • July 30, 2021 - Submission deadline for evaluation (test results, formal report, and code)
  • August 30, 2021 - Finalists (best three teams) announced
  • September 19, 2021 - Competition on Test Set 2 at ICIP 2021

Register

  • Registration Link - Coming Soon!

Organizing Committee

Augmented Cognition Lab at Northeastern University.

 


 

Finalist Teams

Grand Prize - Team Name: Samaritan
University: Bangladesh University of Engineering and Technology
Supervisor: Mohammad Ariful Haque
Students:
Sawradip Saha, Sanjay Acharjee, Aurick Das, Shahruk Hossain, Shahriar Kabir

First Runner-Up - Team Name: PolyUTS
University: The Hong Kong Polytechnic University & University of Technology Sydney
Supervisor: Kin-Man Lam
Tutor: Tianshan Liu
Students:
Zi Heng Chi, Shao Zhi Wang, Chun Tzu Chang,
Xin Yue Li, Akshay Holkar, Samantha Pronger, Md Islam

Second Runner-Up - Team Name: NFPUndercover
University: University of Moratuwa
Supervisor: Chamira Edussooriya
Tutor: Ashwin De Silva
Students:
Jathurshan Pradeepkumar, Udith Haputhanthri,
Mohamed Afham Mohamed Aflal, Mithunjha Anandakumar

 



VIP Cup 2020 at ICIP 2020

IEEE VIP Cup at ICIP 2020

IEEE VIP Cup 2020
Real-time vehicle detection and tracking at junctions using a fisheye camera

[Sponsored by the IEEE Signal Processing Society]

Organizers

Introduction

The increasing growth of urbanization introduces traffic jams and congestion in many locations around a city. Together with accidents, this can drastically increase the average travel time from point A to point B. Junctions are especially critical, since delays and accidents tend to be concentrated there. Under these circumstances, intelligent traffic systems capable of tasks such as vehicle detection, tracking, violation detection, and congestion control become indispensable.

The 2020 VIP Cup challenge focuses on fisheye cameras mounted on street lamps at junctions, and on vehicle detection and tracking for a junction management system that optimizes the flow of traffic and synchronizes with other junctions to manage bottlenecks throughout the city. Fisheye cameras are used because they are promising in terms of reliability and scene coverage at a given junction: they provide a 360-degree observation view, introducing key changes in traffic management.

Although fisheye cameras play a key role in junction management systems, they come with their own challenges: high distortion ratios, different scales of the same target object as it moves through different parts of the image, day/night variation (night views suffer from low quality due to surrounding lighting conditions), and exposure artifacts introduced by vehicle lights at night. A dataset of traffic videos from several junctions, captured at different times of day and night, is provided with annotations for training and validation (icip2020.issd.com.tr). The evaluation will be performed on separate test datasets.

 

Figure 1: Sample view.

 

Schedule

  • 30 June 2020: Initial Training Dataset released
  • 30 July 2020: Test Dataset 1 released
  • 10 September 2020: Submission deadline
  • 15 October 2020: Finalists (best three teams) announced
  • 25 October 2020: Competition on Test Dataset 2 virtually at ICIP 2020

Registration

Eligibility Criteria

Each team must be composed of:

  • One faculty member (the Supervisor);
  • At most one graduate student (the Tutor);
  • At least three but no more than ten undergraduate students (the Team Members)
  • At least three of the undergraduate team members must be either IEEE Signal Processing Society (SPS) members or SPS student members.
  • The VIP-Cup is a competition for undergraduate students and therefore Master’s students, regardless of the duration of their Bachelor’s degree, cannot participate as regular Team Members.
  • Participants are expected to have basic knowledge of machine learning/deep learning concepts.

Tasks to Execute and Expected Outcomes

  • Detection of vehicles with high average accuracy and low false positives
  • (Extra-1) Develop new ideas to track vehicle flow from entering the junction until exiting it

Datasets (Training, Validation, Testing datasets)

The dataset is composed of more than 25k (twenty-five thousand) images for training and validation, and 2k (two thousand) images for testing. Images vary from day to night and were collected at different junctions under different environmental and installation conditions.

The dataset is labeled in the standard COCO format; you may parse it however you like (a minimal parsing sketch is shown below).
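The following minimal Python sketch shows one way to parse COCO-style annotations with the standard library only; the tiny in-memory example stands in for the real annotation file, whose name is not specified here.

```python
# Minimal sketch: parsing COCO-style annotations (standard library only).
# The in-memory JSON below is a toy stand-in for the real annotation file.
import json
from collections import defaultdict

coco_json = """{
  "images": [{"id": 1, "file_name": "junction_day_0001.jpg"}],
  "categories": [{"id": 1, "name": "vehicle"}],
  "annotations": [
    {"id": 10, "image_id": 1, "category_id": 1, "bbox": [120.0, 80.0, 60.0, 40.0]}
  ]
}"""
coco = json.loads(coco_json)  # for a real file: coco = json.load(open("annotations.json"))

images = {img["id"]: img["file_name"] for img in coco["images"]}
categories = {cat["id"]: cat["name"] for cat in coco["categories"]}

# Group (category, bbox) pairs by image; COCO bboxes are [x, y, width, height].
boxes = defaultdict(list)
for ann in coco["annotations"]:
    boxes[ann["image_id"]].append((categories[ann["category_id"]], ann["bbox"]))

for img_id, file_name in images.items():
    print(file_name, boxes[img_id])
```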

Evaluation Criteria (Scores for Tasks, Outcomes and Overall)

  • Detection speed (20 points): The resulting algorithms will be benchmarked on a selected device for best performance.
  • Vehicle detection accuracy (80 points):
    A vehicle is counted as detected when it is bounded by a bounding box of exactly the required size. The final evaluation is done by ISSD on a separate dataset by averaging the following penalties (a minimal sketch of this penalty scheme is given after this list):
    • A false positive detection is penalized by -1 point per image
    • A failure to detect is penalized by -2 points per image
    The score is computed for each image and then averaged over the chosen set.
  • Extra-1 (20 points): Estimation of the correct path of a vehicle from entering the junction until leaving it
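As an illustration of the stated penalty scheme (not the official ISSD evaluation), the following Python sketch matches predicted boxes to ground truth with a simple IoU test and applies the -1/-2 penalties per image; the IoU threshold of 0.5 is an assumption.

```python
# Hedged sketch of the stated per-image penalties (-1 per false positive,
# -2 per missed vehicle), using a simple greedy IoU matching.
# The 0.5 IoU threshold is an assumption; ISSD runs the official evaluation.
def iou(a, b):
    # Boxes are (x, y, w, h), as in COCO.
    ax1, ay1, ax2, ay2 = a[0], a[1], a[0] + a[2], a[1] + a[3]
    bx1, by1, bx2, by2 = b[0], b[1], b[0] + b[2], b[1] + b[3]
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def image_penalty(pred_boxes, gt_boxes, thr=0.5):
    matched_gt = set()
    false_pos = 0
    for p in pred_boxes:
        hits = [i for i, g in enumerate(gt_boxes)
                if i not in matched_gt and iou(p, g) >= thr]
        if hits:
            matched_gt.add(hits[0])
        else:
            false_pos += 1
    missed = len(gt_boxes) - len(matched_gt)
    return -1 * false_pos - 2 * missed

# Example: one false positive and one missed vehicle -> penalty of -3 for this image.
print(image_penalty([(0, 0, 10, 10), (50, 50, 5, 5)],
                    [(1, 1, 10, 10), (80, 80, 8, 8)]))
```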

Submission guidelines

  • Teams are required to submit their model inference evaluation via an intermediate representation, which will be provided (30 July)
  • Evaluation scripts (mAP, Average Recall, Average inference-time) will be provided (30 July) so that teams can assess their work iteratively
  • The best 3 teams on the leaderboard will be asked to submit their source code so we can reproduce the model (all copyrights are preserved)
  • Extra-1 will have a different qualification than the detection model itself (released on July 30)
  • After model reproduction on our target machines, the winner will be announced
  • Use of publicly available online models will NOT be accepted and the team will be disqualified

Registration Guidelines

  • Get familiar with the problem
  • Get familiar with submission guidelines
  • Register your team through the webpage icip2020.issd.com.tr

Finalist Teams

Grand Prize - Team Name: BUET Synapticans
University: Bangladesh University of Engineering and Technology
Supervisor: Taufiq Hasan
Students:
Uday Kamal, Partho Ghosh, Nayeeb Rashid,
Ahsan Habib Akash, Md. Abrar Istiak, Swapnil Saha, Mir Sayeed Mohammad

First Runner-Up - Team Name: Multi-layer Perceptron
University: Bangladesh University of Engineering and Technology
Supervisor: Shaikh Anowarul Fattah
Tutor: Tanvir Mahmud
Students:
Md Awsafur Rahman, Bishmoy Paul, Tasnim Nishat Islam, Md. Jahin Alam,
Muhammad Zubair Hasan, Maisoon Rahman, Md Shariar Azad,
Najibul Haque Sarker, Tanvir Anjum, Barproda Halder

Second Runner-Up - Team Name: Zodiac
University: Bangladesh University of Engineering and Technology
Supervisor: Mohammad Ariful Haque
Tutor: Tanvir Mahmud
Students:
Himaddri Roy, Shafin Bin Hamid, Munshi Sanowar Raihan, Prasun Datta,
Ashiqur Rasul, Md. Mushfiqur Rahman, K M Naimul Hassan


 



VIP Cup 2017

IEEE VIP Cup 2017

IEEE VIP Cup 2017: Traffic Sign Detection Under Challenging Conditions

The IEEE Signal Processing Society announces the first edition of the Signal Processing Society Video and Image Processing (VIP) Cup: traffic sign detection under challenging conditions. Visit the 2017 VIP Cup Website.

 


 

Robust and reliable traffic sign detection is necessary to bring autonomous vehicles onto our roads. State-of-the-art traffic sign detection algorithms in the literature successfully perform this task on existing databases, which mostly lack realistic road conditions. This competition focuses on detecting traffic signs under such challenging conditions.

To facilitate this task and the competition, we introduce a novel video dataset that contains a variety of road conditions. In these video sequences, we vary the type and the level of the challenging conditions, including a range of lighting conditions, blur, haze, rain, and snow levels. The goal of this challenge is to implement traffic sign detection algorithms that can perform robustly under such challenging environmental conditions.

Any eligible team can participate in the competition. Detailed guidelines and the dataset are planned to be released on March 15, 2017, and participating teams should complete their submission by July 1, 2017. The three best teams will be selected and announced by August 1, 2017. The three finalist teams will be judged at ICIP 2017, which will be held September 17-20, 2017. In addition to algorithmic performance, demonstration and presentation performance will also affect the final ranking.

The champion team will receive a grand prize of $5,000. The first and the second runner-up will receive a prize of $2,500 and $1,500, respectively, in addition to travel grants and complimentary conference registrations. Each finalist team invited to ICIP 2017 will receive travel grant supported by the SPS on a reimbursement basis. A team member is offered up to $1,200 for continental travel, or $1,700 for intercontinental travel. A maximum of three members per team will be eligible for travel support.

ORGANIZING COMMITTEE

 



VIP Cup 2018

IEEE VIP Cup 2018

IEEE VIP Cup 2018: Lung Cancer Radiomics-Tumor Region Segmentation

ORGANIZING COMMITTEE

PROPOSED CHALLENGE (Download full document)

The volume, variety, and velocity of medical imaging data are exploding, making it impractical for clinicians to utilize the available information resources efficiently. At the same time, human interpretation of such large amounts of medical imaging data is significantly error-prone, reducing the possibility of extracting informative data. The ability to process such large amounts of data promises to decipher the un-decoded information within medical images; develop predictive and prognostic models for personalized diagnosis; allow comprehensive study of tumor phenotype; and assess tissue heterogeneity for the diagnosis of different types of cancer. Recently, there has been a great surge of interest in Radiomics, which refers to the process of extracting and analyzing several semi-quantitative features (e.g., attenuation, shape, size, and location) and quantitative features (e.g., wavelet decomposition, histogram, and gray-level intensity) from medical images, with the ultimate goal of obtaining predictive or prognostic models. A Radiomics workflow typically consists of the following four main processing tasks:

(i) Image acquisition/modality;
(ii) Image segmentation;
(iii) Feature extraction and qualification, and;
(iv) Statistical analysis and model building.

Radiomics features can be extracted from different imaging modalities, including Magnetic Resonance Imaging (MRI), Positron Emission Tomography (PET), and Computed Tomography (CT), and therefore have the capability of providing complementary information for clinical decision making in oncology (a minimal sketch of extracting simple first-order features is given below).
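As an illustration of the quantitative (histogram / gray-level intensity) features mentioned above, the following minimal Python sketch computes a few first-order statistics inside a binary tumor mask; real pipelines typically use dedicated radiomics toolkits, and the array shapes and intensity values below are toy assumptions.

```python
# Minimal sketch: first-order (intensity / histogram) features inside a tumor mask.
# Toy volume and mask only; real pipelines use dedicated radiomics toolkits.
import numpy as np

def first_order_features(volume: np.ndarray, mask: np.ndarray) -> dict:
    """Simple intensity statistics and histogram entropy inside a binary mask."""
    roi = volume[mask > 0].astype(np.float64)
    counts, _ = np.histogram(roi, bins=32)
    p = counts / counts.sum()
    p = p[p > 0]
    return {
        "mean": float(roi.mean()),
        "std": float(roi.std()),
        "min": float(roi.min()),
        "max": float(roi.max()),
        "volume_voxels": int(mask.sum()),
        "entropy": float(-(p * np.log2(p)).sum()),
    }

# Toy example: a lung-like CT background with a small soft-tissue-like lesion.
rng = np.random.default_rng(0)
ct = rng.normal(-700, 50, size=(32, 32, 32))
ct[12:20, 12:20, 12:20] = rng.normal(40, 10, size=(8, 8, 8))
mask = np.zeros(ct.shape, dtype=np.uint8)
mask[12:20, 12:20, 12:20] = 1
print(first_order_features(ct, mask))
```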

Recent developments and advances in Signal Processing and Machine Learning have paved the way for the emergence of cancer Radiomics. However, the effectiveness and accuracy of Signal Processing and Machine Learning solutions in this field heavily rely on the availability of the segmented tumor region, i.e., prior knowledge of where the tumor is located. Consequently, among the aforementioned four tasks, segmentation is considered the initial and most critical task for further advancing cancer Radiomics. The conventional clinical approach to segmentation is manual annotation of the tumor region; however, this is extremely time consuming, depends on the personal expertise and opinion of the clinician, and is highly sensitive to inter-observer variability. To address these issues, automatic (or semi-automatic) segmentation methods, e.g., driven by image-level tags or bounding boxes, are currently being investigated to minimize manual input, increase consistency in labeling the tumor region, and obtain results that are accurate and acceptable in comparison with manually labeled data.

In the 2018 VIP-CUP, we propose a challenge for segmentation of the Lung Cancer Tumor region based on a dataset consisting of pre-treatment Computed Tomography (CT) scans of several (more than 400) patients. For the initial stage of the competition, a subset of the data along with the annotations will be provided as the training set, together with a smaller subset for validation purposes. The evaluation will then be performed on a test set provided closer to the submission deadline. For segmenting tumors, the competing teams may choose to utilize conventional image processing techniques or deep learning methods, keeping in mind the limited size of the available dataset. More information...

PARTICIPATION GUIDELINES

Teams satisfying the eligibility criteria outlined below are invited to participate in the VIP-CUP. View the detailed competition instructions together with the data sources.

Eligibility Criteria: Each team must be composed of: (i) One faculty member (the Supervisor); (ii) At most one graduate student (the Tutor), and; (iii) At least three but no more than ten undergraduates. At least three of the undergraduate team members must be either IEEE Signal Processing Society (SPS) members or SPS student members. Postdocs and research associates are not considered as faculty members. A graduate student is a student having earned at least a 4-year University degree at the time of submission. An undergraduate student is a student without a 4-year degree. Questions about the 2018 VIP-CUP should be directed to Dr. Arash Mohammadi.

IMPORTANT DATES

  • May 24, 2018 - Dataset available
  • July 31, 2018 - Team registration on IEEE VIP Cup
  • August 14, 2018 - Test data released
  • August 26, 2018 - Results Submission
  • September 8, 2018 - Finalist Teams Announced
  • October 7, 2018 - VIP Cup at ICIP in Athens

 

Download Call for Participation

 



VIP Cup 2019 at ICIP 2019

IEEE VIP Cup at ICIP 2019

IEEE VIP Cup 2019: Activity Recognition from Body Cameras

IEEE ICIP 2019 | September 22-25, 2019 | VIP Cup 2019 Website | Details Document

 

SUPPORTED BY:

IEEE Signal Processing Society (SPS)
Computational Health Informatics Group, Oxford University
IBM Research Africa
Centre for Intelligent Sensing, Queen Mary University of London

INTRODUCTION

The increasing availability of wearable cameras enables the collection of first-person videos (FPV) for the recognition of activities at home, in the workplace and during sport activities. FPV activity recognition has important applications, which include assisted living, activity tracking and life-logging. The main challenges of FPV activity recognition are the presence of outlier motions (for example due to other people captured by the camera), motion blur, illumination changes and self-occlusions.

The 2019 VIP-Cup challenge focuses on FPV from a chest-mounted camera and on the privacy-aware recognition of activities, which include generic activities, such as walking; person-to-person interactions, such as chatting and handshaking; and person-to-object interactions, such as using a computer or a whiteboard. As videos captured by body cameras may leak private or sensitive information about individuals, the evaluation of the IEEE VIP-Cup challenge entries will consider privacy-enhancing solutions jointly with recognition performance.

A dataset of activities from several subjects is provided with the annotation for training and validation. The evaluation will be performed based on separate test datasets.

PRIZES

  • The Champion: $5,000
  • The 1st Runner-up: $2,500
  • The 2nd Runner-up: $1,500

TRAVEL SUPPORT

Each finalist team invited to the ICIP 2019 will receive travel support by the IEEE SPS on a reimbursement basis. A team member is offered up to $1,200 for continental travel, or $1,700 for intercontinental travel. A maximum of 3 members per team will be eligible for travel support.

ELIGIBILITY CRITERIA

Each team must be composed of: (i) One faculty member (the Supervisor); (ii) At most one graduate student (the Tutor), and; (iii) At least 3 but no more than 10 undergraduates. At least three of the undergraduate team members must be either IEEE Signal Processing Society (SPS) members or SPS student members. Download full details document.

IMPORTANT DATES

  • April 30, 2019 - Participation Guidelines and Initial Training Dataset released
  • May 5, 2019 - Additional Training Dataset released
  • June 30, 2019 - Submission Deadline
  • July 15, 2019 - Finalists (best three teams) announced
  • July 30, 2019 - Test Dataset 1 released
  • September 22, 2019 - Competition on Test Dataset 2 at ICIP 2019

REGISTER: VIP Cup 2019 Registration Page

ORGANIZING COMMITTEE

  • Girmaw Abebe Tadesse, University of Oxford
  • Oliver Bent, University of Oxford
  • Kommy Woldemariam, IBM Research
  • Andrea Cavallaro, Queen Mary University of London

FINALIST TEAMS

Grand Prize - Team Name: PolyUTS
University: University of Technology Sydney and The Hong Kong Polytechnic University
Supervisor: Sean He | Tutor: Rui Zhao
Students: Hayden Crain, Alex Young, Van Khai Do, Nirosh Rambukkana, Tianqi Wen,
Jichen Zhang, Zihang LYU, Yifei Fan, Chris Lee, Evan Cheng

 

First Runner-Up - Team Name: BUET Ravenclaw
University: Bangladesh University of Engineering and Technology
Supervisor: Mohammad Ariful Haque
Students: Sheikh Asif Imran Shouborno, Md. Tariqul Islam,
K. M. Naimul Hassan, Md. Mushfiqur Rahman

 

Second Runner-Up - Team Name: BUET Synapticans
University: Bangladesh University of Engineering and Technology
Supervisor: Taufiq Hasan | Tutor: Asif Shahriyar Sushmit
Students: Ankan Ghosh Dastider, Nayeeb Rashid, Ridwan Abrar,
Ahsan Habib Akash, Md. Abrar Istiak Akib, Partho Ghosh

Questions should be directed to Dr. Girmaw Abebe Tadesse.

 

Download Call for Participation

 



5-MICC at ICASSP 2020

5-MICC Contest at ICASSP 2020

5-MICC at ICASSP 2020: Let There Be a Beam!

Organizing Committee

- Dr Wei Liu (Chair), University of Sheffield, UK (w.liu@sheffield.ac.uk)

- Dr Mohammad Reza Anbiyaei, Alzahra University, Iran (m.r.anbiyaei@alzahra.ac.ir)

- Dr Xue Jiang, Shanghai JiaoTong University, China (xuejiang@sjtu.edu.cn)

- Dr Lei Zhang, Glasgow University, UK (Lei.Zhang@glasgow.ac.uk)

Call for Video

The Signal Processing Society is pleased to announce the 5-Minute Video Clip Contest (5-MICC) at ICASSP 2020 in Barcelona (May 4-8).

The topic chosen this year is beamforming, which uses a spatially distributed array of sensors (antennas, microphones, hydrophones, etc.) to form a beam in space for enhanced signal reception or transmission. Beamforming has a wide range of applications, such as radar, sonar, teleconferencing, radio astronomy, seismology, medical diagnosis and treatment, human-computer interfaces, and wireless communications. The submitted video can cover any aspect of beamforming-related areas: for example, a general introduction to beamforming and how it works, one or more specific beamforming techniques, recent developments in beamforming and future directions, one or more specific applications of beamforming, demonstrations of beamforming devices and systems, and so on (a minimal sketch of the basic idea is given below). Nonetheless, the contest will also accept “open topic” video submissions, even if they are not related to the topic of beamforming. The rationale for this is to engage the broad Signal Processing community to come up with creative ideas. Open topic submissions will compete together with submissions on the chosen yearly topic of beamforming, evaluated by a judging panel with additional members reflecting the submitted open-topic areas.
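As an illustration of the basic idea (not part of the contest materials), the following minimal Python sketch implements a narrowband delay-and-sum (phase-shift) beamformer for a uniform linear array and scans candidate directions for the one with maximum output power; the array geometry, frequency, and signal model are illustrative assumptions.

```python
# Minimal sketch: narrowband delay-and-sum (phase-shift) beamforming on a
# uniform linear array. Geometry, frequency, and noise level are assumptions.
import numpy as np

c = 343.0            # propagation speed (m/s), e.g. sound for a microphone array
f = 1000.0           # narrowband frequency (Hz)
M, d = 8, 0.1        # 8 sensors with 0.1 m spacing
theta_src = np.deg2rad(30.0)   # true source direction

def steering(theta):
    """Steering vector for a plane wave arriving from angle theta."""
    m = np.arange(M)
    return np.exp(-2j * np.pi * f * m * d * np.sin(theta) / c)

# Simulated array snapshots: source signal plus sensor noise.
rng = np.random.default_rng(0)
s = rng.standard_normal(200) + 1j * rng.standard_normal(200)
x = np.outer(steering(theta_src), s) + 0.1 * (rng.standard_normal((M, 200))
                                              + 1j * rng.standard_normal((M, 200)))

# Scan candidate directions; the output power peaks near the source direction.
angles = np.deg2rad(np.linspace(-90, 90, 181))
power = [np.mean(np.abs(steering(a).conj() @ x / M) ** 2) for a in angles]
print("Estimated direction:", np.rad2deg(angles[int(np.argmax(power))]), "degrees")
```

The output power peaks near the true source direction (30 degrees in this toy setup), which is precisely the spatial selectivity that receive or transmit beamforming exploits.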

The contest is open for submissions from IEEE SPS members, including undergraduate and graduate students of all majors, as well as researchers from all over the world. Each team must be composed of: (i) One faculty member (the Supervisor); (ii) At most one graduate student (the Tutor), and; (iii) At least three but no more than five undergraduates. At least three undergraduate team members must be either IEEE SPS student members or SPS members by the time they submit the full 5-minute video before the deadline of 28 February 2020 (see below).

There will be three stages for the call:

1.) Submission of 30-Second Trailers:

Submission deadline: January 20, 2020.

Announcement of the best 10 teams: January 31, 2020.

Each submission should include a report, in the form of an IEEE conference paper, up to 2 pages, on the main idea / concept of the full video that will be submitted with the related written script. The selected best 10 teams will be identified and invited to send the final 5-minute video to participate in the final competition by February 28, 2020.

2.) Submission of the Full 5-Minute Video:

Submission deadline: March 1, 2020.

Announcement of the 3 best videos: March 16, 2020.

Three finalist teams will be selected by the judging panel, and three student members per team will be invited to the ICASSP conference with travel grants in addition to complimentary conference registration*. The three finalist videos will be ranked by the organizing TC.

3.) Final Contest at ICASSP 2020 in Barcelona:

Conference dates: May 4-9, 2020

The three finalist teams’ 5-minute videos will be available on the ICASSP website at least one month before the conference, and conference participants will be able to vote on them until two days into the conference. The final ranking will be decided by the judging panel, also taking the popular vote into account. The winners and the final team ranking will be announced during the ICASSP conference. The finalist teams will also be invited to join the Conference Banquet, as well as the Student Career Luncheon, so that they can meet and talk to SPS leaders and global experts.

Registration is now OPEN! Team Registration & Video Upload

Prizes:

Grand Prize:  US$5,000

First Runner-Up Prize:  US$2,500

Second Runner-Up Prize:  US$1,500


Additional details are available on Piazza (access code: 5micc).

* These complimentary conference registrations cannot be used to cover any papers accepted by the conference.

