Assessment of the Effectiveness of Seven Biometric Feature Normalization Techniques


By: Lee Friedman; Oleg V. Komogortsev

The importance of normalizing biometric features or matching scores is well understood in the multimodal biometric case, but the unimodal case has received less attention. Prior reports assess the effectiveness of normalization directly in terms of biometric performance. We propose that this process logically comprises two independent steps: (1) methods to equalize the effect of each biometric feature on the similarity scores calculated from all the features together, and (2) methods of weighting the normalized features to optimize biometric performance. In this report, we address step 1 only and focus exclusively on normally distributed features. We show how differences in the variance of features lead to differences in the strength of each feature's influence on the similarity scores produced from all the features. Because these variance differences have nothing to do with importance in the biometric sense, there is no justification for allowing them to carry greater weight in the assessment of biometric performance. We employed two types of features: (1) real eye-movement features and (2) synthetic features. We compare six variance normalization methods (histogram equalization, L1 normalization, median normalization, z-score normalization, min–max normalization, and L-infinity normalization) and one distance metric (Mahalanobis distance) in terms of how well they reduce the impact of the variance differences. The effectiveness of the different techniques on real data depended on the strength of the inter-correlation of the features. For weakly correlated real features and for synthetic features, histogram equalization was the best method, followed by L1 normalization.
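To make the seven techniques named in the abstract concrete, the sketch below applies them column-wise to a feature matrix of shape (n_samples, n_features). This is a minimal illustration assuming standard textbook definitions of each method; the function names, the rank-based form of histogram equalization, and the per-feature scaling choices are assumptions for illustration, not the authors' implementation.

```python
# Illustrative, hedged sketch of the normalization techniques named in the
# abstract, applied per feature (column) of X with shape (n_samples, n_features).
# Definitions are standard textbook forms, not necessarily those used in the paper.
import numpy as np
from scipy.stats import rankdata, norm

def zscore_norm(X):
    # Subtract each feature's mean and divide by its standard deviation.
    return (X - X.mean(axis=0)) / X.std(axis=0)

def minmax_norm(X):
    # Rescale each feature to the [0, 1] range.
    return (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))

def median_norm(X):
    # Center each feature on its median and scale by the median absolute deviation.
    med = np.median(X, axis=0)
    mad = np.median(np.abs(X - med), axis=0)
    return (X - med) / mad

def l1_norm(X):
    # Divide each feature by the sum of its absolute values (unit L1 norm per column).
    return X / np.abs(X).sum(axis=0)

def linf_norm(X):
    # Divide each feature by its maximum absolute value (unit L-infinity norm per column).
    return X / np.abs(X).max(axis=0)

def histogram_equalization(X):
    # Map each feature to an approximately standard normal distribution via its
    # empirical ranks (one common way to equalize feature distributions).
    ranks = rankdata(X, axis=0)
    return norm.ppf(ranks / (X.shape[0] + 1))

def mahalanobis_distances(X, probe):
    # Mahalanobis distance from each row of X to a probe vector; the inverse
    # covariance accounts for feature variances and correlations jointly.
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    diff = X - probe
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))
```

The first six functions rescale each feature separately so that no single feature dominates a similarity score merely because of its larger variance, while the Mahalanobis distance folds the variance (and correlation) correction into the distance computation itself rather than into the features.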
