The importance of normalizing biometric features or matching scores is well understood in the multimodal case, but the unimodal case has received less attention. Prior reports assess the effectiveness of normalization directly on biometric performance. We propose that this process logically consists of two independent steps: (1) equalizing the influence of each biometric feature on the similarity scores computed from all the features together, and (2) weighting the normalized features to optimize biometric performance. In this report we address step 1 only, and we focus exclusively on normally distributed features. We show how differences in the variance of features lead to differences in how strongly each feature influences the similarity scores produced from the full feature set. Since these variance differences have nothing to do with importance in the biometric sense, there is no reason to let high-variance features carry greater weight in the assessment of biometric performance. We employed two types of features: (1) real eye-movement features and (2) synthetic features. We compare six variance normalization methods (histogram equalization, L1 normalization, median normalization, z-score normalization, min-max normalization, and L-infinity normalization) and one distance metric (Mahalanobis distance) in terms of how well they reduce the impact of the variance differences. On real data, the effectiveness of each technique depended on the strength of the intercorrelation among the features. For weakly correlated real features and for synthetic features, histogram equalization was the best method, followed by L1 normalization.
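The methods named above are all standard per-feature transforms, so a minimal NumPy sketch may help fix ideas. This is an illustration, not the paper's implementation: in particular, the rank-to-Gaussian mapping used here for histogram equalization is our assumption (the abstract does not specify a target distribution; a uniform target is another common choice), and the toy data at the end are hypothetical.

```python
import numpy as np
from scipy.stats import norm, rankdata

def zscore_norm(x):
    # Center to zero mean, scale to unit variance.
    return (x - x.mean()) / x.std(ddof=1)

def minmax_norm(x):
    # Rescale to the [0, 1] interval.
    return (x - x.min()) / (x.max() - x.min())

def l1_norm(x):
    # Divide by the L1 norm (sum of absolute values).
    return x / np.abs(x).sum()

def median_norm(x):
    # Divide by the median value.
    return x / np.median(x)

def linf_norm(x):
    # Divide by the L-infinity norm (maximum absolute value).
    return x / np.abs(x).max()

def hist_equalize(x):
    # Rank-based equalization: map empirical ranks through the inverse
    # standard-normal CDF so every feature ends up with (approximately)
    # the same distribution. Target distribution is an assumption here.
    ranks = rankdata(x) / (len(x) + 1)
    return norm.ppf(ranks)

def mahalanobis(u, v, cov_inv):
    # Mahalanobis distance whitens features with the inverse covariance,
    # removing variance and correlation differences in a single step.
    d = u - v
    return float(np.sqrt(d @ cov_inv @ d))

# Toy demonstration: synthetic normally distributed features whose raw
# standard deviations differ by two orders of magnitude.
rng = np.random.default_rng(0)
features = rng.normal(0.0, [1.0, 10.0, 100.0], size=(500, 3))
equalized = np.apply_along_axis(zscore_norm, 0, features)
print(equalized.std(axis=0, ddof=1))  # all columns now ~1.0
```

Note the design distinction this makes visible: the six normalization methods operate on each feature independently, whereas the Mahalanobis distance also accounts for correlations between features, which is consistent with the finding that the relative effectiveness of the techniques depends on how strongly the features are intercorrelated.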