Bayes’ rule, one of the fundamental concepts of statistical signal processing, provides a way to update our belief about an event when new evidence arrives. Uncertainty is traditionally modeled by a probability distribution: prior belief is expressed by a prior probability distribution, while the update involves the likelihood function, a probabilistic expression of how likely the evidence is to be observed. Many statisticians have argued, however, that probability theory needs to be broadened, because scarce training data may make it impossible to assign a probability to every event.
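In its standard, precise form, with H denoting the event of interest and E the observed evidence (generic symbols, not notation taken from the column itself), Bayes’ rule reads

P(H | E) = P(E | H) P(H) / P(E),

where P(H) is the prior, P(E | H) is the likelihood of the evidence, and P(H | E) is the updated, or posterior, belief.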
Following the theoretical foundations of imprecise probability laid out by Walley [1], this “Lecture Notes” column presents a formulation, and a practical computation, of Bayes’ rule in situations where the probabilistic models (i.e., the prior distribution and the likelihood function) are imprecise.
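As a concrete illustration of the kind of computation involved (a minimal sketch only, not the formulation developed in the column): assume a binary event H, precise likelihoods, and a prior P(H) known only to lie in an interval [p_lo, p_hi]. Applying the ordinary Bayes’ rule to every prior in the interval yields lower and upper posterior probabilities; because the posterior is monotone in the prior, the bounds are attained at the interval endpoints. All names and numbers below are illustrative.

# Minimal sketch: posterior bounds for a binary event H under an
# interval-valued (imprecise) prior P(H) in [p_lo, p_hi].
# The likelihoods P(E | H) and P(E | not H) are assumed precise.

def posterior(p, lik_H, lik_notH):
    """Ordinary Bayes' rule for a single precise prior p = P(H)."""
    return p * lik_H / (p * lik_H + (1.0 - p) * lik_notH)

def posterior_bounds(p_lo, p_hi, lik_H, lik_notH):
    """Imprecise update: optimize Bayes' rule over all priors in [p_lo, p_hi].

    The posterior is monotone in the prior, so the extremes are attained
    at the endpoints of the prior interval.
    """
    lower = posterior(p_lo, lik_H, lik_notH)
    upper = posterior(p_hi, lik_H, lik_notH)
    return min(lower, upper), max(lower, upper)

if __name__ == "__main__":
    # Illustrative values: the prior for H is only known to lie between
    # 0.2 and 0.5, and the evidence is three times as likely under H.
    lo, hi = posterior_bounds(p_lo=0.2, p_hi=0.5, lik_H=0.6, lik_notH=0.2)
    print(f"P(H | E) lies in [{lo:.3f}, {hi:.3f}]")

The width of the resulting posterior interval shows how much of the imprecision in the prior survives after the evidence is taken into account.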