Optimal Bayesian Classification With Vector Autoregressive Data Dependency


By: Amin Zollanvari; Edward R. Dougherty

In classification theory, it is generally assumed that the data are independent and identically distributed. In many practical applications, however, observations are collected sequentially and exhibit a dependence structure among samples. The primary focus of this investigation is to construct the optimal Bayesian classifier (OBC) when the training observations are serially dependent. To model the effect of dependency, we assume the training observations are generated from a VAR(p) process, a multidimensional vector autoregressive process of order p. At the same time, we assume there is uncertainty about the parameters governing the VAR(p) model. To model this uncertainty, we treat the model parameters (coefficient matrices) as random variables with a prior distribution and derive the resulting OBC under the assumption of known covariance matrices of the white-noise processes. We employ simulations on both synthetic and real data to demonstrate the efficacy of the constructed OBC.
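
To make the setup concrete, here is a minimal sketch (in Python with NumPy, not code from the paper) of the generative model the abstract describes: serially dependent training observations drawn from a VAR(p) process whose coefficient matrices are themselves drawn from a prior, with the white-noise covariance assumed known. The dimensions, prior scale, and class means below are illustrative assumptions, not values from the paper.

import numpy as np

rng = np.random.default_rng(0)
d, p, n = 2, 2, 200               # dimension, VAR order, samples per class (assumed)
noise_cov = 0.1 * np.eye(d)       # white-noise covariance, assumed known as in the abstract
chol = np.linalg.cholesky(noise_cov)

def draw_coefficients():
    # Prior over the VAR(p) coefficient matrices: i.i.d. Gaussian entries,
    # scaled down so the simulated process stays stable in practice.
    return [0.3 * rng.standard_normal((d, d)) / np.sqrt(d) for _ in range(p)]

def simulate_var(coeffs, mean, n):
    # x_t = mean + sum_k A_k (x_{t-k} - mean) + w_t,   w_t ~ N(0, noise_cov)
    x = np.tile(mean, (n, 1))
    for t in range(p, n):
        x[t] = mean + sum(A @ (x[t - 1 - k] - mean) for k, A in enumerate(coeffs)) \
               + chol @ rng.standard_normal(d)
    return x

# Serially dependent training data for two classes that differ in mean; the
# coefficient matrices are drawn from the prior, mirroring the parameter
# uncertainty that the OBC integrates over.
x0 = simulate_var(draw_coefficients(), np.zeros(d), n)
x1 = simulate_var(draw_coefficients(), np.full(d, 1.0), n)

Under this kind of setup, an OBC classifies a new observation by averaging the class-conditional likelihoods over the posterior of the uncertain parameters rather than plugging in point estimates; the sketch above only illustrates the data-generating assumptions, not the classifier's derivation.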
