Bayesian Networks and the Imprecise Dirichlet Model Applied to Recognition Problems

  • Cassio P. de Campos
  • Qiang Ji
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6717)


This paper describes the use of the Imprecise Dirichlet Model and the maximum entropy criterion to learn Bayesian network parameters from insufficient and incomplete data. The method is applied to two distinct recognition problems, namely facial action unit recognition and activity recognition in video surveillance sequences. The model accommodates a wide range of constraints that can be specified by experts, and deals with incomplete data through an ad-hoc expectation-maximization procedure. We also describe how the same idea can be used to learn dynamic Bayesian networks. On synthetic data, our proposal and widely used methods, such as Bayesian maximum a posteriori estimation, achieve similar accuracy. On real data, however, our method outperforms the others, because it does not rely on a single prior distribution, which might be far from the best one.
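The paper's full method handles expert constraints and incomplete data; as a minimal illustration of the imprecise-Dirichlet ingredient alone, the sketch below computes the standard IDM probability intervals from category counts and then selects the maximum-entropy distribution inside those intervals by water-filling. The function names `idm_intervals` and `max_entropy_point` are hypothetical, and the simple interval-based selection is an assumption of this sketch, not the paper's general constraint handling.

```python
def idm_intervals(counts, s=1.0):
    """IDM lower/upper probability bounds for each category.

    With total count N and hyperparameter s, the IDM gives
    P_lower(x) = n_x / (N + s) and P_upper(x) = (n_x + s) / (N + s).
    """
    n_total = sum(counts)
    return [(n / (n_total + s), (n + s) / (n_total + s)) for n in counts]


def max_entropy_point(intervals, tol=1e-12):
    """Maximum-entropy distribution within per-category probability intervals.

    Water-filling: start every probability at its lower bound, then
    repeatedly raise the currently smallest free (below-upper-bound)
    probabilities together until the leftover mass is exhausted.
    """
    p = [lo for lo, _ in intervals]
    mass = 1.0 - sum(p)
    while mass > tol:
        free = [i for i, (_, hi) in enumerate(intervals) if p[i] < hi - tol]
        m = min(p[i] for i in free)
        level = [i for i in free if p[i] <= m + tol]
        # next stopping value: either another distinct probability level
        # among free entries, or an upper bound of the current level set
        candidates = [p[i] for i in free if p[i] > m + tol]
        candidates += [intervals[i][1] for i in level]
        target = min(candidates)
        need = (target - m) * len(level)
        if need >= mass:
            inc = mass / len(level)
            for i in level:
                p[i] += inc
            mass = 0.0
        else:
            for i in level:
                p[i] = target
            mass -= need
    return p
```

For counts (3, 1, 0) and s = 2, the intervals are [1/2, 5/6], [1/6, 1/2], [0, 1/3]; water-filling keeps the first probability at its lower bound 1/2 and splits the remaining mass evenly, giving (0.5, 0.25, 0.25).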


Keywords: Bayesian Network · Maximum Entropy · Activity Recognition · Facial Expression Recognition · Dynamic Bayesian Network





Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Cassio P. de Campos (1)
  • Qiang Ji (2)
  1. Dalle Molle Institute for Artificial Intelligence, Manno-Lugano, Switzerland
  2. Rensselaer Polytechnic Institute, Troy, USA
