On the Applicability of Probabilistic Programming Languages for Causal Activity Recognition

  • Stefan Lüdtke
  • Maximilian Popko
  • Thomas Kirste
Technical Contribution

Abstract

Recognizing causal activities of human protagonists, and jointly inferring context information such as the location of objects and agents from noisy sensor data, is a challenging task. Causal models can be used for this purpose: they describe the activity structure symbolically, e.g. by precondition-effect actions. Recently, probabilistic programming languages (PPLs) have emerged as an abstraction mechanism that allows probabilistic models to be defined concisely in a general-purpose programming language and provides off-the-shelf, general-purpose inference algorithms. In this paper, we empirically investigate whether PPLs are a feasible alternative for implementing causal models for human activity recognition by comparing the performance of three different PPLs (Anglican, WebPPL and Figaro) on a multi-agent scenario. We find that PPLs allow causal models to be expressed concisely, but the general-purpose inference algorithms typically implemented in PPLs are outperformed by an application-specific inference algorithm by orders of magnitude. Still, PPLs can be a valuable tool for developing probabilistic models, due to their expressiveness and ease of use.
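
To make the modelling idea concrete, the following minimal Python sketch (not taken from the paper, and independent of the APIs of Anglican, WebPPL or Figaro) shows how a causal model given by precondition-effect actions can be combined with a bootstrap particle filter for Bayesian state estimation from noisy observations. The two-location domain, the action set and the sensor-noise value are hypothetical illustrations only.

    # Minimal sketch: Bayesian filtering over a precondition-effect causal model
    # with a bootstrap particle filter. Domain and noise level are hypothetical.
    import random
    from collections import Counter

    # Symbolic states: the agent's location. Actions are (precondition, effect) pairs.
    ACTIONS = {
        "go_to_stove": (lambda s: s == "table", lambda s: "stove"),
        "go_to_table": (lambda s: s == "stove", lambda s: "table"),
        "stay":        (lambda s: True,         lambda s: s),
    }

    def transition(state):
        """Choose uniformly among applicable actions and apply the chosen effect."""
        applicable = [a for a, (pre, _) in ACTIONS.items() if pre(state)]
        action = random.choice(applicable)
        return ACTIONS[action][1](state)

    def likelihood(obs, state, p_correct=0.8):
        """Noisy location sensor: reports the true location with probability p_correct."""
        return p_correct if obs == state else 1.0 - p_correct

    def particle_filter(observations, n_particles=1000):
        particles = ["table"] * n_particles                       # known initial state
        for obs in observations:
            particles = [transition(p) for p in particles]        # predict
            weights = [likelihood(obs, p) for p in particles]     # weight
            particles = random.choices(particles, weights, k=n_particles)  # resample
            yield Counter(particles)                              # filtering distribution

    if __name__ == "__main__":
        for t, belief in enumerate(particle_filter(["stove", "stove", "table"])):
            print(t, {s: c / float(n) for n in [sum(belief.values())] for s, c in belief.items()})

In a PPL, the transition and observation functions above would be written as an ordinary generative program, while the inference loop (here, the particle filter) is supplied by the language's built-in, general-purpose inference machinery.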

Keywords

Bayesian filtering · Causal model · Probabilistic programming language · Anglican · WebPPL · Figaro · Particle filter

Acknowledgements

We are grateful to the anonymous reviewers for their comments and suggestions, which vastly improved the presentation and discussion of our work.


Copyright information

© Gesellschaft für Informatik e.V. and Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. Institute of Computer Science, University of Rostock, Rostock, Germany