Statistical Model Checking of Dynamic Software Architectures

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9839)


The critical nature of many complex software-intensive systems calls for formal, rigorous architecture descriptions as a means of supporting automated verification and enforcement of architectural properties and constraints. Model checking has been one of the most widely used techniques to automatically verify software architectures against architectural properties. However, it requires an exhaustive exploration of all possible states of the system, a problem that becomes more severe when verifying dynamic software systems due to their typically non-deterministic runtime behavior and unpredictable operating conditions. To tackle these issues, we propose using statistical model checking (SMC) to verify dynamic software architectures while reducing the computational resources and time required for this task. In this paper, we introduce a novel notation to formally express architectural properties, as well as an SMC-based toolchain for verifying dynamic software architectures described in π-ADL, a formal architecture description language. We use a flood monitoring system to show how to express relevant properties to be verified, and we report the results of computational experiments performed to assess the efficiency of our approach.
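The core SMC idea the abstract relies on can be illustrated with a minimal sketch: instead of enumerating all system states, one runs a bounded number of stochastic simulations and estimates the probability that a property holds, with the sample size fixed in advance by a Chernoff-Hoeffding bound. The random-walk "water level" below is a hypothetical stand-in for the flood monitoring example, not the paper's actual π-ADL models or toolchain.

```python
import math
import random

def simulate_trace(steps=50):
    """Toy stochastic system: a water level following a bounded random walk,
    standing in for one execution of a dynamic architecture."""
    level = 0.0
    trace = []
    for _ in range(steps):
        level = max(0.0, level + random.uniform(-1.0, 1.5))
        trace.append(level)
    return trace

def property_holds(trace, threshold=30.0):
    """Bounded safety property: the level never exceeds the threshold
    within the simulation horizon."""
    return all(x <= threshold for x in trace)

def smc_estimate(epsilon=0.05, delta=0.01):
    """Chernoff-Hoeffding bound: n independent runs yield an estimate within
    epsilon of the true satisfaction probability with confidence 1 - delta."""
    n = math.ceil(math.log(2 / delta) / (2 * epsilon ** 2))
    satisfied = sum(property_holds(simulate_trace()) for _ in range(n))
    return satisfied / n, n

p, n = smc_estimate()
print(f"estimated satisfaction probability {p:.3f} from {n} runs")
```

Note that the number of runs depends only on the requested precision and confidence, not on the size of the state space, which is what lets SMC sidestep the state-explosion problem of exhaustive model checking.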


Keywords: Dynamic software architecture · Architecture description language · Formal verification · Statistical model checking



This work was partially supported by the Brazilian National Agency of Petroleum, Natural Gas and Biofuels through the PRH-22/ANP/MCTI Program (for Everton Cavalcante) and by CNPq under grant 308725/2013-1 (for Thais Batista).



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. DIMAp, Federal University of Rio Grande do Norte, Natal, Brazil
  2. IRISA-UMR CNRS/Université Bretagne Sud, Vannes, France
  3. INRIA Rennes Bretagne Atlantique, Rennes, France
