Towards Business Process Execution Adequacy Criteria

  • Antonia Bertolino
  • Antonello Calabrò
  • Francesca Lonetti
  • Eda Marchetti
Conference paper
Part of the Lecture Notes in Business Information Processing book series (LNBIP, volume 238)

Abstract

Monitoring of business process execution has been proposed as a means to evaluate business process performance. An important aspect of assessing the thoroughness of a business process execution is monitoring whether some entities have not been observed for some time, so that potential problems can be detected in a timely manner. In this paper we propose business process execution adequacy criteria and provide a proof-of-concept monitoring framework for their assessment. In analogy with test adequacy, the purpose of our approach is to identify the main entities of the business process that are covered during its execution and to raise a warning if some entities are not covered. We provide a first assessment of the proposed approach on a case study in the learning context.
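
To make the coverage idea concrete, the Python sketch below is a minimal illustration of an activity-coverage adequacy check; it is not the framework described in the paper, and the ActivityCoverage class, its method names, and the idle-time threshold are assumptions made here for illustration. It records which business process activities appear in monitored execution events, computes the fraction of modelled activities covered so far, and reports activities that were never observed or have been idle longer than a given time window.

    # Illustrative sketch only, not the paper's monitoring framework.
    from dataclasses import dataclass, field
    from time import time

    @dataclass
    class ActivityCoverage:
        """Toy adequacy tracker over the activities of a business process model."""
        activities: set                                  # activities declared in the process model
        last_seen: dict = field(default_factory=dict)    # activity -> timestamp of last observed event

        def record_event(self, activity, timestamp=None):
            """Register a monitored execution event that refers to an activity."""
            if activity in self.activities:
                self.last_seen[activity] = time() if timestamp is None else timestamp

        def coverage(self):
            """Fraction of modelled activities observed at least once (activity coverage)."""
            return len(self.last_seen) / len(self.activities) if self.activities else 1.0

        def warn_uncovered(self, max_idle_seconds, now=None):
            """Activities never observed, or not observed within the last max_idle_seconds."""
            now = time() if now is None else now
            never = self.activities - self.last_seen.keys()
            idle = {a for a, t in self.last_seen.items() if now - t > max_idle_seconds}
            return never | idle

    # Example: after one event, coverage is 1/3 and the remaining activities are flagged.
    tracker = ActivityCoverage(activities={"Fill form", "Review", "Approve"})
    tracker.record_event("Fill form")
    print(tracker.coverage())                            # 0.333...
    print(tracker.warn_uncovered(max_idle_seconds=3600))

In an actual deployment the activity set would be derived from the process model and the events would come from the monitoring infrastructure rather than being fed in by hand.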

Keywords

Business process · Monitoring · Adequacy criteria · Learning assessment

Acknowledgements

This work has been partially funded by the Model-Based Social Learning for Public Administrations project (EU FP7-ICT-2013-11/619583).

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Antonia Bertolino (1)
  • Antonello Calabrò (1)
  • Francesca Lonetti (1)
  • Eda Marchetti (1)

  1. Istituto di Scienza e Tecnologie dell’Informazione “A. Faedo”, Consiglio Nazionale delle Ricerche (CNR), Pisa, Italy
