On Measuring the Understandability of Process Models

  • Joachim Melcher
  • Jan Mendling
  • Hajo A. Reijers
  • Detlef Seese
Conference paper
Part of the Lecture Notes in Business Information Processing book series (LNBIP, volume 43)

Abstract

Much effort is aimed at unveiling the factors that influence a person’s comprehension of a business process model. While various potential factors have been proposed and studied in experimental settings, little attention has been paid to the reliability and validity requirements for measuring a person’s structural understanding of a process model. This paper proposes concepts for reasoning meaningfully about these notions, with the aim of improving future measurement instruments. The findings from an experiment involving 178 students from three different universities underline the importance of this topic. In particular, it is shown that the coverage of model-related questions matters. The paper provides various recommendations for properly measuring structural model comprehension.

Keywords

Process understandability · Process metric · Experiment



Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Joachim Melcher (1)
  • Jan Mendling (2)
  • Hajo A. Reijers (3)
  • Detlef Seese (1)
  1. Institut AIFB, Universität Karlsruhe (TH), Germany
  2. Humboldt-Universität zu Berlin, Germany
  3. School of Industrial Engineering, Eindhoven University of Technology, The Netherlands
