Software and Systems Modeling, Volume 18, Issue 6, pp. 3283–3329

A comparative study of students and professionals in syntactical model comprehension experiments

  • Mohamed El-Attar
Regular Paper

Abstract

Empirical evaluations can be conducted with students or professionals as subjects. Students are far more accessible and less expensive than professionals, which allows a greater number of empirical studies to be conducted. Professionals are nonetheless preferred over students because of concerns about the external validity of student-based experiments: professionals are believed to perform differently from students, most likely better. But with respect to evaluating the cognitive effectiveness of software engineering notations, are professionals really better? The literature suggests that the presentation of information is just as critical as the content it conveys, hence the need for this type of empirical study. If professionals are not much better than students, then this important finding can serve as a springboard for many much-needed empirical evaluations in this field. In this paper, we report on an experiment that compares the performance of professionals and students with respect to syntactical model comprehension, a core factor in evaluating the cognitive effectiveness of notations. The experiment involved two groups of professionals and two groups of students, totaling 74 professionals and 75 students. The results indicate that students can serve as an adequate replacement for professionals in this type of empirical study.
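
As a concrete illustration of the kind of two-group comparison reported here, the sketch below computes Cliff's delta, a non-parametric effect size well suited to ordinal comprehension scores. This is a minimal, hypothetical sketch: the score lists are invented for illustration, and the paper's actual analysis may differ.

    # Minimal sketch (hypothetical data, not the paper's reported analysis):
    # Cliff's delta compares two groups of ordinal scores without assuming
    # normality, which suits comprehension-question data.

    def cliffs_delta(xs, ys):
        """Estimate P(X > Y) - P(X < Y) over all cross-group pairs."""
        greater = sum(1 for x in xs for y in ys if x > y)
        less = sum(1 for x in xs for y in ys if x < y)
        return (greater - less) / (len(xs) * len(ys))

    # Hypothetical comprehension scores (e.g., correct answers out of 20)
    professionals = [14, 16, 15, 12, 18, 17, 13, 15]
    students = [13, 15, 14, 12, 16, 15, 14, 13]

    d = cliffs_delta(professionals, students)
    print(f"Cliff's delta = {d:+.3f}")
    # Values near 0 suggest similar performance; by a common convention,
    # |delta| >= 0.474 is considered a large difference.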

Keywords

Statechart modeling · Use case modeling · Student-based experiments · Professional-based experiments

Acknowledgements

We would like to thank all the software engineering professionals and students who took part in this experiment.

Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2019

Authors and Affiliations

  1. Software Engineering Department, Alfaisal University, Riyadh, Kingdom of Saudi Arabia
