Improving Model Inspection Processes with Crowdsourcing: Findings from a Controlled Experiment

  • Dietmar Winkler
  • Marta Sabou
  • Sanja Petrovic
  • Gisele Carneiro
  • Marcos Kalinowski
  • Stefan Biffl
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 748)

Abstract

The application of best-practice software inspection processes for early defect detection requires considerable human effort. Crowdsourcing approaches can support inspection activities (a) by distributing inspection effort among a group of human experts and (b) by increasing inspection control. Thus, applying crowdsourcing techniques aims at making inspection processes more effective and efficient. In this paper, we present a crowdsourcing-supported model inspection (CSI) process and investigate its defect detection effectiveness and efficiency when inspecting an Extended Entity Relationship (EER) model. The CSI process uses so-called Expected Model Elements (EMEs) to guide CSI inspectors during defect detection. We conducted a controlled experiment to assess defect detection effectiveness, efficiency, and the number of false positives. While defect detection effectiveness and efficiency are lower for CSI inspectors, the number of reported false positives decreases. Nevertheless, CSI appears promising for increasing the control of defect detection and for supporting the inspection of large-scale engineering models.
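
The sketch below is not taken from the paper; it is a minimal illustration, under assumed definitions, of how the three reported measures could be computed from consolidated crowd reports: effectiveness as the share of seeded defects found, efficiency as true defects found per hour of inspection effort, and the count of false positives. The DefectReport structure, the EME identifier scheme, and the example data are hypothetical.

    # Minimal sketch (not from the paper): illustrative computation of the
    # inspection measures named in the abstract, assuming a list of seeded
    # defects and the defect reports collected from crowd inspectors.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class DefectReport:
        inspector_id: str
        eme_id: str           # Expected Model Element the report refers to (hypothetical ID scheme)
        is_true_defect: bool  # matches a seeded defect after consolidation

    def inspection_metrics(reports: list[DefectReport],
                           seeded_defects: set[str],
                           total_effort_hours: float) -> dict[str, float]:
        """Compute effectiveness, efficiency, and the false positive count."""
        true_defects_found = {r.eme_id for r in reports if r.is_true_defect}
        false_positives = sum(1 for r in reports if not r.is_true_defect)
        effectiveness = len(true_defects_found) / len(seeded_defects)  # share of seeded defects found
        efficiency = len(true_defects_found) / total_effort_hours      # true defects found per hour
        return {
            "effectiveness": effectiveness,
            "efficiency": efficiency,
            "false_positives": float(false_positives),
        }

    # Example usage with made-up data:
    reports = [
        DefectReport("crowd-1", "EME-03", True),
        DefectReport("crowd-2", "EME-03", True),   # duplicate finding, counted once
        DefectReport("crowd-2", "EME-07", False),  # false positive
    ]
    print(inspection_metrics(reports,
                             seeded_defects={"EME-01", "EME-03", "EME-05"},
                             total_effort_hours=1.5))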

Keywords

Software inspection · Software models · Crowdsourcing · Empirical study · Process improvement

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Dietmar Winkler (1)
  • Marta Sabou (1)
  • Sanja Petrovic (1)
  • Gisele Carneiro (2)
  • Marcos Kalinowski (2)
  • Stefan Biffl (1)
  1. Institute of Software Technology, Vienna University of Technology, Vienna, Austria
  2. Graduate Program in Computing, Fluminense Federal University, Niterói, Brazil