Usefulness of a Human Error Identification Tool for Requirements Inspection: An Experience Report

  • Vaibhav Anu (corresponding author)
  • Gursimran Walia
  • Gary Bradshaw
  • Wenhua Hu
  • Jeffrey C. Carver
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10153)


Context and Motivation: Our recent work leverages Cognitive Psychology research on human errors to improve standard fault-based requirements inspections. Question: The empirical study presented in this paper investigates the effectiveness of a newly developed Human Error Abstraction Assist (HEAA) tool in helping inspectors identify human errors to guide fault detection during requirements inspection. Results: The results showed that the HEAA tool, though effective, presented challenges during the error abstraction process. Contribution: In this experience report, we present the major challenges encountered during the study execution and the lessons learned for future replications.


Keywords: Human Error · Future Replication · Error Mechanism · Error Class · Inspection Technique



This work was supported by NSF Awards 1423279 and 1421006. The authors would like to thank the students of the Software Requirements course at North Dakota State University for participating in this study.



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Vaibhav Anu (1, corresponding author)
  • Gursimran Walia (1)
  • Gary Bradshaw (2)
  • Wenhua Hu (3)
  • Jeffrey C. Carver (3)

  1. North Dakota State University, Fargo, USA
  2. Mississippi State University, Starkville, USA
  3. University of Alabama, Tuscaloosa, USA
