Requirements Engineering, Volume 11, Issue 4, pp 295–307

The methodological soundness of requirements engineering papers: a conceptual framework and two case studies

  • R. J. Wieringa
  • J. M. G. Heerkens
Original Research


This paper was triggered by concerns about the methodological soundness of many RE papers. We present a conceptual framework that distinguishes design papers from research papers, and show that in this framework, what is called a research paper in RE is often a design paper. We then present and motivate two lists of evaluation criteria, one for research papers and one for design papers. We apply both of these lists to two samples drawn from the set of all submissions to the RE’03 conference. Analysis of these two samples shows that most submissions of the RE’03 conference are design papers, not research papers, and that most design papers present a solution to a problem but neither validate this solution nor investigate the problems that can be solved by this solution. We conclude with a discussion of the soundness of our results and of the possible impact on RE research and practice.


Keywords: Requirements engineering · Solution design · Knowledge claim · Causal claim · Design paper



This paper benefited from discussions with Klaas van den Berg and the participants of the CERE04 workshop and from the comments by anonymous reviewers.



Copyright information

© Springer-Verlag London Limited 2006

Authors and Affiliations

  1. Faculty of Electrical Engineering, Mathematics, and Computer Science, University of Twente, Enschede, The Netherlands
  2. Faculty of Business, Public Administration, and Technology, University of Twente, Enschede, The Netherlands
