Evaluations in the Science of the Artificial – Reconsidering the Build-Evaluate Pattern in Design Science Research

  • Christian Sonnenberg
  • Jan vom Brocke
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7286)


The central outcome of design science research (DSR) is prescriptive knowledge in the form of IT artifacts and recommendations. However, prescriptive knowledge is considered to have no truth value in itself. Given this assumption, the validity of DSR outcomes can only be assessed by means of descriptive knowledge obtained at the conclusion of a DSR process. This is reflected in the build-evaluate pattern of current DSR methodologies. Given the emergent nature of IT artifacts, however, this build-evaluate pattern has unfavorable implications for the rigor of a DSR project. While it is vital in DSR to prove the usefulness of an artifact, a rigorous DSR process also requires justifying and validating the artifact design itself, even before the artifact has been put into use. This paper proposes three principles for evaluating DSR artifacts that address not only the evaluation of an artifact’s usefulness but also the evaluation of the design decisions made to build it. In particular, it is argued that by following these principles, the prescriptive knowledge produced in DSR can be considered to have a truth-like value.


Keywords: Design science research, evaluation, design theory, epistemology





Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Christian Sonnenberg, University of Liechtenstein, Vaduz, Principality of Liechtenstein
  • Jan vom Brocke, University of Liechtenstein, Vaduz, Principality of Liechtenstein
