Abstract
The central outcome of design science research (DSR) is prescriptive knowledge in the form of IT artifacts and recommendations. However, prescriptive knowledge is considered to have no truth value in itself. Under this assumption, the validity of DSR outcomes can be assessed only by means of descriptive knowledge obtained at the conclusion of a DSR process. This is reflected in the build-evaluate pattern of current DSR methodologies. Given the emergent nature of IT artifacts, however, this build-evaluate pattern has unfavorable implications for achieving rigor within a DSR project. While it is vital in DSR to prove the usefulness of an artifact, a rigorous DSR process also requires justifying and validating the artifact design itself, even before it has been put into use. This paper proposes three principles for evaluating DSR artifacts that address not only the evaluation of an artifact's usefulness but also the evaluation of the design decisions made to build it. In particular, it is argued that by following these principles, the prescriptive knowledge produced in DSR can be considered to have a truth-like value.
Cite this paper
Sonnenberg, C., vom Brocke, J. (2012). Evaluations in the Science of the Artificial – Reconsidering the Build-Evaluate Pattern in Design Science Research. In: Peffers, K., Rothenberger, M., Kuechler, B. (eds) Design Science Research in Information Systems. Advances in Theory and Practice. DESRIST 2012. Lecture Notes in Computer Science, vol 7286. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-29863-9_28