Technical Action Research as a Validation Method in Information Systems Design Science

  • Roel Wieringa
  • Ayşe Moralı
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7286)

Abstract

Current proposals for combining action research and design science start with a concrete problem in an organization, then apply an artifact to improve the problem situation, and finally reflect on lessons learned. The aim of these combinations is to reduce the tension between relevance and rigor. This paper proposes another way of using action research in design science, which starts with an artifact and then tests it under conditions of practice by solving concrete problems with it. The aim of this way of using action research in design science is to bridge the gap between the idealizations made when designing the artifact and the concrete conditions of practice that occur in real-world problems.

The paper analyzes the role of idealization in design science and compares it with the requirements of rigor and relevance. It then proposes a way of bridging the gap between idealization and practice by means of action research, called technical action research (TAR) in this paper. The core of TAR is that the researcher plays three roles, which must be kept logically separate: those of artifact developer, artifact investigator, and client helper. Finally, TAR is compared to other approaches to using action research in design science, and to canonical action research.

Keywords

Enterprise Architecture · Knowledge Question · Problem Context · Design Science · Research Cycle

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Roel Wieringa (1)
  • Ayşe Moralı (2)
  1. Department of Electrical Engineering, Mathematics, and Computer Science, University of Twente, The Netherlands
  2. PwC, Gent, Belgium