Research Methods

  • Jan Recker
Part of the Progress in IS book series (PROIS)


Information systems research, as a social science, is complex, diverse, and pluralistic: it can take many forms of inquiry, theory, and outcome. The way information systems research is conducted, as well as its goals, theories, and assumptions, can vary significantly. This is probably most evident in the choices involved in selecting an appropriate research methodology, as we discussed in Sect. 3.3 above.





Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Jan Recker
    School of Information Systems, Queensland University of Technology, Brisbane, QLD, Australia
