Laboratory Experiments of Configural Modeling

  • Arch Woodside
  • Rouxelle de Villiers
  • Roger Marshall
Chapter

Abstract

This chapter provides an overview of the laboratory experiments in this study and outlines the principal methodological considerations for the application of fsQCA (fuzzy-set qualitative comparative analysis), a modification of the QCA method. A description of the in-basket simulations and decision aids used in the laboratory experiments is provided, followed by a step-by-step description of the research procedure.
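
As background for the methodological discussion, the sketch below illustrates two core fsQCA operations in Python: calibrating raw interval-scale scores into fuzzy-set membership values with Ragin's direct (three-anchor) method, and computing the consistency and coverage of a causal condition relative to an outcome. The anchor values and sample scores are hypothetical illustrations, not data from the chapter's experiments.

```python
import math

def calibrate(x, full_non, crossover, full_in):
    """Direct calibration: map a raw score x onto a fuzzy-set membership
    score in [0, 1] using three substantive anchors. Log-odds of +3 / -3
    correspond to membership scores of roughly 0.95 / 0.05."""
    if x >= crossover:
        log_odds = 3.0 * (x - crossover) / (full_in - crossover)
    else:
        log_odds = 3.0 * (x - crossover) / (crossover - full_non)
    return 1.0 / (1.0 + math.exp(-log_odds))

def consistency(cause, outcome):
    """Consistency of 'cause is a subset of outcome':
    sum(min(x, y)) / sum(x) over all cases."""
    return sum(min(x, y) for x, y in zip(cause, outcome)) / sum(cause)

def coverage(cause, outcome):
    """Coverage: the share of the outcome accounted for by the causal
    condition: sum(min(x, y)) / sum(y) over all cases."""
    return sum(min(x, y) for x, y in zip(cause, outcome)) / sum(outcome)

# Hypothetical raw scores (e.g., decision-quality ratings on a 1-7 scale).
raw = [6.5, 5.8, 4.0, 2.1, 3.3]
members = [calibrate(x, full_non=2.0, crossover=4.0, full_in=6.0) for x in raw]
print([round(m, 2) for m in members])  # scores above 0.5 indicate membership
```

In a full analysis these membership scores feed a truth table whose rows (configurations of causal conditions) are reduced by Boolean minimization; the consistency and coverage measures then assess each resulting causal recipe.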

Keywords

Boolean Algebra · Truth Table · Causal Condition · Vice President · Qualitative Comparative Analysis

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Arch Woodside (1)
  • Rouxelle de Villiers (2)
  • Roger Marshall (3)

  1. Boston College, Chestnut Hill, USA
  2. Department of Marketing, University of Waikato, Hamilton, New Zealand
  3. Department of Marketing, Advertising, Retailing & Sales, Auckland University of Technology, Auckland, New Zealand