
Understanding and Modeling Configural Causality

  • Arch Woodside
  • Rouxelle de Villiers
  • Roger Marshall
Chapter

Abstract

A major objective of this study is to design developmental interventions, or combinations of causal conditions (used interchangeably with “teaching methods”), that include managers’ use of appropriate heuristics and other decision-making tools to ensure decision competency and decision confidence. The study investigates the impact of four such tools on competent and incompetent decision-making: role-play or simulated interaction in goal-based scenarios; interactive decision-making strategies; a devil’s advocate employed to provoke dissent and in-depth discussion; and knowledge-based decision aids. Furthermore, this research aims to improve understanding of why managers make incompetent decisions and explores how they can be educated or supported to make competent decisions. The study extends the work of Armstrong (2003), Armstrong and Green (2005), Gigerenzer (2008), Gigerenzer and Brighton (2009), and Schank, Berman, and Macpherson (1999) and illuminates, through data gathering and critical analysis, the conceptual deductions involved in developing a new theory of Decision-Competency Development Interventions (DCDI) by testing several theories with the same model.
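
Configural causality is examined here through combinations (causal recipes) of the intervention conditions rather than through the net effect of each condition in isolation; in qualitative comparative analysis such recipes are judged by their consistency and coverage with respect to the outcome (Ragin, 2006c). The following Python sketch is a minimal illustration of those two measures, assuming hypothetical fuzzy-set membership scores and invented variable names; it is not the authors’ software or data.

    import numpy as np

    def recipe_membership(*conditions: np.ndarray) -> np.ndarray:
        # Fuzzy-set intersection (logical AND) of the conditions in a causal recipe:
        # a case's membership in the recipe is its minimum membership across conditions.
        return np.minimum.reduce(np.stack(conditions))

    def consistency(recipe: np.ndarray, outcome: np.ndarray) -> float:
        # Degree to which recipe membership is a subset of outcome membership:
        # sum(min(x, y)) / sum(x), following Ragin (2006c).
        return float(np.minimum(recipe, outcome).sum() / recipe.sum())

    def coverage(recipe: np.ndarray, outcome: np.ndarray) -> float:
        # Share of the outcome accounted for by the recipe: sum(min(x, y)) / sum(y).
        return float(np.minimum(recipe, outcome).sum() / outcome.sum())

    # Hypothetical calibrated memberships (0 = fully out, 1 = fully in) for five cases.
    role_play       = np.array([0.9, 0.2, 0.8, 0.6, 0.1])
    interactive     = np.array([0.7, 0.3, 0.9, 0.4, 0.2])
    devils_advocate = np.array([0.8, 0.1, 0.7, 0.9, 0.3])
    decision_aids   = np.array([0.6, 0.4, 0.8, 0.5, 0.2])
    competency      = np.array([0.8, 0.2, 0.9, 0.6, 0.1])  # outcome: decision competency

    recipe = recipe_membership(role_play, interactive, devils_advocate, decision_aids)
    print(f"consistency = {consistency(recipe, competency):.2f}")
    print(f"coverage    = {coverage(recipe, competency):.2f}")

A recipe with high consistency but low coverage is sufficient for the outcome in only a small share of cases; comparing several candidate recipes on both measures is what allows several theories to be tested against the same model.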

Keywords

Decision Confidence · Qualitative Comparative Analysis · Antecedent Condition · Assessment Centre · Competency Training

References

  1. Anderson, L., Boud, D., & Cohen, R. (2000). Experience-based learning. In G. Foley (Ed.), Understanding adult education and training (2nd ed., pp. 225–239). St. Leonards, NSW: Allen and Unwin.
  2. Anderson, P. H., & Lawton, L. (2009). Business simulations and cognitive learning: Developments, desires and future directions. Simulation and Gaming, 40(2), 193–216. doi: 10.1177/1046878108321624.
  3. Andrew, T. (2010). Importance of role play learning activities in technology classes. http://www.suite101.com/content/importance-of-role-play-learning-activities-in-technology
  4. Armstrong, J. S. (1991). Prediction of consumer behavior by experts and novices. Journal of Consumer Research, 18(2), 251–256.
  5. Armstrong, E. K. (2003). Applications of role playing in tourism management teaching: Evaluation of a learning method. Journal of Hospitality, Leisure, Sport and Tourism Education, 2(1), 5–16. doi: 10.3794/johlste.21.24.
  6. Armstrong, J. S., & Green, K. C. (2005). Demand forecasting: Evidence-based method. In L. Moutinho & G. Southern (Eds.), Strategic marketing management: A business process approach. Hampshire: Cengage Learning.
  7. Armstrong, J. S., & Green, K. C. (2007). Competitor-oriented objectives: Myth of market share. International Journal of Business, 12(1), 116–136.
  8. Beaver, D. (1999). NLP for lazy learning. Shaftesbury, England: Element Books.
  9. Beirne, M., & Knight, S. (2007). From community theatre to critical management studies. Management Learning, 38(5), 591–611. doi: 10.1177/1350507607083209.
  10. Bloom, B. S. (1956). Taxonomy of educational objectives: The cognitive domain. New York: David McKay.
  11. Bloom, B. S., Engelhart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals (Handbook 1: The cognitive domain). New York: David McKay.
  12. Bosse, H. M., Nickel, M., Huwendiek, S., Jünger, J., Schulz, J. H., & Nikendei, C. (2010). Peer role-play and standardised patients in communication training: A comparative study on the student perspective on acceptability, realism, and perceived effect. BMC Medical Education, 10(1), 27. doi: 10.1186/1472-6920-10-27.
  13. Braumoeller, B. F., & Goertz, G. (2000). The methodology of necessary conditions. American Journal of Political Science, 44(4), 844–858.
  14. Bray, D. W., & Grant, D. L. (1966). The assessment center in the measurement of potential. Psychological Monographs, 80(17). doi: 10.1037/h0093895.
  15. Brennan, R., & Pearce, G. (2008). Educational drama: A tool for promoting marketing learning? International Journal of Management Education, 8(1), 1–9. doi: 10.3794/ijme.81.237.
  16. Burns, R. B., & Burns, R. A. (2008). Business research methods and statistics using SPSS. London: Sage.
  17. Byrne, D., & Ragin, C. C. (2009). The Sage handbook of case-based methods. London: Sage.
  18. Campbell, D. T. (1957). Factors relevant to the validity of experiments in social settings. Psychological Bulletin, 54(4), 297–312. doi: 10.1037/h0040950.
  19. Campbell, D. T., & Stanley, J. C. (1963a). Experimental and quasi-experimental designs for research. Boston: Houghton Mifflin.
  20. Campbell, D. T., & Stanley, J. C. (1963b). Experimental and quasi-experimental designs for research on teaching. In N. L. Gage (Ed.), Handbook of research on teaching (pp. 171–246). Chicago: Rand McNally.
  21. Castleberry, S. B. (1990). An in-basket exercise for sales courses. Marketing Education Review, 1, 51–56.
  22. Chan, H., Levitt, R. L., & Garvin, M. J. (2010, November 4–6). Collective effect of strategic, cultural and institutional factors on concession renegotiations. Paper presented at the Engineering Project Organization Conference, South Lake Tahoe, CA.
  23. Cohen, M. (2005). Wittgenstein’s beetle and other classic thought experiments. Oxford: Blackwell.
  24. Cook, T. D., & Campbell, D. T. (1979). Quasi-experimentation: Design and analysis for field settings. Boston: Houghton Mifflin.
  25. Cooper, B. (2005). Applying Ragin’s crisp and fuzzy-set QCA to large datasets: Social class and educational achievement in the National Child Development Study. Sociological Research Online, 10(2). www.socresonline.org.uk/10/2/cooper.html
  26. Craik, K. H., Ware, A. P., Kamp, J., O’Reilly, C., Staw, B., & Zedeck, S. (2002). Explorations of construct validity in a combined managerial and personality assessment programme. Journal of Occupational and Organizational Psychology, 75(2), 171–193. doi: 10.1348/09631790260098758.
  27. Darley, J. M. (1999). Methods for the study of evil-doing actions. Personality and Social Psychology Review, 3(3), 269–275. doi: 10.1207/s15327957pspr0303_9.
  28. Day, N. E. (2003). Can performance raters be more accurate? Investigating the benefits of prior knowledge on performance dimensions. Journal of Managerial Issues, 7, 323–343.
  29. de Bono, E. (1985). The CoRT thinking program. In J. W. Segal, S. F. Chipman, & R. Glaser (Eds.), Thinking and learning skills, Volume 1: Relating instruction to research. New York, NY: Routledge.
  30. de Bono, E. (1999). Six thinking hats. New York: Little, Brown and Company.
  31. De Meur, G., Rihoux, B., & Yamasaki, S. (2008). Addressing the critiques of QCA. In B. Rihoux & C. Ragin (Eds.), Configurational comparative methods: Qualitative comparative analysis (QCA) and related techniques. Thousand Oaks, CA: Sage.
  32. Dewey, J. (1963). Experience and education. New York: Collier.
  33. Druckman, D., & Ebner, N. (2007). Onstage or behind the scenes? Relative learning benefits of simulation role-play and design. Simulation and Gaming, 39(4), 465–479. doi: 10.1177/10468787311377.
  34. Easterby-Smith, M., Thorpe, R., & Lowe, A. (1991). Management research: An introduction. London: Sage.
  35. Elm, D. R., & Taylor, S. S. (2010). Representing wholeness: Learning via theatrical productions. Journal of Management Inquiry, 19(2), 127–136. doi: 10.1177/1056492609360407.
  36. Evans, G. J., McGuire, M., & Thanyi, D. (2010). Using environmental consulting as a team design project: Role play to reality. http://library.queensu.ca/ojs/index.php/PCEEA/article/view/3099/3037
  37. Faria, A. J. (2001). The changing nature of business simulation/gaming research. Simulation and Gaming, 32(1), 97–110. doi: 10.1177/104687810103200108.
  38. Feinstein, A. H., & Cannon, H. M. (2002). Constructs of simulation evaluation. Simulation and Gaming, 33(4), 425–440. doi: 10.1177/1046878102238606.
  39. Feldman, D. C., & Lankau, M. J. (2005). Executive coaching: A review and agenda for future research. Journal of Management, 31(6), 829–848. doi: 10.1177/0149206305279599.
  40. Frederiksen, N., Saunders, D. R., & Wand, B. (1957). The in-basket test. Psychological Monographs: General and Applied, 71(9), 1–28. doi: 10.1037/h0093706.
  41. Gaugler, B. B., Rosenthal, D. B., Thornton, G. C., III, & Bentson, C. (1987). Meta-analysis of assessment center validity. Journal of Applied Psychology, 72, 493–551.
  42. Gigerenzer, G. (1991). From tools to theories: A heuristic of discovery in cognitive psychology. Psychological Review, 98(2), 254–267. doi: 10.1037//0033-295X.98.2.254.
  43. Gigerenzer, G. (2004). Fast and frugal heuristics: The tools of bounded rationality. In D. Koehler & N. Harvey (Eds.), Blackwell handbook of judgment and decision making (pp. 62–88). Oxford: Blackwell.
  44. Gigerenzer, G. (2008). Why heuristics work. Perspectives on Psychological Science, 3(1), 20–29. doi: 10.1111/j.1745-6916.2008.0058.x.
  45. Gigerenzer, G., & Brighton, H. (2009). Homo Heuristicus: Why biased minds make better inferences. Topics in Cognitive Science, 1, 107–143. doi: 10.1111/j.1756-8765.2008.01006.x.
  46. Gigerenzer, G., & Selten, R. (2001). Bounded rationality: The adaptive toolbox. Cambridge, MA: MIT Press.
  47. Gilovich, T. (1991). How we know what isn’t so. New York: Free Press.
  48. Gladwell, M. (2001). The tipping point: How little things can make a big difference. London: Little, Brown Book Group.
  49. Gladwell, M. (2005). Blink: The power of thinking without thinking. New York: Little, Brown and Company.
  50. Gooding, C., & Zimmerer, T. (1980). The use of specific industry games in the selection, orientation and training of managers. Human Resource Management, 20(3), 300–318. doi: 10.1002/hrm.3930190105.
  51. Gosen, J., & Washbush, J. (2004). A review of the scholarship on assessing experiential learning effectiveness. Simulation and Gaming, 35(2), 270–293. doi: 10.1177/1046878104263544.
  52. Green, K. C. (2002). Forecasting decisions in conflict situations: A comparison of game theory, role-playing and unaided judgment. International Journal of Forecasting, 18, 321–344.
  53. Green, K. C. (2005). Game theory, simulated interaction, and unaided judgment for forecasting decisions in conflicts: Further evidence. International Journal of Forecasting, 21(3), 463–472. doi: 10.1016/j.ijforecast.2005.02.006.
  54. Green, K. C. (2010). Forecasting methods. Retrieved November 19, 2012, from http://www.forecastingprinciples.com/index.php?option=com_content&task=view&id=16&Itemid=16
  55. Green, K. C., & Armstrong, J. S. (2009). Role-thinking: Standing in other people’s shoes to forecast decisions in conflicts. MPRA Paper, 16422 (July), 1–8.
  56. Gross, M. E. (2010). Aligning public-private partnership contracts with public objectives for transportation infrastructure. PhD thesis, Virginia Tech, Blacksburg, VA. http://scholar.lib.vt.edu.ezproxy.aut.ac.nz/theses/available/etd-08242010-173605/unrestricted/Gross_ME_D_2010.pdf
  57. Hackney, N. (1971). The fine art of management make-belief. Management Review, 60(11), 44–47.
  58. Hemphill, J. K. (1961). Why people attempt to lead. In L. Petrullo & B. Bass (Eds.), Leadership and interpersonal behavior (pp. 201–215). New York, NY: Holt, Rinehart and Winston.
  59. Hsu, E. (1989). Role-event gaming simulation in management education. Simulation and Games, 20(4), 409–438. doi: 10.1177/104687818902000402.
  60. Jeanneret, R., & Silzer, R. (Eds.). (1998). Individual psychological assessment: Predicting behavior in organizational settings. San Francisco: Jossey-Bass.
  61. Jordan, E., Gross, M. E., Javernick-Will, A. M., & Garvin, M. J. (2011). Use and misuse of qualitative comparative analysis. Construction Management and Economics, 29(11), 1159–1173. doi: 10.1080/01446193.2011.640339.
  62. Kent, R. (2009). Case-centred methods and quantitative analysis. In D. Byrne & C. C. Ragin (Eds.), The Sage handbook of case-based methods. London: Sage.
  63. Kesselman, G. A., Lopez, F. M., & Lopez, F. E. (1982). The development and validation of a self-report scored in-basket test in an assessment center setting. Public Personnel Management Journal, 11, 228–238.
  64. Keys, B., & Wolfe, J. (1988). Management education and development: Current issues and emerging trends. Journal of Management, 14(2), 205–229. doi: 10.1177/014920638801400205.
  65. Kibbee, J. M. (1961). Model building for management games. Paper presented at Simulation and gaming: A symposium, New York.
  66. Knowles, M. (1998). The adult learner: The definitive classic in adult education and human resource development. Houston, TX: Gulf Publishing.
  67. Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Englewood Cliffs, NJ: Prentice Hall.
  68. Kolb, D. A., & Kolb, A. (2008). Experiential learning theory: A dynamic, holistic approach to management learning, education and development. In S. J. Armstrong & C. Fukami (Eds.), Handbook of management learning, education and development. London: Sage.
  69. Lambert, S. J., & Fairweather, J. R. (2010). The socio-technical networks of technology users’ innovation in New Zealand: A fuzzy-set qualitative comparative analysis. Canterbury, New Zealand: Lincoln University. http://hdl.handle.net/10182/3339
  70. Lant, T. K. (1989). Simulation games: A research method for studying organizational behavior. Unpublished manuscript, New York University.
  71. Lazarsfeld, P. F. (1937). Some remarks on the typological procedures in social research. Zeitschrift für Sozialforschung, 6, 119–139.
  72. Lean, J., Moizer, J., Towler, M., & Abbey, C. (2006). Simulations and games: Use and barriers in higher education. Active Learning in Higher Education, 7(3), 227–242. doi: 10.1177/1469787406069056.
  73. Lopez, F. M. (1966). Evaluating executive decision-making: The in-basket technique (AMA Research Study, Vol. 75). New York: American Management Association.
  74. MacCrimmon, K., & Wehrung, D. A. (1984). The risk in-basket. Journal of Business, 57(July), 367–387.
  75. Mauro, P. (1995). Corruption and growth. Quarterly Journal of Economics, 110(3), 681–712. doi: 10.2307/2946696.
  76. McClelland, D. C. (1998). Identifying competencies with behavioral-event interviews. Psychological Science, 9(5), 331–339. doi: 10.1111/1467-9280.00065.
  77. McGrath, J. E. (1982). Dilemmatics: The study of research choices and dilemmas. In J. E. McGrath, J. Martin, & R. A. Kulka (Eds.), Judgment calls in research. Beverly Hills, CA: Sage.
  78. Meier, F. C., Newell, W. T., & Pazer, H. L. (1969). Simulation in business and economics. Englewood Cliffs, NJ: Prentice-Hall.
  79. Meyer, H. (1970). The validity of the in-basket test as a measure of managerial performance. Personnel Psychology, 23(3), 297–307. doi: 10.1111/j.1744-6570.1970.tb01657.x.
  80. Miethe, T. D., & Drass, K. A. (1999). Exploring the social context of instrumental and expressive homicides: An application of qualitative comparative analysis. Journal of Quantitative Criminology, 15(1), 1–21. doi: 10.1023/A:1007550704282.
  81. Moses, J., Rihoux, B., & Kittel, B. (2005). Mapping political methodology: Reflections on a European perspective. European Political Science, 4(1), 55–68. doi: 10.1057/palgrave.eps.2210006.
  82. Pearce, G. (2004). The advantages (benefits) and disadvantages (weaknesses) of educational drama. International Journal of Management Education, 4(2), 29–45.
  83. Pearce, G., & Jackson, J. (2006). Today’s educational drama: Planning for tomorrow’s marketers. Marketing Intelligence and Planning, 24(3), 218–232. doi: 10.1108/02634500610665691.
  84. Pearson, M. M., Barnes, J. W., & Onken, M. H. (2006). Development of a computerized in-basket exercise for the classroom: A sales management example. Journal of Marketing Education, 28(3), 227–236. doi: 10.1177/0273475306291467.
  85. Popper, K. (1963). Conjectures and refutations: The growth of scientific knowledge. London: Routledge.
  86. Ragin, C. C. (1987). The comparative method: Moving beyond qualitative and quantitative strategies. Berkeley, CA: University of California Press.
  87. Ragin, C. C. (2000). Fuzzy-set social science. Chicago: University of Chicago Press.
  88. Ragin, C. C. (2004). Redesigning social inquiry [Slide-show PPT]. http://eprints.ncm.ac.uk/379/1/RSDI-RMF.pdf
  89. Ragin, C. C. (2006a). How case-oriented research challenges variable-oriented research. Comparative Social Research, 16, 27–42.
  90. Ragin, C. C. (2006b). The limitations of net-effects thinking. In B. Rihoux & H. Grimm (Eds.), Innovative comparative methods for policy analysis: Beyond the quantitative-qualitative divide (pp. 13–41). New York: Springer.
  91. Ragin, C. C. (2006c). Set relations in social research: Evaluating their consistency and coverage. Political Analysis, 14(3), 291–310. doi: 10.1093/pan/mpj019.
  92. Ragin, C. C. (2008b). Redesigning social inquiry. London: University of Chicago Press.
  93. Ragin, C. C. (2008c). Redesigning social inquiry: Fuzzy sets and beyond. Chicago: University of Chicago Press.
  94. Randall, E. J., Cooke, E. F., & Smith, L. (1985). A successful application of the assessment center concept to the salesperson selection process. Journal of Personal Selling and Sales Management, 5, 53–62.
  95. Rihoux, B. (2006). Qualitative comparative analysis (QCA) and related systematic comparative methods: Recent advances and remaining challenges for social science research. International Sociology, 21(5), 679–706. doi: 10.1177/0268580906067836.
  96. Rihoux, B., & Grimm, H. (2006). Innovative comparative methods for policy analysis. New York: Springer.
  97. Rihoux, B., & Lobe, B. (2008). The case for qualitative comparative analysis (QCA): Adding leverage for thick cross-case comparison. In D. Byrne & C. C. Ragin (Eds.), The Sage handbook of case-based methods (pp. 222–242). Thousand Oaks, CA: Sage.
  98. Rihoux, B., & Ragin, C. C. (2009). Configurational comparative methods. London: Sage.
  99. Schank, R. C. (1994). What we learn when we learn by doing. Evanston, IL: Northwestern University Press.
  100. Schank, R. C. (1995). What we learn when we learn by doing. Evanston, IL: Institute for the Learning Sciences, Northwestern University. http://cogprintes.org/637/1/LearnbyDoing_Schank.html
  101. Schank, R. C., Berman, T. R., & Macpherson, K. A. (1999). Learning by doing. In C. M. Reigeluth (Ed.), Instructional-design theories and models: A new paradigm of instructional theory. Mahwah, NJ: Erlbaum.
  102. Schank, R. C., Fano, A., Jona, M., & Bell, B. (1993). The design of goal-based scenarios. Evanston, IL: Northwestern University Press.
  103. Schibrowsky, J. A., & Peltier, J. W. (1995). The dark side of experiential learning activities. Journal of Marketing Education, 17(1), 13–24. doi: 10.1177/027347539501700103.
  104. Schippmann, J. S., Hughes, G. L., & Prien, E. P. (1987). The use of structured multi-domain job analysis for the construction of assessment center methods and procedures. Journal of Business and Psychology, 1(1), 353–366. doi: 10.1007/BF01018144.
  105. Schippmann, J. S., Prien, E. P., & Katz, J. A. (1990). Reliability and validity of in-basket performance measures. Personnel Psychology, 43(4), 837–859. doi: 10.1111/j.1744-6570.1990.tb00685.x.
  106. Schrodt, P. (2006). Beyond the linear frequentist orthodoxy. Political Analysis, 14(3), 335–339. doi: 10.1093/pan/mpj013.
  107. Seawright, J. (2005). Qualitative comparative analysis vis-à-vis regression. Studies in Comparative International Development, 40(1), 3–26. doi: 10.1007/BF02686284.
  108. Shimko, B. W. (1992). Pre-hire assessment of the new work force: Finding wheat (and work ethic) among the chaff. Business Horizons, 35(3), 60–66.
  109. Simon, H. A. (1976). Administrative behavior (3rd ed.). New York: Free Press.
  110. Spangenberg, H. H., & Theron, C. C. (2003). Validation of the high performance leadership competencies as measured by an assessment centre in-basket. SA Journal of Industrial Psychology, 29(2), 29–38. doi: 10.4102/sajip.v29i2.106.
  111. Spanier, N. (2011). Competence and incompetence training, impact on executive decision-making capability: Advancing theory and testing. Doctoral thesis, Auckland University of Technology, Auckland, New Zealand.
  112. Stearns, J. M., Ronald, K., Greenlee, T. B., & Crespy, C. T. (2003). Contexts for communication: Teaching expertise through case-based in-basket exercises. Journal of Education for Business, 78(4), 213–219. doi: 10.1080/08832320309598603.
  113. Taylor, S. S. (2003). Knowing in your gut and in your head: Doing theatre and my underlying epistemology of communication. Management Communication Quarterly, 17(2), 272–279. doi: 10.1177/0893318903256239.
  114. Torbert, W. R. (1989). Leading organizational transformation. In R. Woodman & W. Pasmore (Eds.), Research in organizational change and development (Vol. 3, pp. 83–116). Greenwich, CT: JAI.
  115. Tse, D. K., Lee, K., Vertinsky, I., & Wehrung, D. A. (1988). Does culture matter? A cross-cultural study of executives’ choice, decisiveness, and risk adjustment in international marketing. Journal of Marketing, 52(4), 81–95. doi: 10.2307/1251635.
  116. Tufte, E. R. (2000). Visual explanations: Images and quantities, evidence and narrative. Cheshire, CT: Graphics Press.
  117. Wagemann, C., & Schneider, C. Q. (2007). Standards of good practice in qualitative comparative analysis (QCA) and fuzzy sets. COMPASSS Working Paper, WP2007-51.
  118. Wagemann, C., & Schneider, C. Q. (2010). Qualitative comparative analysis (QCA) and fuzzy-sets: Agenda for a research approach and a data analysis technique. Comparative Sociology, 9(3), 376–396. doi: 10.1163/156913210X12493538729838.
  119. Wagner, C. (2004). Teaching information systems via action memos. Journal of Information Systems Education, 15(1), 5–7.
  120. Weick, K. E., Sutcliffe, K. M., & Obstfeld, D. (2005). Organizing and the process of sensemaking. Organization Science, 16(4), 409–421. doi: 10.1287/orsc.1050.0133.
  121. Wolfe, J. (1985). The teaching effectiveness of games in collegiate business schools. Simulation and Games, 16, 251–288.
  122. Wollowick, H. B., & McNamara, W. J. (1969). Relationship of the components of an assessment center to management success. Journal of Applied Psychology, 53, 348–352.
  123. Woodside, A. G. (2011a). Case study research: Theory, methods, practice. Bingley, England: Emerald Group.
  124. Woodside, A. G. (2011b). Responding to the severe limitations of cross-sectional surveys: Commenting on Rong and Wilkinson’s perspectives. Australasian Marketing Journal, 19(3), 153–156. doi: 10.1016/j.ausmj.2011.04.004.
  125. Woodside, A. G. (2012a). Consumer evaluations of competing brands: Perceptual versus predictive validity. Psychology & Marketing, 29(6), 458–466.
  126. Woodside, A. G. (2012c). Proposing a new logic for data analysis in marketing and consumer behavior: Case study research of large-N survey data for estimating algorithms that accurately profile X (extremely high-use) consumers. Journal of Global Scholars of Marketing Science: Bridging Asia and the World, 22(4), 277–289. doi: 10.1080/21639159.2012.717369.
  127. Woodside, A. G. (2013). Moving beyond multiple regression analysis to algorithms: Calling for adoption of a paradigm shift from symmetric to asymmetric thinking in data analysis and crafting theory. Journal of Business Research, 10. doi: 10.1016/j.jbusres.2012.12.02
  128. Woodside, A. G., Ko, E., & Huan, T. C. (2012). The new logic in building isomorphic theory of management decision realities. Management Decision, 50(5), 765–777. doi: 10.1108/00251741211227429.
  129. Woodside, A. G., & Zhang, M. (2012). Identifying X-consumers using causal recipes: “Whales” and “jumbo shrimps” casino gamblers. Journal of Gambling Studies, 28(1), 13–26. doi: 10.1007/s10899-011-9241-5.
  130. Yanow, D. (2001). Learning in and from improvising. Reflections (Society for Organizational Learning and MIT), 2, 58–62.
  131. Yin, R. (2003). Case study research: Design and methods (3rd ed.). Thousand Oaks, CA: Sage.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Arch Woodside (1)
  • Rouxelle de Villiers (2)
  • Roger Marshall (3)
  1. Boston College, Chestnut Hill, USA
  2. Department of Marketing, University of Waikato, Hamilton, New Zealand
  3. Department of Marketing, Advertising, Retailing & Sales, Auckland University of Technology, Auckland, New Zealand
