Microworlds for experimental research: Having your (control and collection) cake, and realism too

  • Nicholas DiFonzo
  • Donald A. Hantula
  • Prashant Bordia
Simulated Research Environments


Microworlds (MWs) are dynamic computer-generated environments that subjects interact with in the laboratory and that simulate conditions encountered in the field. Precise levels of experimental control and improved accuracy and efficiency of data collection procedures are characteristic of MWs. It is proposed that these benefits are achieved with concomitant gains in internal validity (afforded by high levels of experimental realism) and external validity (afforded by the replication of the temporal-interactive nature of most field phenomena). To illustrate these ideas, three sets of MW studies are described that investigated rumor and behavior in the stock market (Broker), escalation behavior (Inve$tment Choice$), and the application of foraging theory to internet shopping (Cybershopper).
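To make the methodological idea concrete, the trial loop common to microworlds can be sketched in a few lines. The class below is a hypothetical minimal example (not the authors' actual Broker, Inve$tment Choice$, or Cybershopper software): the experimenter scripts the stimulus stream exactly (experimental control), the environment changes dynamically period by period, and every subject response is logged automatically (accurate, efficient data collection).

```python
import csv
import io


class MicroworldSession:
    """Minimal sketch of a microworld trial loop (hypothetical example,
    not the authors' software). A dynamic environment state evolves each
    period, the subject responds, and every event is logged automatically."""

    def __init__(self, start_price=100.0, shocks=None):
        self.price = start_price
        # Experimenter-scripted shocks give precise control over the
        # stimulus stream: identical across subjects and conditions.
        self.shocks = shocks if shocks is not None else []
        self.log = []  # automatic trial-by-trial data collection

    def run(self, respond):
        """Run one session; respond(period, price) -> 'buy'|'sell'|'hold'."""
        for period, shock in enumerate(self.shocks):
            self.price += shock  # dynamic environment update
            action = respond(period, self.price)
            self.log.append((period, self.price, action))
        return self.log

    def to_csv(self):
        """Export the session log in a ready-to-analyze tabular format."""
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(["period", "price", "action"])
        writer.writerows(self.log)
        return buf.getvalue()


# Usage: a scripted price path and a simple threshold "subject".
session = MicroworldSession(shocks=[+2.0, -1.5, +0.5])
log = session.run(lambda t, p: "buy" if p > 100 else "hold")
# prices seen: 102.0, 100.5, 101.0 -> all above 100, so every action is "buy"
```

Because the shock sequence is fixed in advance, any two subjects face literally the same dynamic environment, which is what lets a microworld combine field-like temporal interactivity with laboratory-grade control.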





Copyright information

© Psychonomic Society, Inc. 1998

Authors and Affiliations

  • Nicholas DiFonzo, Department of Psychology, Rochester Institute of Technology, Rochester
  • Donald A. Hantula, Temple University, Philadelphia
  • Prashant Bordia, University of Queensland, St. Lucia, Australia
