
Technology, Knowledge and Learning, Volume 19, Issue 1–2, pp. 19–52

On the Embedded Complementarity of Agent-Based and Aggregate Reasoning in Students’ Developing Understanding of Dynamic Systems

  • Walter M. Stroup
  • Uri Wilensky
Article

Abstract

Placed in the larger context of broadening engagement with systems dynamics and complexity theory in school-aged learning and teaching, this paper introduces, situates, and illustrates, with results from the use of network-supported participatory simulations in classrooms, a stance we call 'embedded complementarity' as an account of the relations between two major forms of systems-related learning and reasoning. The two forms of systems reasoning discussed are called 'aggregate' and 'agent-based.' These forms of reasoning are presented as distinct, yet we also outline forms of complementarity, between and within these approaches, that are useful in analyzing complex dynamic systems. We then explore specific ways in which the embedded complementarity stance can be used to analyze how learner understandings progress in science, technology, engineering, and mathematics-related participatory simulations supported by the HubNet (Wilensky and Stroup 1999c) learning environment, developed with support from the National Science Foundation. We found that the learners used and built on the interdependence of agent and aggregate forms of reasoning in ways consistent with the account of embedded complementarity outlined in the early parts of the paper.
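The distinction between aggregate and agent-based reasoning can be made concrete with a toy epidemic, in the spirit of the NetLogo HubNet disease model cited below. The following Python sketch is purely illustrative and is not the authors' NetLogo/HubNet code; all function names and parameter values are hypothetical. The aggregate view tracks a single population-level quantity with a deterministic difference equation, while the agent-based view simulates each individual and lets the same macro-level pattern emerge from stochastic local interactions.

```python
import random

def aggregate_epidemic(n=300, i0=1, beta=0.002, steps=50):
    """Aggregate view: only the population-level infected count exists.
    A deterministic difference equation (logistic-style growth) updates it."""
    s, i = n - i0, float(i0)
    history = [i]
    for _ in range(steps):
        new = beta * s * i          # expected new infections this step
        s, i = s - new, i + new     # susceptible + infected always sums to n
        history.append(i)
    return history

def agent_epidemic(n=300, i0=1, p=0.002, steps=50, seed=42):
    """Agent view: each individual is a distinct agent. Infection spreads
    through random pairwise encounters, so each run is stochastic, yet the
    aggregate S-shaped curve emerges from the local rules."""
    rng = random.Random(seed)
    infected = [True] * i0 + [False] * (n - i0)
    history = [i0]
    for _ in range(steps):
        for a in range(n):
            for b in range(n):
                # an infected agent transmits to a susceptible one with prob p
                if infected[a] and not infected[b] and rng.random() < p:
                    infected[b] = True
        history.append(sum(infected))
    return history
```

Both functions trace out the same qualitative S-shaped curve, which is one face of the complementarity the paper discusses: the aggregate parameter `beta` summarizes, at the population level, what the per-encounter probability `p` enacts at the agent level.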

Keywords

Dynamic systems · Complexity theory · Agent-based modeling · Aggregate modeling · Participatory simulations · NetLogo · HubNet

Notes

Acknowledgments

We would like to acknowledge the considerable contributions of Dor Abrahamson, Sarah Davis and Andrew Hurford to the activity design, data collection, coding, and analyses contributing to the development of this paper. We would also like to thank Bruce Sherin and Richard Noss for their steadfast support and helpful comments. We are particularly grateful to Michelle Wilkerson-Jerde for her remarkably thorough, insightful, and deeply engaged comments and suggestions regarding early iterations of this paper. Funding from the National Science Foundation, Grant 126227 titled Integrated Simulation and Modeling Environment Project, made this work possible. Texas Instruments also provided significant material support for aspects of this work. The views expressed herein are those of the authors and do not necessarily reflect those of the funding institutions.

References

  1. AAAS (American Association for the Advancement of Science). (1993). Benchmarks for science literacy. New York: Oxford University Press.
  2. Ares, N., Stroup, W. M., & Schademan, A. R. (2009). The power of mediating artifacts in group-level development of mathematical discourses. Cognition and Instruction, 27(1), 1–24.
  3. Blauberg, J. X., Sadovsky, V. N., & Yudin, E. G. (1977). Systems theory: Philosophical and methodological problems. Moscow: Progress Publishers.
  4. Borovoy, R., McDonald, M., Martin, F., & Resnick, M. (1996). Things that blink: Computationally augmented name tags. IBM Systems Journal, 35(3), 488–495.
  5. Boyer, C. (1959). The history of the calculus and its conceptual development (2nd ed.). Mineola: Dover Publications.
  6. Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (2000). How people learn. Washington, DC: National Academy Press.
  7. Brown, A. L. (1992). Design experiments: Theoretical and methodological challenges in creating complex interventions. The Journal of the Learning Sciences, 2, 137–178.
  8. Chen, D., & Stroup, W. (1993). General Systems Theory: Toward a conceptual framework for science and technology education for all. Journal of Science Education and Technology.
  9. Chi, M. T. H. (1992). Conceptual change within and across ontological categories: Examples from learning and discovery in science. In R. Giere (Ed.), Cognitive models of science: Minnesota studies in the philosophy of science (pp. 129–186). Minneapolis, MN: University of Minnesota Press.
  10. Chi, M. T. H. (2000). Misunderstanding emergent processes as causal. Paper presented at the annual conference of the American Educational Research Association, April 2000.
  11. Chi, M. T. H. (2005). Common sense conceptions of emergent processes: Why some misconceptions are robust. Journal of the Learning Sciences, 14, 161–199.
  12. Chi, M. T. H., Feltovich, P. J., & Glaser, R. (1981). Categorization and representation of physics problems by experts and novices. Cognitive Science, 5, 121–152.
  13. Chi, M. T. H., Roscoe, R. D., Slotta, J. D., Roy, M., & Chase, C. C. (2012). Misconceived causal explanations for emergent processes. Cognitive Science, 36(1), 1–61.
  14. Coakley, J. (2001). Using simulation methods to implement multi-person role-playing board game. http://www.bus.orst.edu/faculty/coakley/beergame.htm.
  15. Cobb, P., Confrey, J., diSessa, A., Lehrer, R., & Schauble, L. (2003). Design experiments in educational research. Educational Researcher, 32(1), 9–13.
  16. Colella, V. (1998). Participatory simulations: Building collaborative understanding through immersive dynamic modeling. Master's thesis, MIT.
  17. Colella, V., Borovoy, R., & Resnick, M. (1998). Participatory simulations: Using computational objects to learn about dynamic systems. Proceedings of the Computer Human Interface (CHI) '98 conference, Los Angeles, April 1998.
  18. Collier, N. T., & North, M. J. (2011). Repast SC++: A platform for large-scale agent-based modeling. In W. Dubitzky, K. Kurowski, & B. Schott (Eds.), Large-scale computing techniques for complex system simulations. New York: Wiley.
  19. Diehl, E. (1990). Participatory simulation software for managers: The design philosophy behind Microworld's creator. European Journal of Operations Research, 59, 203–209.
  20. diSessa, A. A., & Cobb, P. (2004). Ontological innovation and the role of theory in design experiments. Journal of the Learning Sciences, 13(1), 77–103.
  21. Doyle, J. K. (1997). The cognitive psychology of systems thinking. System Dynamics Review, 13(3), 253–266.
  22. Edelson, D. C. (2002). Design research: What we learn when we engage in design. Journal of the Learning Sciences, 11(1), 105–121.
  23. Forrester, J. W. (1961). Industrial dynamics. Waltham, MA: Pegasus Communications.
  24. Forrester, J. W. (1968). Principles of systems. Norwalk, CT: Productivity Press.
  25. Forrester, J. W. (1993). Systems dynamics as an organizing framework for pre-college education. System Dynamics Review, 9(2), 183–194.
  26. Foreman-Roe, S., & Bellinger, G. (2013). Insight Maker [Computer software]. San Jose, CA. Available from http://insightmaker.com/.
  27. Garigliano, L. (1975). SCIS: Children's understanding of the systems concept. School Science and Mathematics, 75, 245–249.
  28. Glaser, B. G. (1978). Theoretical sensitivity: Advances in the methodology of grounded theory. Mill Valley, CA: Sociology Press.
  29. Glaser, B. G. (1992). Basics of grounded theory analysis: Emergence vs. forcing. Mill Valley, CA: Sociology Press.
  30. Glaser, B. G., & Strauss, A. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine Publishing Company.
  31. Gobert, J., Horwitz, P., Tinker, B., Buckley, B., Wilensky, U., Levy, S., et al. (2003). Modeling across the curriculum: Scaling up modeling using technology. Proceedings of the twenty-fifth annual meeting of the Cognitive Science Society, July 31–August 2, Boston, MA.
  32. Hammer, D., & Berland, L. K. (2014). Confusing claims for data: A critique of common practices for presenting qualitative research on learning. Journal of the Learning Sciences, 23(1), 37–46.
  33. Hatano, G., & Inagaki, K. (1986). Two courses of expertise. In H. Stevenson, H. Azuma, & K. Hakuta (Eds.), Child development and education in Japan. New York: W.H. Freeman.
  34. Hegedus, S., & Kaput, J. (2003). Exciting new opportunities to make mathematics an expressive classroom activity using newly emerging connectivity technology. In N. A. Pateman, B. J. Dougherty, & J. Zilliox (Eds.), Proceedings of the 27th conference of the International Group for the Psychology of Mathematics Education held jointly with the 25th conference of the North American Chapter of the International Group for the Psychology of Mathematics Education (Vol. 1, p. 293). Honolulu, HI: College of Education, University of Hawaii.
  35. Hestenes, D. (1995). Modeling software for learning and doing physics. In C. Bernardini, C. Tarsitani, & M. Vicentini (Eds.), Thinking physics for teaching (pp. 25–65). New York: Plenum Press.
  36. Hmelo-Silver, C. E., & Pfeffer, M. G. (2004). Comparing expert and novice understanding of a complex system from the perspective of structures, behaviors, and functions. Cognitive Science, 28, 127–138.
  37. Jackson, S., Stratford, S., Krajcik, J., & Soloway, E. (1996). A learner-centered tool for students building models. Communications of the ACM, 39(4), 48–49.
  38. Jacobson, M., Kapur, M., Hyo-Jeong, S., & Lee, J. (2011). The ontologies of complexity and learning about complex systems. Instructional Science, 39(5), 763–783.
  39. Karplus, R. (1964). The science curriculum improvement study. Journal of Research in Science Teaching, 2, 293–303.
  40. Kay, A. (1991). Computer networks and education. Scientific American, 265(3), 138–148.
  41. Langton, C., & Burkhardt, G. (1997). Swarm. Santa Fe, NM: Santa Fe Institute.
  42. Lemke, J. L. (2000). Across the scales of time: Artifacts, activities, and meanings in ecosocial systems. Mind, Culture, and Activity, 7(4), 273–290.
  43. Levy, S. T., & Wilensky, U. (2008). Inventing a "mid level" to make ends meet: Reasoning between the levels of complexity. Cognition and Instruction, 26(1), 1–47.
  44. Luke, S., Cioffi-Revilla, C., Panait, L., Sullivan, K., & Balan, G. (2005). MASON: A multiagent simulation environment. Simulation: Transactions of the Society for Modeling and Simulation International, 81(7), 517–527.
  45. Mack, A. J. (2007). The role of mathematical aesthetic in network-supported generative design: A case study. Doctoral dissertation, The University of Texas at Austin.
  46. Mandinach, E. B., & Cline, H. F. (1994). Classroom dynamics: Implementing a technology-based learning environment. Hillsdale, NJ: Lawrence Erlbaum Associates.
  47. Mandinach, E. B., & Thorpe, E. T. (1987). The systems thinking and curriculum innovation project: Technical report, part 1 (TR-87). Cambridge, MA: Educational Technology Center, Harvard Graduate School of Education.
  48. Mettes, C. T. C. W. (1987). Factual and procedural knowledge: Learning to solve science problems. In H. Lodewijks, L. E. De Corte, R. Parmentier, & P. Span (Eds.), Learning and instruction: European research in an international context (pp. 285–295). Pergamon/Leuven: Pergamon Press/Leuven University Press.
  49. NRC (National Committee on Science Education Standards and Assessment). (1996). National science education standards. Washington, DC: National Academy Press.
  50. Ogborn, J., & Wong, D. (1984). A microcomputer dynamic modeling system. Physics Education, 19(3), 138–142.
  51. Ogborn, J. (1985). Understanding students' understandings: An example from dynamics. European Journal of Science Education, 7(2), 141–150.
  52. Ogborn, J. (1996). Science and the made world. Keynote address: Proceedings of the Science and Technology Education Conference, Hong Kong.
  53. Papert, S. (1980). Mindstorms: Children, computers, and powerful ideas. New York: Basic Books.
  54. Papert, S. (1998). Let's tie the digital knot. TECHNOS, 7(4), 10–12.
  55. Remmler, C., & Stroup, W. (2012). Implementing participatory activities using cloud-in-a-bottle computing. E.g., http://generative.edb.utexas.edu/apps/ or http://generative.edb.utexas.edu/presentations/TRC2013/STEMMin2013.html.
  56. Repenning, A. (1993). AgentSheets: A tool for building domain-oriented dynamic, visual environments. Doctoral dissertation, Department of Computer Science, University of Colorado, Boulder.
  57. Resnick, M. (1994). Turtles, termites and traffic jams: Explorations in massively parallel microworlds. Cambridge, MA: MIT Press.
  58. Resnick, M. (1996). Beyond the centralized mindset. Journal of the Learning Sciences, 5(1), 1–22.
  59. Resnick, M., & Wilensky, U. (1993). Beyond the deterministic, centralized mindsets: New thinking for new sciences. Presented at the annual conference of the American Educational Research Association, Atlanta, GA.
  60. Resnick, M., & Wilensky, U. (1998). Diving into complexity: Developing probabilistic decentralized thinking through role-playing activities. Journal of the Learning Sciences, 7(2), 153–171.
  61. Richmond, B., & Peterson, S. (1984). STELLA [Computer software]. Hanover, NH: High Performance Systems.
  62. Richmond, B., & Peterson, S. (1990). STELLA II [Computer software]. Hanover, NH: High Performance Systems, Inc.
  63. Roberts, N. (1978). Teaching dynamic feedback systems thinking: An elementary view. Management Science, 24(8), 836–843.
  64. Roberts, N., Anderson, D., Deal, R., Garet, M., & Shaffer, W. (1983). Introduction to computer simulations: A systems dynamics modeling approach. Reading, MA: Addison Wesley.
  65. Roschelle, J. (1992). Learning by collaborating: Convergent conceptual change. Journal of the Learning Sciences, 2(3), 235–276.
  66. Schoenfeld, A. H. (1992). On paradigms and methods: What do you do when the ones you know don't do what you want them to? Issues in the analysis of data in the form of videotapes. Journal of the Learning Sciences, 2, 179–214.
  67. Senge, P. M. (1990). The fifth discipline: The art and practice of the learning organization. London: Random House.
  68. Slotta, J. D. (2011). In defense of Chi's ontological incompatibility hypothesis. Journal of the Learning Sciences, 20(1), 151–162.
  69. Slotta, J. D., & Chi, M. T. H. (2006). The impact of ontology training on conceptual change: Helping students understand the challenging topics in science. Cognition and Instruction, 24, 261–289.
  70. Soloway, E., Morris, C., & Curtis, M. (2001). Making palm-sized computers the PC of choice for K-12. Learning and Leading with Technology, 28(7), 32–34.
  71. Starr, P. (1994). Seductions of Sim. The American Prospect, 5(17), 19–29.
  72. Steed, M. (1992). Stella, a simulation construction kit: Cognitive process and educational implications. Journal of Computers in Mathematics and Science Teaching, 11, 39–52.
  73. Stor, M., & Briggs, W. (1998). Dice and disease in the classroom. The Mathematics Teacher, 91(6), 464–468.
  74. Strauss, A. (1987). Qualitative research for social scientists. Cambridge: Cambridge University Press.
  75. Strauss, A., & Corbin, J. (1990). Basics of qualitative research. Thousand Oaks, CA: Sage.
  76. Stroup, W. (1993). What the development of non-universal understanding looks like: Results from a qualitative calculus assessment. Technical report. Cambridge, MA: Educational Technology Center, Harvard Graduate School of Education.
  77. Stroup, W. (1996). Embodying nominalist constructivism: Making graphical sense of learning the calculus of how much and how fast. Doctoral dissertation, Harvard Graduate School of Education, Cambridge, MA.
  78. Stroup, W. (1997). The root beer game. Calculator software.
  79. Stroup, W. M. (2002). Understanding qualitative calculus: A structural synthesis of learning research. International Journal of Computers for Mathematical Learning, 7(2), 167–215.
  80. Stroup, W. (2005). Learning the basics with calculus. Journal of Computers in Mathematics and Science Teaching, 24(2), 179–196.
  81. Stroup, W., & Wilensky, U. (2002). Participatory simulations guide. Evanston, IL: Center for Connected Learning and Computer-Based Modeling, Northwestern University (updated 2003, 2004, 2005).
  82. Urquhart, C. (2000). Strategies for conversation and systems analysis in requirements gathering: A qualitative view of analyst-client communication. The Qualitative Report, 4(1/2). http://www.nova.edu/ssss/QR/QR4-1/urquhart.html.
  83. Ventana Systems. (1995). VenSim [Computer software]. Harvard, MA. http://vensim.com.
  84. Wilensky, U. (1993). Connected mathematics: Building concrete relationships with mathematical knowledge. Doctoral dissertation, Media Laboratory, MIT, Cambridge, MA.
  85. Wilensky, U. (1995). Learning probability through building computational models. Proceedings of the nineteenth international conference on the Psychology of Mathematics Education, Recife, Brazil, July 1995.
  86. Wilensky, U. (1996). Modeling rugby: Kick first, generalize later? International Journal of Computers for Mathematical Learning, 1(1), 125–131.
  87. Wilensky, U. (1997). What is normal anyway? Therapy for epistemological anxiety. In R. Noss (Ed.), Educational Studies in Mathematics, special edition on computational environments in mathematics education, 33(2), 171–202.
  88. Wilensky, U. (1999a). NetLogo [Computer software]. Evanston, IL: Center for Connected Learning and Computer-Based Modeling, Northwestern University. Retrieved from http://ccl.northwestern.edu/netlogo.
  89. Wilensky, U. (1999b). GasLab—An extensible modeling toolkit for exploring micro- and macro-views of gases. In N. Roberts, W. Feurzeig, & B. Hunter (Eds.), Computer modeling and simulation in science education. Berlin: Springer.
  90. Wilensky, U. (2001). Modeling nature's emergent patterns with multi-agent languages. Proceedings of the EuroLogo 2001 conference, Linz, Austria.
  91. Wilensky, U. (2003). Statistical mechanics for secondary school: The GasLab modeling toolkit. International Journal of Computers for Mathematical Learning, 8(1), 1–4. [Special issue on agent-based modeling].
  92. Wilensky, U., & Jacobson, M. (in press). Complex systems in the learning sciences. In K. Sawyer (Ed.), Handbook of learning sciences (Vol. 2).
  93. Wilensky, U., & Reisman, K. (1998). Learning biology through constructing and testing computational theories—An embodied modeling approach. In Y. Bar-Yam (Ed.), Proceedings of the second international conference on complex systems. Nashua, NH: New England Complex Systems Institute.
  94. Wilensky, U., & Reisman, K. (2006). Thinking like a wolf, a sheep or a firefly: Learning biology through constructing and testing computational theories—An embodied modeling approach. Cognition and Instruction, 24(2), 171–209.
  95. Wilensky, U., & Resnick, M. (1999). Thinking in levels: A dynamic systems perspective to making sense of the world. Journal of Science Education and Technology, 8(1), 3–19.
  96. Wilensky, U., & Stroup, W. (1999a). Participatory simulations: Network-based design for systems learning in classrooms. Proceedings of the conference on Computer-Supported Collaborative Learning, CSCL '99, Stanford University.
  97. Wilensky, U., & Stroup, W. (1999b). NetLogo HubNet disease model [Computer software]. Evanston, IL: Center for Connected Learning and Computer-Based Modeling, Northwestern Institute on Complex Systems, Northwestern University. http://ccl.northwestern.edu/netlogo/models/HubNetDisease.
  98. Wilensky, U., & Stroup, W. (1999c). HubNet [Computer software]. Evanston, IL: Center for Connected Learning and Computer-Based Modeling, Northwestern University. http://ccl.northwestern.edu/netlogo.
  99. Wilensky, U., & Stroup, W. (2000). Networked gridlock: Students enacting complex dynamic phenomena with the HubNet architecture. Proceedings of the fourth annual international conference of the learning sciences, Ann Arbor, MI, June 14–17.
  100. Wilkerson-Jerde, M., & Wilensky, U. (2009). Complementarity in equational and agent-based models: A pedagogical perspective. Paper presented at Complexity, learning, and research: Under the microscope, new kinds of microscopes, and seeing differently. AERA 2009, San Diego, CA.
  101. Wilkerson-Jerde, M., & Wilensky, U. (2010). Seeing change in the world from different levels: Understanding the mathematics of complex systems. In M. Jacobson (Org.), U. Wilensky (Chair), & P. Reimann (Discussant), Learning about complexity and beyond: Theoretical and methodological implications for the learning sciences (Vol. 2, pp. 187–194). In K. Gomez & J. Radinsky (Eds.), Learning in the disciplines: Proceedings of the 9th international conference of the learning sciences (ICLS 2010), Chicago, IL, June 29–July 2.
  102. Zaraza, R., & Fisher, D. (1996). Experiences in developing single-discipline and cross-curricular models for classroom use. Cross Curricular Systems Thinking and Dynamics Using STELLA (CC-STADUS). http://www.teleport.com/~sguthrie/rondiana.html.

Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  1. The University of Texas at Austin, Austin, USA
  2. Northwestern University, Evanston, USA
