
Designing Virtual Laboratories to Foster Knowledge Integration: Buoyancy and Density

  • Jonathan M. Vitale
  • Marcia C. Linn
Chapter

Abstract

In this chapter, we report on the iterative development of an online instructional unit featuring virtual laboratory activities that target the physical science concepts of density and buoyancy. We introduce a virtual laboratory activity designed to facilitate exploration of the relationship of mass and volume to buoyancy. We evaluate the virtual laboratory by measuring the extent to which it fosters meaningful experimentation, appropriate interpretation of evidence, and discovery of new ideas. In the first revision, we simplified the exploratory tools. This revision supported better interpretation of evidence related to a specific claim, but limited the potential for discovery of new ideas. In the second revision, we introduced an intuitive graph-based interface that allowed students to specify and rapidly test properties of virtual materials (i.e., mass and volume). This revision facilitated meaningful exploration of students' ideas, thereby supporting both valid interpretations of evidence related to false claims and discovery of new ideas. We discuss the role that virtual laboratories can play in the design of all laboratory activities by tracking student strategies and offering opportunities to easily test new features.

Keywords

Knowledge integration · Authentic inquiry · Reflection · Discovery · Platform · Design-based research


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Jonathan M. Vitale¹
  • Marcia C. Linn¹
  1. Graduate School of Education, University of California, Berkeley, USA
