
Technology, Knowledge and Learning, Volume 20, Issue 1, pp 27–41

Virtual Experiments or Worked Examples? How to Learn the Control of Variable Strategy

  • Shiyu Liu
Work-in-progress

Abstract

This research investigates the role of virtual experiments and worked examples in learning the control of variable strategy (CVS). Sixty-nine seventh-grade students participated in this study over a span of six weeks, engaging in worked-example learning and/or virtual experimentation to study the knowledge and procedures associated with CVS. The results show that learning from worked examples alone or from virtual experiments alone yields minimal knowledge gain on CVS. While worked examples can be effective learning tools, additional instructional support and hands-on experience are critical for eliciting a deeper understanding of CVS. Virtual experimentation alone is likewise insufficient, given the cognitive demands placed on students who must discover CVS on their own. When integrated with worked-example learning, however, conducting virtual experiments with computer visualizations can lead to positive learning outcomes. This work contributes to the growing body of research on productive failure and opens up further discussion of optimal strategies for learning CVS.
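As a concrete illustration (not drawn from the article itself), the core of CVS is recognizing when a comparison between two experimental setups is a fair test: exactly one variable may differ, so that any difference in outcome can be attributed to that variable. The sketch below encodes this check; the ramp-experiment variables and values are hypothetical, chosen only for illustration.

```python
def is_controlled_contrast(setup_a, setup_b):
    """Return True when the two setups form a valid CVS comparison,
    i.e. exactly one variable differs between them. If zero variables
    differ, nothing is tested; if two or more differ, the comparison
    is confounded."""
    differing = [k for k in setup_a if setup_a[k] != setup_b[k]]
    return len(differing) == 1

# Hypothetical ramp experiment (variables invented for illustration):
a = {"surface": "smooth", "steepness": "high", "ball": "rubber"}
b = {"surface": "rough",  "steepness": "high", "ball": "rubber"}
c = {"surface": "rough",  "steepness": "low",  "ball": "golf"}

print(is_controlled_contrast(a, b))  # True: only 'surface' differs
print(is_controlled_contrast(a, c))  # False: three variables differ
```

The first comparison isolates the effect of the ramp surface; the second is confounded, since surface, steepness, and ball type all change at once.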

Keywords

Control of variable strategy · Worked examples · Computer visualization · Virtual experiments


Acknowledgments

Thanks to Dr. Keisha Varma and Dr. Gillian Roehrig for their helpful feedback on an earlier version of this paper, and to the participants for their enthusiastic cooperation.


Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

Leonhard Center, Pennsylvania State University, University Park, USA
