Providing guidance in virtual lab experimentation: the case of an experiment design tool

Abstract

The present study employed a quasi-experimental design to assess a computer-based tool intended to scaffold the task of designing experiments when experimenting with a virtual lab. In particular, we assessed the impact of this tool on primary school students’ cognitive processes and inquiry skills, using pre- and post-tests administered before and after the study’s treatment. Our research design involved a group of students who used the computer-based tool to design the study’s experiments (experimental condition) and a group of students who used a paper-and-pencil worksheet as a scaffold to design the same experiments (control condition). The primary finding was that the computer-based experiment design tool had a more positive effect than the paper-and-pencil worksheet on students’ inquiry skills related to identifying variables and designing investigations. This might be attributed to functionalities provided only by the computer-based tool, which focused students’ attention on crucial aspects of designing experiments by (1) maintaining the values of constant variables when planning experimental trials and (2) providing instant feedback when classifying variables as independent, dependent and controlled. Moreover, students in the two conditions displayed differing patterns of interactions among cognitive processes and inquiry skills. Implications for designing and assessing similar computer-based scaffolds are discussed.

Fig. 1
Fig. 2

Notes

  1. Previous research has largely conceptualized structuring and problematizing strategies of software scaffolds as distinct, that is, a scaffold could employ one type or the other (e.g., Kukkonen et al. 2016; Molenaar et al. 2011). Our design incorporated structuring and problematizing strategies in one and the same software scaffold.

  2. The option of “fading” would mean that the Go-Lab EDT cannot be conceived as a tool allowing for “working smart” within the frame of distributed intelligence (see the rationale introduced by Pea 2004, p. 443). However, certain functionalities of the Go-Lab EDT, namely the sequential arrangement in planning experimental designs, the table for classifying variables and the provision of the values for constant variables, might be considered aspects of “working smart”: these functionalities could enhance the work of an experienced user of the tool without affecting his/her ability to design a valid experiment.

References

  1. Anderson, L. W., & Krathwohl, D. R. (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom’s taxonomy of educational objectives. New York: Addison Wesley Longman.

  2. Arnold, J. C., Kremer, K., & Mayer, J. (2014). Understanding students’ experiments—What kind of support do they need in inquiry tasks? International Journal of Science Education, 36, 2719–2749.

  3. Bloom, B. S. (1956). Taxonomy of educational objectives. Handbook I: The cognitive domain. New York: David McKay.

  4. Burns, J., Okey, J., & Wise, K. (1985). Development of an integrated process skill test: TIPS II. Journal of Research in Science Teaching, 22, 169–177.

  5. Chang, K. E., Chen, Y. L., Lin, H. Y., & Sung, Y. T. (2008). Effects of learning support in simulation-based physics learning. Computers & Education, 51, 1486–1498.

  6. Clarke, T., Ayres, P., & Sweller, J. (2005). The impact of sequencing and prior knowledge on learning mathematics through spreadsheet applications. Educational Technology Research and Development, 53, 15–24.

  7. De Backer, L., Van Keer, H., & Valcke, M. (2016). Eliciting reciprocal peer-tutoring groups’ metacognitive regulation through structuring and problematizing scaffolds. The Journal of Experimental Education, 84, 804–828.

  8. De Boer, G. E., Quellmalz, E. S., Davenport, J. L., Timms, M. J., Herrmann-Abell, C. F., Buckley, B. C., et al. (2014). Comparing three online testing modalities: Using static, active, and interactive online testing modalities to assess middle school students’ understanding of fundamental ideas and use of inquiry skills related to ecosystems. Journal of Research in Science Teaching, 51, 523–554.

  9. de Jong, T. (2006). Computer simulations—Technological advances in inquiry learning. Science, 312, 532–533.

  10. de Jong, T. (Ed.). (2014). Preliminary inquiry classroom scenarios and guidelines. D1.3. Go-Lab Project (Global Online Science Labs for Inquiry Learning at School). Retrieved from http://www.go-lab-project.eu/sites/default/files/files/deliverable/file/Go-Lab%20D1.3.pdf.

  11. de Jong, T., Sotiriou, S., & Gillet, D. (2014). Innovations in STEM education: The Go-Lab federation of online labs. Smart Learning Environments, 1, 1–16.

  12. de Jong, T., Weinberger, A., van Joolingen, W. R., Ludvigsen, S., Ney, M., Girault, I., et al. (2012). Designing complex and open learning environments based on scenarios. Educational Technology Research and Development, 60, 883–901.

  13. Experiment Design Tool. (2015). http://www.golabz.eu/apps/experiment-design-tool. Accessed 25 January 2018.  

  14. Furtak, E. M. (2006). The problem with answers: An exploration of guided scientific inquiry teaching. Science Education, 90, 453–466.

  15. Glaser, R., Schauble, L., Raghavan, K., & Zeitz, C. (1992). Scientific reasoning across different domains. In E. de Corte, M. Linn, H. Mandl, & L. Verschaffel (Eds.), Computer-based learning environments and problem solving (pp. 345–373). Berlin: Springer.

  16. Go-Lab – Learning by Experience. (2015). http://www.go-lab-project.eu/. Accessed 25 January 2018.

  17. Go-Lab Sharing and Authoring Platform. (2015). http://www.golabz.eu/. Accessed 25 January 2018.

  18. Hardy, I., Jonen, A., Möller, K., & Stern, E. (2006). Effects of instructional support within constructivist learning environments for elementary school students’ understanding of “floating and sinking”. Journal of Educational Psychology, 98, 307–326.

  19. Havu-Nuutinen, S. (2005). Examining young children’s conceptual change process in floating and sinking from a social constructivist perspective. International Journal of Science Education, 27, 259–279.

  20. Heron, P. R. L., Loverude, M. E., Shaffer, P. S., & McDermott, L. C. (2003). Helping students develop an understanding of Archimedes’ principle. II. Development of research-based instructional materials. American Journal of Physics, 71, 1188–1195.

  21. Hofstein, A., & Lunetta, V. N. (2004). The laboratory in science education: Foundations for the twenty-first century. Science Education, 88, 28–54.

  22. Hofstein, A., Navon, O., Kipnis, M., & Mamlok-Naaman, R. (2005). Developing students’ ability to ask more and better questions resulting from inquiry-type chemistry laboratories. Journal of Research in Science Teaching, 42, 791–806.

  23. Hsin, C.-T., & Wu, H.-K. (2011). Using scaffolding strategies to promote young children’s scientific understandings of floating and sinking. Journal of Science Education and Technology, 20, 656–666.

  24. Inquiry Learning Space on Relative Density. (2015). http://graasp.eu/ils/546b3398e9934012b7c65c65/?lang=el (in Greek). Accessed 25 January 2018. 

  25. Kalyuga, S. (2007). Expertise reversal effect and its implications for learner-tailored instruction. Educational Psychology Review, 19, 509–539.

  26. Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41, 75–86.

  27. Klahr, D., & Nigam, M. (2004). The equivalence of learning paths in early science instruction: Effects of direct instruction and discovery learning. Psychological Science, 15, 661–667.

  28. Kremer, K., Specht, C., Urhahne, D., & Mayer, J. (2014). The relationship in biology between the nature of science and scientific inquiry. Journal of Biological Education, 48, 1–8.

  29. Kukkonen, J., Dillon, P., Kärkkäinen, S., Hartikainen-Ahia, A., & Keinonen, T. (2016). Pre-service teachers’ experiences of scaffolded learning in science through a computer supported collaborative inquiry. Education and Information Technologies, 21(2), 349–371. https://doi.org/10.1007/s10639-014-9326-8.

  30. Lin, X., & Lehman, J. D. (1999). Supporting learning of variable control in a computer-based biology environment: Effects of prompting college students to reflect on their own thinking. Journal of Research in Science Teaching, 36, 837–858.

  31. Loucks-Horsley, S., & Olson, S. (Eds.). (2000). Inquiry and the national science education standards: A guide for teaching and learning. Washington, DC: National Academies Press.

  32. Loverude, M. E., Kautz, C. H., & Heron, P. R. L. (2003). Helping students develop an understanding of Archimedes’ principle. I. Research on student understanding. American Journal of Physics, 71, 1178–1187.

  33. Marschner, J., Thillmann, H., Wirth, J., & Leutner, D. (2012). Wie lässt sich die Experimentierstrategie-Nutzung fördern? Ein Vergleich verschieden gestalteter Prompts [How can the use of experimentation strategies be fostered? A comparison of differently designed prompts]. Zeitschrift für Erziehungswissenschaft, 15, 77–93.

  34. Meindertsma, H. B., van Dijk, M. W. G., Steenbeek, H. W., & van Geert, P. L. C. (2014). Stability and variability in young children’s understanding of floating and sinking during one single-task session. Mind, Brain, and Education, 8, 149–158.

  35. Minner, D. D., Jurist Levy, A., & Century, J. (2010). Inquiry-based science instruction—What is it and does it matter? Results from a research synthesis years 1984–2002. Journal of Research in Science Teaching, 47, 474–496.

  36. Molenaar, I., van Boxtel, C. A. M., & Sleegers, P. J. C. (2010). The effects of scaffolding metacognitive activities in small groups. Computers in Human Behavior, 26, 1727–1738.

  37. Molenaar, I., van Boxtel, C. A. M., & Sleegers, P. J. C. (2011). Metacognitive scaffolding in an innovative learning arrangement. Instructional Science, 39, 785–803.

  38. Molenaar, I., Sleegers, P., & van Boxtel, C. (2014). Metacognitive scaffolding during collaborative learning: A promising combination. Metacognition and Learning, 9, 309–332.

  39. Pea, R. D. (2004). The social and technological dimensions of scaffolding and related theoretical concepts for learning, education, and human activity. Journal of the Learning Sciences, 13, 423–451.

  40. Pedaste, M., Mäeots, M., Siiman, L. A., de Jong, T., van Riesen, S. A., Kamp, E. T., et al. (2015). Phases of inquiry-based learning: Definitions and the inquiry cycle. Educational Research Review, 14, 47–61.

  41. Pollock, E., Chandler, P., & Sweller, J. (2002). Assimilating complex information. Learning and Instruction, 12, 61–86.

  42. Quintana, C., Reiser, B. J., Davis, E. A., Krajcik, J., Fretz, E., Duncan, R. G., et al. (2004). A scaffolding design framework for software to support science inquiry. Journal of the Learning Sciences, 13, 337–386.

  43. Rappolt-Schlichtmann, G., Tenenbaum, H. R., Koepke, M. F., & Fischer, K. W. (2007). Transient and robust knowledge: Contextual support and the dynamics of children’s reasoning about density. Mind, Brain, and Education, 1, 98–108.

  44. Reiser, B. J. (2004). Scaffolding complex learning: The mechanisms of structuring and problematizing student work. Journal of the Learning Sciences, 13, 273–304.

  45. Saye, J. W., & Brush, T. (2002). Scaffolding critical reasoning about history and social issues in multimedia-supported learning environments. Educational Technology Research and Development, 50, 77–96.

  46. Simons, K. D., & Klein, J. D. (2007). The impact of scaffolding and student achievement levels in a problem-based learning environment. Instructional Science, 35, 41–72.

  47. Splash: Virtual Buoyancy Laboratory. (2015). http://www.golabz.eu/lab/splash-virtual-buoyancy-laboratory. Accessed 25 January 2018.

  48. Tschirgi, J. E. (1980). Sensible reasoning: A hypothesis about hypotheses. Child Development, 51, 1–10.

  49. van Joolingen, W. R., Giemza, A., Bollen, L., Bodin, M., Manske, S., Engler, J., et al. (2011). SCY cognitive scaffolds and tools (DII.2). Twente: SCY Consortium.

  50. van Joolingen, W. R., & Zacharia, Z. C. (2009). Developments in inquiry learning. In N. Balacheff, S. Ludvigsen, T. de Jong, A. Lazonder, & S. Barnes (Eds.), Technology-enhanced learning: A kaleidoscope view (pp. 21–37). Dordrecht: Springer.

  51. Veermans, K., van Joolingen, W. R., & de Jong, T. (2006). Use of heuristics to facilitate scientific discovery learning in a simulation learning environment in a physics domain. International Journal of Science Education, 28(4), 341–361.

  52. Veenman, M. V. J., van Hout-Wolters, B. H. A. M., & Afflerbach, P. (2006). Metacognition and learning: Conceptual and methodological considerations. Metacognition and Learning, 1, 3–14.

  53. Wood, D., Bruner, J., & Ross, G. (1976). The role of tutoring in problem solving. Journal of Child Psychology and Psychiatry and Allied Disciplines, 17, 89–100.

  54. Zacharia, Z. C. (2015). Examining whether touch sensory feedback is necessary for science learning through experimentation: A literature review of two different lines of research across K-16. Educational Research Review, 16, 116–137.

  55. Zacharia, Z. C., & de Jong, T. (2014). The effects on students’ conceptual understanding of electric circuits of introducing virtual manipulatives within a physical manipulatives-oriented curriculum. Cognition and Instruction, 32(2), 101–158.

  56. Zacharia, Z. C., Manoli, C., Xenofontos, N., de Jong, T., Pedaste, M., van Riesen, S. A., et al. (2015). Identifying potential types of guidance for supporting student inquiry when using virtual and remote labs in science: A literature review. Educational Technology Research and Development, 63(2), 257–302.

  57. Zervas, P. (Ed.). (2013). The Go-Lab inventory and integration of online labs: Labs offered by large scientific organisations. D2.1. Go-Lab Project (Global Online Science Labs for Inquiry Learning at School). Retrieved from http://www.go-lab-project.eu/sites/default/files/files/deliverable/file/Go-Lab-D2.1.pdf.

Acknowledgements

The authors thank Dr. Anjo Anjewierden and Ms. Siswa A. N. van Riesen for designing and developing the Splash virtual lab and the Experiment Design Tool. This study was conducted in the context of the research project Global Online Science Labs for Inquiry Learning at School (Go-Lab), which is funded by the European Community under the Information and Communication Technologies (ICT) theme of the 7th Framework Programme for R&D (Grant Agreement No.: 317601).

Author information

Corresponding author

Correspondence to Zacharias C. Zacharia.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Appendices

Appendix 1: Experimental design by the control group

Appendix 2: Cognitive processes test

Appendix 3: Inquiry skills test [for each item, its corresponding item number in the TIPSII test by Burns et al. (1985) is given in parentheses]

  1. (2) A study of car efficiency is done. The hypothesis tested is that a gasoline additive will increase car efficiency. Five identical cars each receive the same amount of gasoline but with different amounts of Additive A. They travel the same track until they run out of gasoline. The research team records the number of kilometers each car travels. How is car efficiency measured in this study?

     A. The time it takes each car to run out of gasoline.

     B. The distance each car travels.

     C. The amount of gasoline used.

     D. The amount of Additive A used.

  2. (6) A police chief is concerned about reducing the speed of cars. He thinks several factors may affect automobile speed. Which of the following is a hypothesis he could test about how fast people drive?

     A. If the drivers are younger, then they are likely to drive faster.

     B. If more cars are involved in an accident, then people are less likely to get hurt.

     C. If more policemen are on patrol, then the number of car accidents will be fewer.

     D. If the cars are older, then they are likely to be in more accidents.

  3. (10) Jim thinks that if there is more air pressure in a basketball, then it will bounce higher. To investigate this hypothesis he collects several basketballs and an air pump with a pressure gauge. How should Jim test his hypothesis?

     A. Bounce basketballs with different amounts of force from the same height.

     B. Bounce basketballs having different air pressures from the same height.

     C. Bounce basketballs having the same air pressure at different angles from the floor.

     D. Bounce basketballs having the same amount of air pressure from different heights.

  4. (26) A biologist tests this hypothesis: the greater the amount of vitamins given to rats, the faster they will grow. How can the biologist measure how fast rats will grow?

     A. Measure the speed of the rats.

     B. Measure the amount of exercise the rats receive.

     C. Weigh the rats every day.

     D. Weigh the amount of vitamins the rats will eat.

  5. (21) A greenhouse manager wants to speed up the production of tomato plants to meet the demands of anxious gardeners. She plants tomato seeds in several trays. Her hypothesis is that the more moisture seeds receive, the faster they sprout. How can she test this hypothesis?

     A. Count the number of days it takes seeds receiving different amounts of water to sprout.

     B. Measure the height of the tomato plants a day after each watering.

     C. Measure the amount of water used by plants in different trays.

     D. Count the number of tomato seeds placed in each of the trays.

  6. (23) Lisa wants to measure the amount of heat energy a flame will produce in a certain amount of time. A burner will be used to heat a beaker containing a liter of cold water for ten minutes. How will Lisa measure the amount of heat energy produced by the flame?

     A. Note the change in water temperature after ten minutes.

     B. Measure the volume of water after ten minutes.

     C. Measure the temperature of the flame after ten minutes.

     D. Calculate the time it takes for the liter of water to boil.

  7. (27) Some students are considering variables that might affect the time it takes for sugar to dissolve in water. They identify the temperature of the water, the amount of sugar and the amount of water as variables to consider. What is a hypothesis the students could test about the time it takes for sugar to dissolve in water?

     A. If the amount of sugar is larger, then more water is required to dissolve it.

     B. If the water is colder, then it has to be stirred faster to dissolve.

     C. If the water is warmer, then more sugar will dissolve.

     D. If the water is warmer, then it takes the sugar more time to dissolve.

A study was done to see if leaves added to soil had an effect on tomato production. Tomato plants were grown in four large tubs. Each tub had the same kind and amount of soil. One tub had 15 kg of rotted leaves mixed in the soil and a second had 10 kg. A third tub had 5 kg and the fourth had no leaves added. Each tub was kept in the sun and watered the same amount. The number of kilograms of tomatoes produced in each tub was recorded.

  8. (30) What is a controlled variable in this study?

     A. Amount of tomatoes produced in each tub.

     B. Amount of leaves added to the tubs.

     C. Amount of soil in each tub.

     D. Number of tubs receiving rotted leaves.

  9. (31) What is the dependent or responding variable?

     A. Amount of tomatoes produced in each tub.

     B. Amount of leaves added to the tubs.

     C. Amount of soil in each tub.

     D. Number of tubs receiving rotted leaves.

  10. (32) What is the independent or manipulated variable?

     A. Amount of tomatoes produced in each tub.

     B. Amount of leaves added to the tubs.

     C. Amount of soil in each tub.

     D. Number of tubs receiving rotted leaves.

  11. (35) Ann has an aquarium in which she keeps goldfish. She notices that the fish are very active sometimes but not at others. She wonders what affects the activity of the fish. What is a hypothesis she could test about factors that affect the activity of the fish?

     A. If you feed fish more, then the fish will become larger.

     B. If the fish are more active, then they will need more food.

     C. If there is more oxygen in the water, then the fish will become larger.

     D. If there is more light on the aquarium, then the fish will be more active.

  12. (36) Mr. Bixby has an all-electric house and is concerned about his electricity bill. He decides to study factors that affect how much electrical energy he uses. Which variable might influence the amount of electrical energy used?

     A. The amount of television the family watches.

     B. The location of the electricity meter.

     C. The number of baths taken by family members.

     D. A and C.

About this article

Cite this article

Efstathiou, C., Hovardas, T., Xenofontos, N.A. et al. Providing guidance in virtual lab experimentation: the case of an experiment design tool. Education Tech Research Dev 66, 767–791 (2018). https://doi.org/10.1007/s11423-018-9576-z

Keywords

  • Experimental design
  • Science education
  • Virtual labs
  • Inquiry skills