Exploring Force and Motion Concepts in Middle Grades Using Computational Modeling: A Classroom Intervention Study

  • Osman Aksit
  • Eric N. Wiebe


Abstract

Computational thinking (CT) and modeling are authentic practices that scientists and engineers use frequently in their daily work. Advances in computing technologies have further emphasized the centrality of modeling in science by making computationally enabled model use and construction more accessible to scientists. As such, it is important for all students to be exposed to these practices in K-12 science classrooms. This study investigated how a week-long intervention in a regular middle school science classroom, which introduced CT and simulation-based model building through block-based programming, influenced students’ learning of CT and of force and motion concepts. Eighty-two seventh-grade students from a public middle school participated in the study. Quantitative data sources included pre- and post-assessments of students’ understanding of force and motion concepts and of their CT abilities. Qualitative data sources included classroom observation notes, student interviews, and students’ reflection statements. During the intervention, students were introduced to CT using block-based programming and engaged in constructing simulation-based computational models of physical phenomena. The findings indicated that building computational models produced significant conceptual learning gains for this sample. The dynamic nature of computational models allowed students to both observe and interact with the target phenomenon in real time, while the generative dimension of model construction promoted a rich classroom discourse that facilitated conceptual learning. This study contributes to the nascent literature on integrating CT into K-12 science curricula by emphasizing the affordances and generative dimension of model construction through block-based programming.
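The simulation-based models described above center on the core force-and-motion relationships (Newton's second law and step-by-step velocity and position updates) that block-based environments let students express. As an illustration only, the sketch below shows in Python the kind of update loop such a student model might compute; the function name and parameters are hypothetical, not taken from the study's materials.

```python
# Hypothetical sketch of a simulation-based computational model of motion:
# a constant force acts on an object, and state is advanced in small time steps,
# mirroring the repeated "change velocity, then change position" logic of a
# block-based program.

def simulate(force, mass, dt, steps):
    """Advance position and velocity with simple semi-implicit Euler updates."""
    x, v = 0.0, 0.0
    a = force / mass          # Newton's second law: a = F / m
    for _ in range(steps):
        v += a * dt           # velocity changes in proportion to acceleration
        x += v * dt           # position changes in proportion to velocity
    return x, v

# Example: a 10 N force on a 2 kg object for 1 s, in 1000 small steps.
x, v = simulate(force=10.0, mass=2.0, dt=0.001, steps=1000)
print(v, x)                   # v reaches a*t = 5.0 m/s; x approaches ½·a·t² ≈ 2.5 m
```

Because the loop makes each physical relationship an explicit, editable rule, running it with different forces or masses is the computational analogue of the real-time observation and interaction the intervention emphasized.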


Keywords: Computational thinking · Modeling · Block-based programming · Force and motion · Middle school · Science


Compliance with Ethical Standards

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical Approval

All procedures performed in this study were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. This article does not contain any studies with animals performed by any of the authors.

Informed Consent

Informed consent was obtained from all individual participants included in the study.



Copyright information

© Springer Nature B.V. 2019

Authors and Affiliations

  1. Dhahran Ahliyya Schools, Dhahran, Saudi Arabia
  2. Department of STEM Education, North Carolina State University, Raleigh, USA
