C2STEM: a System for Synergistic Learning of Physics and Computational Thinking

  • Nicole M. Hutchins (corresponding author)
  • Gautam Biswas
  • Miklós Maróti
  • Ákos Lédeczi
  • Shuchi Grover
  • Rachel Wolf
  • Kristen Pilner Blair
  • Doris Chin
  • Luke Conlin
  • Satabdi Basu
  • Kevin McElhaney

Abstract

Synergistic learning that combines computational thinking (CT) and STEM has proven to be an effective way to advance learning and understanding in a number of STEM domains while simultaneously helping students develop important CT concepts and practices. We adopt a design-based approach to develop, evaluate, and refine our Collaborative, Computational STEM (C2STEM) learning environment. The system adopts a novel paradigm that combines visual model building with a domain-specific modeling language (DSML) to scaffold the learning of high school physics through a computational modeling approach. In this paper, we discuss the design principles that guided the development of our open-ended learning environment (OELE), which uses a learning-by-modeling and evidence-centered approach to curriculum and assessment design. Students learn by building models that describe the motion of objects, and their learning is supported by scaffolded tasks and embedded formative assessments that introduce them to physics and CT concepts. We have also developed preparation for future learning (PFL) assessments to study students’ abilities to generalize and apply CT and science concepts and practices across problem-solving tasks and domains. We use mixed quantitative and qualitative analysis methods to analyze student learning during a semester-long study conducted in a high school physics classroom. We document lessons learned from this study and discuss directions for future work.
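To make the learning-by-modeling approach concrete, the sketch below shows, in ordinary Python rather than the visual DSML that students actually use in C2STEM, the kind of discrete-time update rule that drives a computational model of one-dimensional motion. The function name, parameters, and constant-acceleration scenario are illustrative assumptions for this sketch, not elements of the C2STEM environment.

# Illustrative sketch only: C2STEM students build this kind of model with
# visual blocks in a domain-specific modeling language, not in Python.
def simulate_constant_acceleration(x0=0.0, v0=0.0, a=9.8, dt=0.1, t_end=2.0):
    """Step a 1-D kinematics model forward in time and record its trajectory."""
    x, v, t = x0, v0, 0.0
    trajectory = [(t, x, v)]
    while t < t_end:
        # The update rules students typically express as blocks:
        # position changes by velocity * dt, velocity by acceleration * dt.
        x += v * dt
        v += a * dt
        t += dt
        trajectory.append((round(t, 10), x, v))
    return trajectory

if __name__ == "__main__":
    for t, x, v in simulate_constant_acceleration():
        print(f"t={t:4.1f} s  x={x:7.2f} m  v={v:6.2f} m/s")

Expressing the physics as a step-by-step update rule of this kind is what connects the disciplinary content (kinematics) to CT concepts such as variables, iteration, and state.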

Keywords

STEM+CT · Synergistic learning · Learning-by-modeling · Computational thinking · Evidence-centered design · Open-ended learning environment

Notes

Acknowledgments

We thank Dan Schwartz, Brian Broll, Justin Montenegro, Christopher Harris, Naveed Mohammed, Asif Hasan, Carol Tate, Shannon Campe, and Jill Denner for their assistance on this project. This project was supported by the National Science Foundation under Award DRL-1640199.

Compliance with Ethical Standards

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed Consent

Informed consent was obtained from all individual participants included in the study.

Copyright information

© Springer Nature B.V. 2019

Authors and Affiliations

  • Nicole M. Hutchins¹ (corresponding author)
  • Gautam Biswas¹
  • Miklós Maróti¹
  • Ákos Lédeczi¹
  • Shuchi Grover²
  • Rachel Wolf³
  • Kristen Pilner Blair³
  • Doris Chin³
  • Luke Conlin⁴
  • Satabdi Basu⁵
  • Kevin McElhaney⁵

  1. Vanderbilt University, Nashville, USA
  2. Looking Glass Ventures, Palo Alto, USA
  3. Stanford University, Stanford, USA
  4. Salem State University, Salem, USA
  5. SRI International, Menlo Park, USA