Connecting the STEM dots: measuring the effect of an integrated engineering design intervention

  • Paul R. Hernandez
  • Ralph Bodin
  • Jonathan W. Elliott
  • Badaruddin Ibrahim
  • Karen E. Rambo-Hernandez
  • Thomas W. Chen
  • Michael A. de Miranda
Article

Abstract

Recent publications have elevated the priority of increasing the integration of Science, Technology, Engineering, and Mathematics (STEM) content in K-12 education. The STEM education community must invest in the development of valid and reliable scales to measure STEM content, knowledge fusion, and perceptions of the nature of STEM. This brief report discusses the development of an instrument to measure student perceptions of the interdependent nature of STEM content knowledge in the context of a complex classroom intervention implemented in five Colorado high schools (N = 275). Specifically, cross-functional science, technology, engineering, and mathematics teams of high school students were formed to complete engineering design problems. Exploratory (pretest) and confirmatory (posttest) factor analyses indicated that a newly adapted scale measuring student perceptions of the interdependent nature of STEM content knowledge possessed adequate model fit. Furthermore, the analysis revealed a novel pattern of results for the intervention. Specifically, students with initially high perceptions of the interdependent nature of STEM sustained their high perceptions at posttest, whereas students with initially low perceptions exhibited statistically significant positive gains from pretest to posttest. Therefore, this intervention may work best with students who are at risk of losing interest in STEM disciplines. The implications of these research findings are discussed.
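For readers who want to run a comparable measurement check on their own survey data, the sketch below illustrates the pretest/posttest analysis sequence the abstract describes: an exploratory factor analysis (EFA) of pretest item responses followed by a confirmatory factor analysis (CFA) of posttest responses. This is not the authors' analysis code; the file names, item names, single-factor specification, and choice of the Python packages factor_analyzer and semopy are illustrative assumptions.

    # Minimal sketch, assuming Likert-type items on the perceived
    # interdependence of STEM content knowledge stored in two CSV files.
    import pandas as pd
    from factor_analyzer import FactorAnalyzer   # EFA
    from semopy import Model, calc_stats         # CFA and fit indices

    pretest = pd.read_csv("pretest_items.csv")    # hypothetical: item1 ... item8
    posttest = pd.read_csv("posttest_items.csv")  # same items at posttest

    # EFA on the pretest: extract a single factor and inspect the loadings.
    efa = FactorAnalyzer(n_factors=1, rotation=None)
    efa.fit(pretest)
    print("EFA loadings:\n", efa.loadings_)

    # CFA on the posttest: one latent factor indicated by all items.
    cfa_spec = "InterdependentSTEM =~ " + " + ".join(posttest.columns)
    cfa = Model(cfa_spec)
    cfa.fit(posttest)
    print(calc_stats(cfa))  # fit indices (e.g., CFI, RMSEA) to judge model fit

A workflow like this mirrors the report's logic of first exploring the factor structure on one administration of the scale and then confirming it, with formal fit indices, on a second administration.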

Keywords

Integrated STEM education · Field study · Intervention · Engineering design problem

Notes

Acknowledgments

This program is based upon collaborative work supported by National Science Foundation Grant No. 0841259; Colorado State University; Thomas W. Chen, Principal Investigator; Michael A. de Miranda and Stuart Tobet, Co-Principal Investigators. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.

Copyright information

© Springer Science+Business Media Dordrecht 2013

Authors and Affiliations

  • Paul R. Hernandez (1)
  • Ralph Bodin (1)
  • Jonathan W. Elliott (1)
  • Badaruddin Ibrahim (2)
  • Karen E. Rambo-Hernandez (1)
  • Thomas W. Chen (3)
  • Michael A. de Miranda (3)

  1. School of Education, Colorado State University, Fort Collins, USA
  2. University Tun Hussein Onn Malaysia, Batu Pahat, Malaysia
  3. College of Engineering, Colorado State University, Fort Collins, USA