Abstract
Integrated Science, Technology, Engineering, and Mathematics (STEM) teaching provides an opportunity for students to learn STEM knowledge across two or more domains. This study presents students’ STEM content knowledge achievement after learning an integrated STEM unit taught collaboratively by science and engineering technology teachers. After completing a two-week teacher professional development workshop, science and engineering technology teachers implemented an exemplar STEM unit called D-BAIT. The integrated STEM unit included entomology, biology, biomimicry, physics, and engineering design content. The researchers constructed a multiple-choice STEM knowledge pre/post-test to assess students’ understanding of these concepts. This study employed a quasi-experimental nonequivalent comparison group design and collected a total of 1,345 pre/post-test assessments. The data were analyzed using independent samples t-tests. The results indicate that the integrated STEM unit, implemented through teacher collaboration, increased students’ overall STEM content knowledge. The comparison of science and engineering students’ knowledge gains showed that the integrated STEM unit significantly impacted students’ content knowledge. Comparisons between within-domain and cross-domain knowledge in science and engineering content found no significant differences; however, the mean score gain in cross-domain knowledge was higher than within the subject domain. These results indicate that students can learn domain content outside of their course of study.
Data availability
The datasets analyzed during the current study are not publicly available due to the confidentiality assurances made in the consent forms issued to participants. The learning materials used in this study are all available at https://www.purdue.edu/trails/.
Code availability
Not applicable.
References
McSpadden, M., & Kelley, T. (2012). Engineering design: Diverse design teams to solve real-world problem. Technology and Engineering Teacher, 72(1), 17-21.
Kelley, T., & Knowles, J. G. (2016). A conceptual framework for integrated STEM education. International Journal of STEM Education, 3(11). https://doi.org/10.1186/s40594-016-0046-z
Kelley, T. R., Knowles, J. G., Holland, J. D., & Han, J. (2020). Increasing high school teachers self-efficacy for integrated STEM instruction through a collaborative community of practice. International Journal of STEM Education, 7(14). https://doi.org/10.1186/s40594-020-00211-w
Han, J., Kelley, T., & Knowles, J. G. (2021). Factors influencing student STEM learning: Self-efficacy and outcome expectancy, 21st century skills, and career awareness. Journal for STEM Education Research, 4(2), 117-137. https://doi.org/10.1007/s41979-021-00053-3
Indiana Department of Education [INDOE] (2015). Data center & Reports.
Apedoe, X. S., Reynolds, B., Ellefson, M. R., & Schunn, C. D. (2008). Bringing engineering design into high school science classrooms: The heating/cooling unit. Journal of science education and technology, 17(5), 454–465. https://doi.org/10.1007/s10956-008-9114-6
Ary, D., Jacobs, L. C., Irvine, C. K. S., & Walker, D. (2018). Introduction to research in education. Boston, MA: Cengage Learning
Banilower, E. R. (2019). Understanding the big picture for science teacher education: The 2018 NSSME+. Journal of Science Teacher Education, 30(3), 201–208. https://doi.org/10.1080/1046560X.2019.1591920
Baumann, M. R., & Bonner, B. L. (2017). An Expectancy Theory Approach to Group Coordination: Expertise, Task Features, and Member Behavior. Journal of Behavioral Decision Making, 30(2), 407–419. https://doi.org/10.1002/bdm.1954
Bell, S. (2010). Project-Based Learning for the 21st Century: Skills for the Future. The Clearing House: A Journal of Educational Strategies Issues and Ideas, 83(2), 39–43. https://doi.org/10.1080/00098650903505415
Berland, L., Steingut, R., & Ko, P. (2014). High school student perceptions of the utility of the engineering design process: Creating opportunities to engage in engineering practices and apply math and science content. Journal of Science Education and Technology, 23(6), 705–720. https://doi.org/10.1007/s10956-014-9498-4
Boyer, S. J., & Bishop, P. A. (2004). Young adolescent voices: Students’ perceptions of interdisciplinary teaming. RMLE Online, 28(1), 1–19. https://doi.org/10.1080/19404476.2004.11658176
Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational researcher, 18(1), 32–42
Brown, W. (1910). Some experimental results in the correlation of mental abilities. British Journal of Psychology, 3, 296–322
Center for Evaluation, Policy, & Research (CEPR) (2019). Indiana University. Center for Evaluation & Education Policy. https://cepr.indiana.edu/disr.html
Cohen, J. (1988). Statistical power analysis for the behavioral sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates
Collins, A., Brown, J. S., & Holum, A. (1991). Cognitive apprenticeship: Making thinking visible. American educator, 15(3), 6–11
Cunningham, C. M., & Carlsen, W. S. (2014). Teaching engineering practices. Journal of science teacher education, 25(2), 197–210. https://doi.org/10.1007/s10972-014-9380-5
De Miranda, M. A. (2004). The Grounding of a Discipline: Cognition and Instruction in Technology Education. International Journal of Technology and Design Education, 14(1), 61–77. https://doi.org/10.1023/B:ITDE.0000007363.44114.3b
Dym, C. L., Agogino, A. M., Eris, O., Frey, D. D., & Leifer, L. J. (2005). Engineering design thinking, teaching, and learning. Journal of engineering education, 94(1), 103–120. https://doi.org/10.1002/j.2168-9830.2005.tb00832.x
DeVellis, R. F. (2016). Scale development: Theory and applications (26 vol.). Sage publications
Ebel, R. L. (1973). Evaluation and educational objectives. Journal of Educational Measurement, 10(4), 273–279. https://doi.org/10.1111/j.1745-3984.1973.tb00804.x
Eide, A., Jenison, R., Mashaw, L., & Northup, L. (1997). Selected Materials from Engineering Fundamentals and Problem-Solving. McGraw-Hill
Ejiwale, J. A. (2013). Barriers to successful implementation of STEM education. Journal of Education and Learning, 7(2), 63–74. https://doi.org/10.11591/edulearn.v7i2.220
English, L. D., King, D., & Smeed, J. (2017). Advancing integrated STEM learning through engineering design: Sixth-grade students’ design and construction of earthquake resistant buildings. The Journal of Educational Research, 110(3), 255–271. https://doi.org/10.1080/00220671.2016.1264053
English, L. D., & King, D. (2019). STEM integration in sixth grade: Designing and constructing paper bridges. International Journal of Science and Mathematics Education, 17(5), 863–884. https://doi.org/10.1007/s10763-018-9912-0
Ferketich, S. (1991). Focus on psychometrics. Aspects of item analysis. Research in nursing & health, 14(2), 165–168. https://doi.org/10.1002/nur.4770140211
Finch, W., Bolin, J., & Kelley, K. (2019). Multilevel Modeling Using R. New York: Chapman and Hall/CRC. https://doi.org/10.1201/9781351062268
Fortus, D., Dershimer, C., Krajcik, J., Marx, R., & Mamlok-Naaman, R. (2004). Design-based science and student learning. Journal of Research in Science Teaching, 41(10), 1081–1110. https://doi.org/10.1002/tea.20040
Gao, X., Li, P., Shen, J., & Sun, H. (2020). Reviewing assessment of student learning in interdisciplinary STEM education. International Journal of STEM Education, 7(1), 1–14. https://doi.org/10.1186/s40594-020-00225-4
Goddard, Y. L., Goddard, R. D., & Tschannen-Moran, M. (2007). A theoretical and empirical investigation of teacher collaboration for school improvement and student achievement in public elementary schools. Teachers college record, 109(4), 877–896
Guzey, S. S., Harwell, M., Moreno, M., Peralta, Y., & Moore, T. J. (2017). The impact of design-based STEM integration curricula on student achievement in engineering, science, and mathematics. Journal of Science Education and Technology, 26(2), 207–222. https://doi.org/10.1007/s10956-016-9673-x
International Technology and Engineering Educators Association [ITEEA]. (2020). Standards for Technological and Engineering Literacy: Defining the Role of Technology and Engineering in STEM Education. VA: Author
Järvelä, S., Järvenoja, H., & Veermans, M. (2008). Understanding the dynamics of motivation in socially shared learning. International Journal of Educational Research, 47(2), 122–135. https://doi.org/10.1016/j.ijer.2007.11.012
Jones, C. (2009). Interdisciplinary approach-advantages, disadvantages, and the future benefits of interdisciplinary studies. ESSAI, 7(1), 26. Available at: http://dc.cod.edu/essai/vol7/iss1/26
Jones, A., & Issroff, K. (2005). Learning technologies: Affective and social issues in computer-supported collaborative learning. Computers & Education, 44(4), 395–408. https://doi.org/10.1016/j.compedu.2004.04.004
Krajcik, J. S., Blumenfeld, P. C., Marx, R. W., & Soloway, E. (1994). A collaborative model for helping middle grade science teachers learn project-based instruction. The elementary school journal, 94(5), 483–497. https://doi.org/10.1086/461779
Laal, M., & Ghodsi, S. M. (2012). Benefits of collaborative learning. Procedia-social and behavioral sciences, 31, 486–490. https://doi.org/10.1016/j.sbspro.2011.12.091
Lajoie, S. P., Guerrera, C., Munsie, S. D., & Lavigne, N. C. (2001). Constructing knowledge in the context of BioWorld. Instructional Science, 29(2), 155–186. https://doi.org/10.1023/A:1003996000775
Lande, M., & Leifer, L. (2009). Prototyping to learn: Characterizing engineering students’ prototyping activities and prototypes. In DS 58 – 1: Proceedings of ICED 09, the 17th International Conference on Engineering Design, Vol. 1, Design Processes, Palo Alto, CA, USA, 24.-27.08. 2009
Lehman, J., Kim, W., & Harris, C. (2014). Collaborations in a community of practice working to integrate engineering design in elementary science education. Journal of STEM Education, 15(3), 21–28. Retrieved September 16, 2021 from https://www.learntechlib.org/p/151109/
Lewis, T. (2006). Design and inquiry: Bases for an accommodation between science and technology education in the curriculum? Journal of Research in Science Teaching: The Official Journal of the National Association for Research in Science Teaching, 43(3), 255–281. https://doi.org/10.1002/tea.20111
Li, L. C., Grimshaw, J. M., Nielsen, C., Judd, M., Coyte, P. C., & Graham, I. D. (2009). Evolution of Wenger’s concept of community of practice. Implementation Science, 4(1), 1–8. https://doi.org/10.1186/1748-5908-4-11
Lotter, C., Carnes, N., Marshall, J. C., Hoppmann, R., Kiernan, D. A., Barth, S. G., & Smith, C. (2020). Teachers’ Content Knowledge, Beliefs, and Practice after a Project-Based Professional Development Program with Ultrasound Scanning. Journal of Science Teacher Education, 31(3), 311–334. https://doi.org/10.1080/1046560X.2019.1705535
Malmberg, J., Järvelä, S., & Järvenoja, H. (2017). Capturing temporal and sequential patterns of self-, co-, and socially shared regulation in the context of collaborative learning. Contemporary Educational Psychology, 49, 160–174. https://doi.org/10.1016/j.cedpsych.2017.01.009
McFadden, J., & Roehrig, G. (2019). Engineering design in the elementary science classroom: supporting student discourse during an engineering design challenge. International Journal of Technology and Design Education, 29(2), 231–262. https://doi.org/10.1007/s10798-018-9444-5
Messick, S. (1995). Validity of psychological assessment: Validation of inferences from persons’ responses and performances as scientific inquiry into score meaning. American Psychologist, 50(9), 741–749. https://doi.org/10.1037/0003-066X.50.9.741
Moore, T. J., Glancy, A. W., Tank, K. M., Kersten, J. A., Smith, K. A., & Stohlmann, M. S. (2014). A framework for quality K-12 engineering education: Research and development. Journal of pre-college engineering education research (J-PEER), 4(1), 2. https://doi.org/10.7771/2157-9288.1069
Nakazawa, Y., Miyashita, M., Morita, T., Umeda, M., Oyagi, Y., & Ogasawara, T. (2009). The palliative care knowledge test: reliability and validity of an instrument to measure palliative care knowledge among health professionals. Palliative Medicine, 23(8), 754–766. https://doi.org/10.1177/0269216309106871
National Research Council [NRC]. (2009). Engineering in K-12 education: Understanding the status and improving the prospects. National Academies Press
National Research Council. (2012). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press
Netwong, T. (2018). Development of problem solving skills by integration learning following STEM education for higher education. International Journal of Information and Education Technology, 8(9), 639–643. https://doi.org/10.18178/ijiet.2018.8.9.1114
NGSS Lead States. (2013). Next Generation Science Standards: For States, By States. Washington: The National Academies Press
Osborne, J. W. (2000). Advantages of hierarchical linear modeling. Practical Assessment Research & Evaluation, 7(1), 1–4. https://doi.org/10.7275/pmgn-zx89
Ovwigho, B. O. (2014). Empirical demonstration of techniques for computing the discrimination power of a dichotomous item response Test. Journal of Research & Method in Education, 4(1), 12–17. https://doi.org/10.5901/jesr.2014.v4n1p189
Ozkaya, H. E., Dabas, C., Kolev, K., Hult, G. T. M., Dahlquist, S. H., & Manjeshwar, S. A. (2013). An assessment of hierarchical linear modeling in international business, management, and marketing. International Business Review, 22(4), 663–677. https://doi.org/10.1016/j.ibusrev.2012.10.002
Panadero, E., & Järvelä, S. (2015). Socially shared regulation of learning: A review. European Psychologist, 20(3), 190–203. https://doi.org/10.1027/1016-9040/a000226
Petroski, H. (2011). The essential engineer: Why science alone will not solve our global problems. New York, NY: Vintage Books
Purzer, Ş., Goldstein, M. H., Adams, R. S., Xie, C., & Nourian, S. (2015). An exploratory study of informed engineering design behaviors associated with scientific explanations. International Journal of STEM Education, 2(1), 9. https://doi.org/10.1186/s40594-015-0019-7
Radhakrishna, R. B. (2007). Tips for developing and testing questionnaires/instruments. Journal of extension, 45(1), 1–4. Retrieved September 16, 2021 from https://archives.joe.org/joe/2007february/tt2.php
Raudenbush, S. W., & Bryk, A. S. (2002). Hierarchical linear models: Applications and data analysis methods (2nd ed.). Thousand Oaks, CA: Sage Publications
Reeves, T., & Gomm, P. (2015). Community and contribution: Factors motivating students to participate in an extra-curricular online activity and implications for learning. E-Learning and Digital Media, 12(3–4), 391–409. https://doi.org/10.1177/2042753015571828
Remmers, H. H., & Ewart, E. (1941). Reliability of multiple-choice measuring instruments as a function of the Spearman-Brown prophecy formula, III. Journal of Educational Psychology, 32(1), 61–66. https://doi.org/10.1037/h0061781
Rogat, T. K., & Linnenbrink-Garcia, L. (2011). Socially shared regulation in collaborative groups: An analysis of the interplay between quality of social regulation and group processes. Cognition and Instruction, 29(4), 375–415. https://doi.org/10.1080/07370008.2011.607930
Rogoff, B. (1994). Developing understanding of the idea of communities of learners. Mind Culture and Activity, 1(4), 209–229
Ronfeldt, M., Farmer, S. O., McQueen, K., & Grissom, J. A. (2015). Teacher collaboration in instructional teams and student achievement. American Educational Research Journal, 52(3), 475–514. https://doi.org/10.3102/0002831215585562
Sanders, M. E. (2009). STEM, STEM education, STEMmania. The Technology Teacher, 68(4), 20–26
Sanders, M. E. (2012). Integrative stem education as best practice. In H. Middleton (Ed.), Explorations of Best Practice in Technology, Design, & Engineering Education. Vol.2 (pp.103–117). Queensland, Australia: Griffith Institute for Educational Research. ISBN 978-1-921760-95-2
Shute, V. J., Lajoie, S. P., & Gluck, K. A. (2000). Individualized and group approaches to training. In S. Tobias, & J. D. Fletcher (Eds.), Training and Retraining: A Handbook for Business, Industry, Government, and the Military (pp. 171–207). New York, NY: Macmillan
Spearman, C. (1910). Correlation calculated from faulty data. British Journal of Psychology, 3, 271–295. https://doi.org/10.1111/j.2044-8295.1910.tb00206.x
Spector, J. M., & Anderson, T. M. (2000). Integrated and holistic perspectives on learning, instruction and technology: understanding complexity. Dordrect; Boston: Kluwer Academic Publishers
Stohlmann, M., Moore, T. J., & Roehrig, G. H. (2012). Considerations for teaching integrated STEM education. Journal of Pre-College Engineering Education Research (J-PEER), 2(1), 4. https://doi.org/10.5703/1288284314653
Vangrieken, K., Dochy, F., Raes, E., & Kyndt, E. (2015). Teacher collaboration: A systematic review. Educational research review, 15, 17–40. https://doi.org/10.1016/j.edurev.2015.04.002
Wang, H. H., Moore, T. J., Roehrig, G. H., & Park, M. S. (2011). STEM integration: Teacher perceptions and practice. Journal of Pre-College Engineering Education Research (J-PEER), 1(2), 1–13. https://doi.org/10.5703/1288284314636
Wendell, K. B., Wright, C. G., & Paugh, P. (2017). Reflective decision-making in elementary students’ engineering design. Journal of Engineering Education, 106(3), 356–397. https://doi.org/10.1002/jee.20173
Wheatley, G. H. (1991). Constructivist perspectives on science and mathematics learning. Science education, 75(1), 9–21. https://doi.org/10.1002/sce.3730750103
Wilson, S., Schweingruber, H., & Nielsen, N. (2015). Science teachers’ learning: enhancing opportunities, creating supportive contexts. Washington, DC: The National Academies Press
Funding
This research was supported by the National Science Foundation, award 1513248. Any opinions and findings expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
Ethics declarations
Compliance with ethical standards
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Ethics approval
All procedures performed in studies involving human participants were in accordance with the ethical standards approved by the institutional review board (IRB) at which the studies were conducted.
Consent to participate
Informed consent to participate was obtained from all individual participants included in the study.
Consent for publication
Informed consent for publication was obtained from all individual participants included in the study.
Informed consent
Informed consent was obtained from all individual participants included in the study.
Additional information
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Appendix
A. Item Analysis.
Instrument Item Difficulty Index. The item difficulty analysis was employed to test whether the individual items have an appropriate level of difficulty. The difficulty of an item, denoted by p, is the proportion of participants responding to that item correctly. Item difficulty is calculated by:
\(p={\text{X}}_{\text{i}}/\text{N}\)
where
\({\text{X}}_{\text{i}}\) = number of students responding correctly to item i, and
N = number of students taking the assessment.
The item analysis showed difficulty values ranging from 0.10 to 0.78, and confirmed that items 3, 4, and 24 had difficulty indices of approximately 0.10, meaning roughly 90% of students did not answer those items correctly.
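As a sketch of this computation, the difficulty index can be derived from a 0/1-scored response matrix. The data below are hypothetical and purely illustrative, not the study’s actual responses:

```python
# Hypothetical 0/1 response matrix: rows = students, columns = items.
# Illustrative data only, assumed for this sketch.
responses = [
    [1, 0, 1, 1],
    [1, 1, 0, 1],
    [0, 0, 1, 1],
    [1, 0, 0, 0],
]

# p_i = X_i / N: the proportion of the N students answering item i correctly.
N = len(responses)
difficulty = [sum(row[i] for row in responses) / N
              for i in range(len(responses[0]))]
print(difficulty)  # [0.75, 0.25, 0.5, 0.75]
```

An item with p near 0.10, as with items 3, 4, and 24 above, is one that about 90% of students miss.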
Item Discrimination. Item discrimination analysis detects whether each instrument item is efficiently constructed by comparing correct-answer rates between students ranked above the 66th percentile and below the 33rd percentile on the test. The discrimination index for item i was computed by the following formula:
\(D={U}_{i}/{n}_{iU}-{L}_{i}/{n}_{iL}\)
where
\({U}_{i}\) = number of students who have total scores in the upper range and who also answered item i correctly,
\({L}_{i}\) = number of students who have total scores in the lower range and who also answered item i correctly,
\({n}_{iU}\) = number of students who have total scores in the upper range of total test scores, and
\({n}_{iL}\) = number of students who have total scores in the lower range of total test scores.
The item discrimination index ranges from −1.0 to 1.0. A negative discrimination index means that lower-scoring students performed better on the item than higher-scoring students. A low discrimination index (< 0.3) indicates an item was roughly equally difficult for both groups, while a high index value means the item discriminates well between low and high scorers. An index value of 0.3 and above is good, and 0.6 and above is very good (Ebel, 1973; Ovwigho, 2014). The analysis indicated that items 3 and 24 had negative indexes, implying that lower-ranked students were more likely to answer those questions correctly than higher-ranked students. Also, items 4 and 23 fell very low on the discrimination scale, in the “Poor” index range, another justification for removing these items from the instrument.
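The upper/lower-thirds procedure described above can be sketched as follows; the response matrix is hypothetical, chosen only so the index values are easy to verify by hand:

```python
# Hypothetical 0/1 response matrix (rows = students, columns = items);
# illustrative data only, assumed for this sketch.
responses = [
    [1, 1, 1],
    [1, 1, 0],
    [1, 0, 1],
    [0, 1, 0],
    [0, 0, 1],
    [0, 0, 0],
]

# Rank students by total score and take the top and bottom thirds.
ranked = sorted(responses, key=sum, reverse=True)
n_third = len(ranked) // 3
upper, lower = ranked[:n_third], ranked[-n_third:]

# D = U_i/n_iU - L_i/n_iL for each item i, ranging from -1.0 to 1.0.
d = [sum(row[i] for row in upper) / len(upper)
     - sum(row[i] for row in lower) / len(lower)
     for i in range(len(responses[0]))]
print(d)  # [1.0, 1.0, 0.0]
```

Here the first two items separate high and low scorers perfectly (D = 1.0), while the third item does not discriminate at all (D = 0.0); a negative D would flag an item that low scorers answer correctly more often than high scorers.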
Internal consistency and reliability. To assess the internal consistency of the test instrument, the researchers calculated Cronbach’s alpha using SPSS 23 software. The overall Cronbach’s alpha was 0.69, which falls in the marginal range. The statistical item analysis indicated that if items 3, 4, 23, and 24 were deleted, the overall Cronbach’s alpha would rise above 0.7, within the acceptable index range. Lastly, test reliability was assessed through split-half reliability using the adjusted Spearman-Brown prophecy formula (Brown, 1910; Spearman, 1910; Remmers & Ewart, 1941). The reliability coefficient, computed in R statistics software, was 0.876, which is within the reliable range.
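The study computed these statistics in SPSS and R; as a language-neutral illustration of the two formulas, the sketch below computes Cronbach’s alpha and a Spearman-Brown-corrected odd/even split-half reliability on a small hypothetical response matrix (not the study’s data):

```python
from math import sqrt
from statistics import pvariance

# Hypothetical 0/1 response matrix (rows = students, columns = items);
# illustrative data only, assumed for this sketch.
responses = [
    [1, 1, 1, 1],
    [1, 1, 1, 0],
    [1, 0, 0, 0],
    [0, 0, 0, 0],
]
k = len(responses[0])
items = list(zip(*responses))            # one tuple per item (column view)
totals = [sum(row) for row in responses]

# Cronbach's alpha: k/(k-1) * (1 - sum of item variances / total-score variance)
alpha = k / (k - 1) * (1 - sum(pvariance(col) for col in items)
                       / pvariance(totals))

# Split-half reliability: correlate odd-item and even-item half scores,
# then apply the Spearman-Brown prophecy correction r_sb = 2r / (1 + r).
odd = [sum(row[0::2]) for row in responses]
even = [sum(row[1::2]) for row in responses]

def pearson(x, y):
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return cov / sqrt(pvariance(x) * pvariance(y))

r = pearson(odd, even)
r_sb = 2 * r / (1 + r)
print(round(alpha, 3), round(r_sb, 3))  # 0.867 0.9
```

The Spearman-Brown step adjusts the half-test correlation upward to estimate the reliability of the full-length test, which is why the corrected value (0.9 here) exceeds the raw half-test correlation.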
The results of the item analysis are shown in Table A below.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Kelley, T.R., Sung, E., Han, J. et al. Impacting secondary students’ STEM knowledge through collaborative STEM teacher partnerships. Int J Technol Des Educ 33, 1563–1584 (2023). https://doi.org/10.1007/s10798-022-09783-w