Operationalizing Optimization in a Middle School Virtual Engineering Internship

Journal of Science Education and Technology

Abstract

New national science standards have elevated attention to student performance on a core set of science and engineering practices, yet guidance on how to assess these practices is only beginning to emerge in the literature. This is particularly true for the engineering design concepts and practices articulated in the Next Generation Science Standards’ (NGSS) Engineering, Technology, and Applications of Science (ETS) standards. In this work, we present a model of student cognition for assessing student facility with the engineering design practice of optimization. We operationalize this model of cognition within a set of engineering-focused middle school units framed as Virtual Engineering Internships (VEIs). To operationalize the practice of optimization within our VEIs, we first decomposed it into two more specific sub-behaviors: exploration and systematicity. We then designed metrics that provide evidence of those behaviors and are observable in student clickstream data from a digital design tool. We normalized these metrics against the distributions obtained from a research trial, and we discuss the observed correlations among these behaviors and metrics.
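To make the abstract’s method concrete, the sketch below shows one plausible way such metrics could be computed from clickstream logs. It is a hypothetical illustration, not the authors’ published instrumentation: it assumes each logged trial records a complete design-parameter configuration submitted to the digital design tool, and the field names, the particular exploration and systematicity definitions, and the z-normalization helper are all invented for illustration.

    from statistics import mean, stdev

    # Assumed log format: each trial is one submitted design, recorded as a
    # dict of design-parameter settings. Field names are invented.
    trials = [
        {"material": "steel", "thickness": 2, "shape": "arch"},
        {"material": "steel", "thickness": 3, "shape": "arch"},
        {"material": "wood",  "thickness": 3, "shape": "arch"},
        {"material": "wood",  "thickness": 3, "shape": "truss"},
    ]

    def exploration(trials):
        # Breadth of the design space visited: count of distinct designs tried.
        return len({tuple(sorted(t.items())) for t in trials})

    def systematicity(trials):
        # Share of consecutive trials that change exactly one parameter,
        # i.e., controlled one-factor-at-a-time experimentation.
        if len(trials) < 2:
            return 0.0
        single_changes = sum(
            1 for a, b in zip(trials, trials[1:])
            if sum(a[k] != b[k] for k in a) == 1
        )
        return single_changes / (len(trials) - 1)

    def z_normalize(raw_score, cohort_scores):
        # Normalize a raw metric against the distribution obtained from a
        # research trial, in the spirit of the normalization step the
        # abstract describes.
        mu, sigma = mean(cohort_scores), stdev(cohort_scores)
        return (raw_score - mu) / sigma if sigma else 0.0

Under these assumptions, a student’s normalized systematicity would be z_normalize(systematicity(trials), cohort_scores), where cohort_scores holds the raw systematicity score of every student in the research-trial sample.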



Funding

This research is based upon work supported by the National Science Foundation under grant no. 1417939.

Author information

Corresponding author

Correspondence to Ryan Montgomery.

Ethics declarations

Conflict of Interest

Samuel Crane provided data and assisted in the development of the digital tools used in this study, as part of his role as Director of Data Science at Amplify Education.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed Consent

Informed consent and parental consent were obtained from all individual participants included in the study.

Disclaimer

Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.


About this article


Cite this article

Montgomery, R., Greenwald, E., Crane, S. et al. Operationalizing Optimization in a Middle School Virtual Engineering Internship. J Sci Educ Technol 29, 409–420 (2020). https://doi.org/10.1007/s10956-020-09826-8

