
Assessment of Real-World Problem-Solving and Critical Thinking Skills in a Technology Education Classroom

  • Chapter
Applications of Research in Technology Education

Part of the book series: Contemporary Issues in Technology Education ((CITE))

Abstract

In the twenty-first century, Science, Technology, Engineering, and Mathematics (STEM) workers need to be able to draw on their existing knowledge in science and mathematics to solve complex real-world (authentic) problems. Making timely decisions about which disciplinary areas contribute to the creation of a problem, and thereby developing a reasonable solution, requires critical thinking. Together, problem-solving and critical thinking are touted as the most important skills (or abilities) employees need for tackling the challenges of this century. Having the necessary background in science and mathematics, being able to communicate well, and working with diverse teams comprised of people from all walks of life are also essential for those seeking employment. Teaching students to problem-solve in real-world STEM contexts is known to be complex, and there are limited assessment instruments appropriate for classroom use. An ad hoc, trial-and-error approach to problem-solving that does not draw on science- and mathematics-based knowledge can be detrimental in a real-world context. Herein lies the challenge: faced with a design problem outside the context of the classroom, students may not readily recognize the STEM domains applicable to solving the problem. Engineering, through its hands-on and design-oriented approach, offers a platform in K-12 grades for integrating content and practices across the STEM fields and provides opportunities for higher-order learning, because design-based problem-solving experiences make higher-order cognitive demands (per Bloom's Taxonomy, apply, analyze, justify, and create are higher-order thinking abilities). Assessing engineering problem-solving skills in technology education, or in K-12 engineering education, is problematic because it is time-consuming to design lessons for each aspect of the design process and to evaluate problem-solving, as the problems encountered may be unique to each team or individual. Frequently, students follow their own unique and sometimes ad hoc trajectories in defining a problem and setting about developing alternative solutions. Assessment is likewise time-consuming and cumbersome for a multitude of reasons: teamwork and collaboration require peer assessments and rubrics; creativity and communication are multifaceted and require separate assessments for each facet; and there is no right or wrong solution, so assessments are subjective and based on many factors. For classroom assessment, while it is possible to prescribe a process to be followed and to create benchmarks for every aspect of an engineering design process, doing so eliminates the authenticity of student performance. Furthermore, students, being grade-focused, tend to follow instructions closely, which inhibits their creativity and their use of the iterative process to evaluate and optimize their solution. In this chapter, we describe an assessment instrument with metacognitive questions and a related rubric for scoring students' problem-solving skills when they are faced with an authentic design challenge. Metacognitive questioning directs students' thinking and responses toward the specific assessment items measured by the rubric. Teachers can use this instrument and its scoring rubric both for delivering instruction and later for evaluating student performance, removing some of the subjectivity from evaluation.




Author information


Corresponding author

Correspondence to Susheela Shanta.



Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this chapter


Cite this chapter

Shanta, S. (2022). Assessment of Real-World Problem-Solving and Critical Thinking Skills in a Technology Education Classroom. In: Williams, P.J., von Mengersen, B. (eds) Applications of Research in Technology Education. Contemporary Issues in Technology Education. Springer, Singapore. https://doi.org/10.1007/978-981-16-7885-1_10


  • DOI: https://doi.org/10.1007/978-981-16-7885-1_10


  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-16-7884-4

  • Online ISBN: 978-981-16-7885-1

  • eBook Packages: Education, Education (R0)
