The Concept of Competence and Its Relevance for Science, Technology and Mathematics Education

  • Mathias Ropohl
  • Jan Alexis Nielsen
  • Christopher Olley
  • Silke Rönnebeck
  • Kay Stables
Chapter
Part of the Contributions from Science Education Research book series (CFSE, volume 4)

Abstract

Since the beginning of the twenty-first century, the concept of competence has been introduced as a new paradigm in several educational systems. It reflects the need of educational systems to respond to societal and economic changes, i.e. the transition from industrial to information-based societies. In contrast to earlier educational goals that focused more on basic skills and knowledge expectations, competences are more functionally oriented: they involve the ability to solve complex problems in a particular context, e.g. in vocational or everyday situations. In science, technology and mathematics education, the concept of competence is closely linked to the concept of literacy. Beyond these rather cognitive and affective perspectives, which are shaped by the need to assess students’ achievement of desired learning goals in relation to their interest and motivation, the concept of Bildung as well as the demands of the labour market also influence today’s definition of educational goals. To address these perspectives, twenty-first-century skills were defined that encompass skills believed to be critical to success in today’s world, such as innovation and communication. This chapter addresses these developments by describing the concept of competence, explaining its relevance for science, technology and mathematics education and examining future directions. The chapter concludes with remarks on commonalities and differences between the three domains: science, technology and mathematics.


Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  • Mathias Ropohl (1)
  • Jan Alexis Nielsen (2)
  • Christopher Olley (3)
  • Silke Rönnebeck (1, 4)
  • Kay Stables (5)

  1. Leibniz-Institute for Science and Mathematics Education (IPN), Kiel, Germany
  2. Department of Science Education, University of Copenhagen, Copenhagen, Denmark
  3. King’s College London, London, UK
  4. Kiel University, Kiel, Germany
  5. Goldsmiths, University of London, London, UK