Technology, Knowledge and Learning, Volume 19, Issue 1–2, pp 127–146

The Role of Strategy Knowledge for the Application of Strategies in Complex Problem Solving Tasks

  • Sascha Wüstenberg
  • Matthias Stadler
  • Jarkko Hautamäki
  • Samuel Greiff
Abstract

Education in the twenty-first century must prepare students to meet the challenges of a dynamic and interconnected world. However, assessment of students’ skills tends to focus primarily on static tasks. Therefore, it is not known whether knowledge about successful strategies displayed on static tasks can be transferred to interactive and dynamic environments. This study investigated whether students’ knowledge of a certain strategy (i.e., vary-one-thing-at-a-time, VOTAT) that was assessed in a paper-and-pencil-based scientific reasoning task as well as their fluid intelligence and learning orientation would be sufficient to explain variance in the application of the VOTAT strategy in solving an interactive complex problem solving (CPS) task (i.e., CPS strategy). Furthermore, we analyzed whether CPS strategy mediated the relation between the predictors (i.e., scientific reasoning, learning orientation, fluid intelligence) and CPS performance. The sample consisted of N = 3,191 Finnish students attending the 6th and 9th grades. Results revealed that all predictors were significantly related to CPS strategy, but a substantial amount of variance in CPS strategy remained unexplained (ΔR2 = .583). Furthermore, CPS strategy mediated the relation between the predictors and CPS performance. Three implications are discussed: Different demands on the problem solver, knowledge transfer from static versus interactive tasks, or metastrategic knowledge may explain the unexplained variance in CPS strategy. Additionally, the results of our mediation analyses emphasize the importance of measuring strategies via logfiles to gain a deeper understanding of determinants of students’ CPS performance. Finally, fostering motivational factors such as students’ learning orientation yields small improvements in CPS performance.

Keywords

Complex problem solving · Scientific reasoning · Strategy · Educational measurement · Computer-based assessment · Metacognition

Acknowledgments

This research was funded by a grant from the Fonds National de la Recherche Luxembourg (ATTRACT “ASKI21”), the European Union (290683; LLLight’in’Europe), and the German Federal Ministry of Education and Research (LSA004). We are grateful to the TBA group at DIPF (http://tba.dipf.de) for providing the authoring tool CBA Item Builder and technical support.

Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  • Sascha Wüstenberg (1)
  • Matthias Stadler (1)
  • Jarkko Hautamäki (2)
  • Samuel Greiff (1)

  1. University of Luxembourg, Luxembourg, Luxembourg
  2. University of Helsinki, Helsinki, Finland