Competencies for Complexity: Problem Solving in the Twenty-First Century

Chapter
Part of the Educational Assessment in an Information Age book series (EAIA)

Abstract

In this chapter, we present a view of problem solving as a bundle of skills, knowledge and abilities that are required to deal effectively with complex non-routine situations in different domains. This includes cognitive aspects of problem solving, such as causal reasoning, model building, rule induction, and information integration. These abilities are comparatively well covered by existing tests and relate to existing theories. However, non-cognitive components, such as motivation, self-regulation and social skills, which are clearly important for real-life problem solving, have only just begun to be covered in assessment. We conclude that there is currently no single assessment instrument that captures problem solving competency in a comprehensive way and that a number of challenges must be overcome to cover a construct of this breadth effectively. Research on some important components of problem solving is still underdeveloped and will need to be expanded before we can claim a thorough, scientifically backed understanding of real-world problem solving. We suggest that a focus on handling and acting within complex systems (systems competency) may be a suitable starting point for such an integrative approach.

Introduction

Problem solving is a key competency for the twenty-first century, with its increasing complexity in many areas of life. Dealing with this complexity requires a new type of problem solving that involves a high degree of systems thinking, taking into account the connectedness, complex dynamics, and sometimes also fragility of the systems we live with. In recent years, problem solving research has shifted away from well-defined analytical forms of problem solving, such as textbook problems, towards more complex problems involving dynamic interaction with the task or collaborative elements. However, theoretical and conceptual progress in the field seems out of step with the recent proliferation of empirical data. What is required is a theoretical foundation for the application of problem solving assessments in educational contexts and a careful selection of tools to adequately measure this ability.

In this chapter, we present a view of problem solving competency as a bundle of skills, knowledge and abilities that are required to deal effectively with complex and dynamic non-routine situations in different domains (Fischer and Neubert 2015). We consider this change of perspective important in order to move away from the relatively narrow conception of problem solving ability based on a conventional psychometric perspective, which has become prominent in recent years (Funke 2014a, b; Funke et al. 2017; Schoppek and Fischer 2015). The components of this competency do not necessarily need to be correlated, as is often implied in the psychometric view of problem solving. Instead, problem solving competency may be understood as a formative construct (Fischer and Neubert 2015; Schoppek and Fischer 2015), where successful performance can arise from a range of different factors. The components of problem solving competency may also vary in their degree of generalizability. For example, content knowledge is highly domain specific, while self-regulatory abilities are very general, with generic problem solving strategies occupying the middle ground.

According to Duncker (1935), problem solving is simply what is needed when an organism pursues a goal and does not know how to reach it. This classical definition provides three fundamental elements: a given state, a goal state, and some obstacles between them that make it impossible to reach the goal state in an immediate and obvious way. Subsequently, Newell and Simon (1972) elaborated the information processing perspective on problem solving, describing it as an activity that relies on states of knowledge, operators for changing one state into another, and constraints on applying the operators. Moving beyond the well-defined problems studied by Newell and Simon, Donald Broadbent in Great Britain and Dietrich Dörner in Germany independently started new lines of research dealing with complex and dynamic systems (Broadbent and Aston 1978; Dörner and Reither 1978). Broadbent was interested in the implicit understanding of complex rules; Dörner wanted to understand how ordinary people (as opposed to experts) cope with complexity and dynamics in the context of everyday decision making and problem solving. At the same time, MacKinnon and Wearing (1980) from Australia proposed looking more deeply into “dynamic decision making” (Edwards 1962). The subject of this new field of research was soon labelled “complex problem solving”, and it found a place in two anthologies (Frensch and Funke 1995b; Sternberg and Frensch 1991) which emphasised that problems in the real world differ markedly from simple problems and entertaining brain-teasers. This development resembled a similar shift in research focus in the field of decision making, where it was recognised that experts’ “naturalistic decision making” (Klein 2008) or “risky decision making” (Huber 2012) differs fundamentally from decision making in the gambling situations commonly used in laboratory-based decision making research.

In their review of the first 20 years of complex problem solving (CPS) research, Frensch and Funke (1995a) summarize a wide range of different views about the topic in the following definition: “CPS occurs to overcome barriers between a given state and a desired goal state by means of behavioral and/or cognitive, multistep activities. The given state, goal state, and barriers between given state and goal state are complex, change dynamically during problem solving, and are intransparent. The exact properties of the given state, goal state, and barriers are unknown to the solver at the outset. CPS implies the efficient interaction between a solver and the situational requirements of the task, and involves a solver’s cognitive, emotional, personal, and social abilities and knowledge” (p. 18). If one compares this CPS definition with the understanding of Newell and Simon (1972), a new emphasis on complexity, dynamics, and non-cognitive factors becomes apparent.

Taxonomic Aspects

Over the last 30 years, many new terms have been coined, such as “complex” problem solving (Sternberg and Frensch 1991), “interactive” problem solving (Greiff et al. 2013a, b), or “analytical” problem solving (Leutner et al. 2012). At the same time, there has been research exploring “everyday” problem solving (Willis 1996), “creative” problem solving (Treffinger et al. 1994), “social” problem solving (Chang et al. 2004), “collaborative” problem solving (O’Neil et al. 2003) or simply “applied” problem solving (Heppner 2008).

This collection of labels shows that there is no obvious boundary between the different terms or the constructs they represent. For example, complex problems usually involve analytical problem solving, among other kinds (Dörner 1997; Fischer et al. 2015; Funke 2010). Additionally, one can easily imagine combinations, e.g., “social creative” problem solving for group collaboration, which in turn could be “interactive”. In the recent literature, the labelling seems to be largely arbitrary. For example, the OECD (2014) decided to label the task group including analytic and interactive problems within PISA 2012 as “creative problem solving”, and even constructs using identical assessment methods are sometimes labelled differently in different publications.

Beyond simple and complex, the literature reveals several criteria that can be used to distinguish different types of problems: content domain (a mathematical problem in algebra is different from the problem of how to find a good apartment), time scale (how to cope with a dangerous traffic situation, compared to the question of how to make a good living), high-stakes vs. low-stakes problem situations (a problem in a computer game vs. admission to a prestigious university), static vs. dynamic problems, and so on. In relation to CPS, well-defined and ill-defined problems have been differentiated according to the clarity of the goal (originally introduced by McCarthy 1956). In another context, Rittel and Webber (1973) introduced a class of inherently “wicked” planning problems, which they contrasted with “tame” problems. Some attributes of “wicked” problems according to Rittel and Webber (1973) are as follows: (1) there is no definitive formulation of a wicked problem, (2) solutions to wicked problems are not true-or-false but good-or-bad, (3) every solution to a wicked problem is a “one-shot operation” (because there is no opportunity to learn by trial-and-error, every attempt counts significantly), and (4) wicked problems do not have an enumerable (or exhaustively describable) set of potential solutions, nor is there a well-described set of permissible operations that may be incorporated into the plan.

Similarly, Dörner (1975) characterized complex problems as involving dynamic systems that people must deal with under conditions of uncertainty. These systems can be described by their (1) complexity (number of interconnected elements; requires complexity reduction), (2) interconnectedness (relations between the elements; requires model building), (3) intransparency (availability and accessibility to relevant information; requires information retrieval and information management), (4) dynamics (system changes over time – either slow or fast; requires prediction of future developments), and (5) whether they involve competing goals (polytelic goal structure; requires balancing of competing interests).

A recent typology of problems based on formal aspects of the problem situation, proposed by Wood (2006), seems especially useful for demonstrating the wide range of problems that could be (but are not yet) involved in the assessment of problem solving competency. According to his approach (Table 3.1), the three dichotomous dimensions “availability of data” (given or incomplete), “awareness of methods” (familiar or unfamiliar), and “precision of outcome criteria” (given or open) produce eight different types of problems; a simple enumeration of these combinations is sketched after Table 3.1. These types also differ in terms of the skills required to solve them.
Table 3.1
Typology of problems according to the three dimensions “availability of data”, “awareness of methods”, and “precision of outcome criteria”, together with appropriate skill descriptions

| Type | Data needed | Methods for solution | Outcome criteria | Skills required |
|------|-------------|----------------------|------------------|-----------------|
| 1 | Given | Familiar | Given | Recall of algorithm |
| 2 | Given | Familiar | Open | Decision about appropriate goals; exploration of knowledge networks |
| 3 | Given | Unfamiliar | Given | Looking for parallels to known methods |
| 4 | Given | Unfamiliar | Open | Decision about goals and choice of appropriate methods; exploration of knowledge and technique networks |
| 5 | Incomplete | Familiar | Given | Analysis of problems to decide what further data are required |
| 6 | Incomplete | Familiar | Open | Once goals have been specified by the student, they are seen to be incomplete |
| 7 | Incomplete | Unfamiliar | Given | Weighing up possible methods and deciding on data required |
| 8 | Incomplete | Unfamiliar | Open | Suggestions of goals and methods to get there |

From Wood (2006), p. 99
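Because the eight types arise simply from fully crossing the three dichotomous dimensions (2 × 2 × 2 = 8), the typology can be generated mechanically. The following short Python sketch is our own illustration; only the dimension and value labels are taken from Table 3.1, and with this ordering the enumeration happens to match the type numbers used there.

```python
from itertools import product

# Three dichotomous dimensions of Wood's (2006) typology (labels from Table 3.1).
dimensions = {
    "data needed": ("given", "incomplete"),
    "methods for solution": ("familiar", "unfamiliar"),
    "outcome criteria": ("given", "open"),
}

# Fully crossing the dimensions yields the eight problem types; the
# enumeration order matches the type numbers in Table 3.1.
for type_no, combination in enumerate(product(*dimensions.values()), start=1):
    description = ", ".join(f"{dim}: {value}"
                            for dim, value in zip(dimensions, combination))
    print(f"Type {type_no}: {description}")
```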

Obviously, there is no simple consensus about the best classification of problems. Which aspects of problem solving competency are most required depends heavily on the set of problem situations selected (Fischer 2015). Therefore, from an assessment point of view, one needs to carefully select the types of stimuli that are used for measuring the construct: what you present to participants determines what is measured.

Problem Solving as a Competency

Some people think of problem solving as a competency; others talk about the (cognitive) abilities involved; another group conceives of problem solving as a skill, e.g., with respect to applying particular strategies (see also Griffin and Care 2015, p. 5). We prefer the term ‘competency’ as it emphasizes that a range of different cognitive and non-cognitive resources may contribute to successful problem solving and implies that this competency can be changed through training. In contrast, the term ability usually refers to something more static and innate (although in some papers, ability is used as a neutral alternative to competency). Expertise refers to the endpoint in the development of a skill or competency, in contrast to the novice state.

The long tradition of measuring general intelligence has produced some of the most reliable assessment instruments, which are among the best predictors of job performance (Schmidt and Hunter 1998). However, problem solving, not intelligence, has been chosen in many frameworks (e.g., within PISA and other large-scale assessments) as a central skill of high importance for the twenty-first century. One reason for this might be that the ever-increasing complexity and interconnectivity of social and economic life requires a change in thinking style. Instead of simple and mechanistic cause-effect assumptions (i.e., stimulus-response associations or input-output relations), a more holistic kind of systems thinking is required that takes into account the dynamics of the relevant processes and the feedback between them (see, e.g., Brehmer 1996, 2005). A combination of analytic, creative, and pragmatic thinking is needed for the identification of goals, the creation of solution paths, and the practical realization of planned actions in the face of obstacles. But it also becomes evident that problem solvers need to do more if they want to solve problems in the real world: they need to build models of dynamic processes, make predictions about future changes, and identify side-effects of their dynamic decision making (Selten et al. 2012).

Problem solving can be contrasted with routine behaviour. Routine behaviour (e.g., Betsch and Haberstroh 2005) makes life easy, but some events are more complicated and call for higher cognitive activities – not only to adjust known routines to a given situation but also to create new courses of action to overcome barriers on the way to a goal. In these cases, heuristics and strategies come into play as tools for the solution process; sometimes trial-and-error or other heuristics (simple as well as more sophisticated ones) do the job; sometimes sustained effort is needed. There is no single method for problem solving (Fleck and Weisberg 2013).

Problem solving is strongly bound to knowledge. We may recognize a distinction between knowledge-lean and knowledge-rich problems, but on that scale a completely knowledge-free problem does not exist. Problems cannot be defined without reference to the knowledge of the problem solver: the status of a specific situation as a problem depends on the absence or presence of solution-relevant knowledge. The simple question “What is the result of 7+9?” is not a problem at all for most adults, but for a pre-schooler it might be unsolvable because of missing knowledge. For assessment, this implies controlling for prior knowledge in order to see whether a given situation really is a problem. Because of the difficulties of assessing knowledge, researchers often prefer knowledge-lean tasks to reduce the effect of prior knowledge.

To summarize, problem solving is a goal-oriented and high-level cognitive process. This implies that, for a problem to be solved, many elementary cognitive operations like attention, perception, learning, memory use, etc. need to be coordinated effectively. Indeed, problem solving can be seen as a regulation process – one that regulates not only cognitive but also non-cognitive factors (Zelazo et al. 1997).

The inclusion of non-cognitive factors can be understood as a shift in the historical paradigm of problem solving research, in which problem solving was seen as a purely cognitive activity. It is important to recognize that every problem-solving situation potentially produces a negative feeling for the problem solver. Frustration is a natural result of unforeseen obstacles between one’s goal and one’s current state, and there are more and less competent ways to deal with feelings of frustration. This connection between cognition and emotion is closely related to the definition of problems, but there are more non-cognitive factors to take into account. The class of “non-cognitive” skills in this context is a kind of residual category: it is everything but cognition!

Some researchers suggest that positive and negative affect trigger different styles of information processing (Fiedler 2001). Positive affect promotes a heuristic, top-down processing style in which individuals rely on pre-existing knowledge structures. Negative affect, in contrast, leads to an analytic, bottom-up processing style driven by the consideration of new information. Positive affect facilitates creative problem solving (Isen et al. 1987); negative affect enhances performance on tasks that require a systematic and analytic approach (Forgas 2007). Although there is not much research on the influence of affect on complex problem solving, the available evidence suggests that this may be an interesting line of research for the future (Barth and Funke 2010; Spering et al. 2005). Especially in economic contexts, the role of non-cognitive factors (e.g., motivation, trustworthiness, tenacity and perseverance) has been highlighted as important for success in life (e.g., Heckman and Rubinstein 2001).

Further insights about non-cognitive aspects of problem solving come from personality research. Assumptions about influences of the “big five” personality dimensions have not been supported so far (e.g., Greiff and Neubert 2014), but perhaps more process-oriented research will reveal the influence of traits such as “conscientiousness” or – in collaborative contexts – “agreeableness”. The role of motivation in problem solving is also evident but not often demonstrated. In the context of dynamic systems, Vollmeyer and Rheinberg (2000) showed that the motivational state of subjects affected their knowledge acquisition and encouraged them to stay with the instructed systematic strategy. For assessment situations, the motivation of participants therefore seems important if they are to show their real performance level. Toy, or simple, problems (low-stakes situations, as compared to high-stakes ones) might therefore not measure a person’s real capacity properly (Alison et al. 2013).

Systems Competency as a Focus for Future Assessments

To integrate the different facets of problem solving competency that have been proposed so far (e.g., Fischer et al. 2015), we propose a focus on what might be termed systems competency. Systems competency implies the ability to construct mental models of systems, to form and test hypotheses, and to develop strategies for system identification and control. The idea of focusing on systems competency as an important construct in the area of problem solving within dynamic systems is not new (see, e.g., Kriz 2003, 2008), but we think the value of this concept has not been fully realized in the context of assessment. Systems competency is more specific than CPS in that it explicitly emphasizes system dynamics aspects. However, it is also more generic insofar as it also covers the routine control of systems: small disturbances from outside need to be compensated for, and shifts of the system in certain directions have to be made smoothly and without producing system crashes. In contrast, CPS is required only in situations where novel states are given, where a new system is encountered, or where unusual goals have been set.

A focus on systems competency requires us to reconsider the value of existing measurement approaches for CPS. Funke (2014b, p. 495) emphasized the importance of task selection in the context of problem solving assessment: according to his view, an emphasis on psychometric qualities (e.g., increasing reliability by repeating similar tasks; Greiff et al. 2015) has led to a loss of variety and validity (see also Dörner and Funke 2017). For example, in the minimally complex systems (MCS) approach, the problem solver is confronted with a larger set of unknown problems, each of them lasting only about 5 min. To achieve such short testing times per item, the simulations of the MCS approach need to be less complex than traditional microworlds. While interacting with the problem, participants need to generate and test hypotheses and plans based on feedback from a series of interventions. To ensure a sufficient amount of commonality between different problems, each problem is formulated within a certain formal framework (see Greiff et al. 2015), such as the framework of linear equation systems (MicroDYN) or the framework of finite state automata. However, Funke (2014b) argues that systems thinking requires more than analyzing models with two or three linear equations. Inherent features of complex problems such as nonlinearity, cyclicity, rebound effects, etc. should show up in at least some of the problems used for research and assessment purposes. Minimally complex systems run the danger of becoming minimally valid systems. To increase the validity of assessments, we do not need more of the same, but different types of task requirements (see also Funke et al. 2017).
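To make the formal framework concrete, the following Python sketch shows the kind of small linear equation system that underlies MicroDYN-style items. It is our own minimal illustration: the number of variables, the coefficients, and the one-input-at-a-time exploration shown at the end are invented for this example and are not taken from any published item.

```python
import numpy as np

# Minimal sketch of a linear dynamic system in the style of MicroDYN items
# (all coefficients are invented for illustration).
#
# State update per round:  y[t+1] = A @ y[t] + B @ u[t]
#   y: endogenous (output) variables shown to the problem solver
#   u: exogenous (input) variables the problem solver can set each round

A = np.array([[1.0, 0.0],        # output 1 keeps its value (no eigendynamics)
              [0.2, 1.0]])       # output 1 also feeds into output 2
B = np.array([[0.5, 0.0, 0.0],   # input 1 affects output 1
              [0.0, 1.0, 0.3]])  # inputs 2 and 3 affect output 2

def step(y, u):
    """Advance the system by one round."""
    return A @ y + B @ u

# Exploration phase: varying one input at a time while keeping the others
# at zero isolates each input's effect and reveals the causal structure.
y = np.zeros(2)
for u in ([1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]):
    y = step(y, np.array(u))
    print(y)
```

As the sketch makes plain, identifying such a system amounts to recovering a handful of linear coefficients, which is exactly the limitation Funke (2014b) points to.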

With respect to the process-oriented nature of the construct, we need a broad set of indicators to assess the portfolio of a person’s problem solving competencies. Within the PISA 2012 Problem Solving Framework (OECD 2013; see also Csapó and Funke 2017), four dyads of cognitive processes have been differentiated that follow the assumed course of problem solving: exploring and understanding, representing and formulating, planning and executing, and monitoring and reflecting. Assessment instruments need to tap different competencies and their interplay: analytic problem solving, creative problem solving, scientific reasoning, complex problem solving, model building and hypothesis testing, to name just a few. In the tradition of Dörner (1975), we summarize these different concepts under the heading of systems competency (Kriz 2003, 2008) as a global construct that describes the competency to handle all types of systems in different respects, such as constructing, understanding, controlling or predicting them.

To move towards a comprehensive assessment of the skills and abilities involved in systems competency, existing assessment approaches could be expanded in several respects. Rather than a revolution, we suggest a gradual evolution to cover additional facets of the problem solving process and to make increasing use of technology-based assessment. Below, we will outline two routes for further development in the assessment of problem solving and systems competency.

The first route is to extend the line of work that started with classical microworld scenarios and has recently been continued with minimally complex systems (MCS; Greiff et al. 2015). While MCS improved the efficiency and reliability of assessment compared to classical microworlds, the approach is limited in terms of the variability of scenarios that can be constructed, which does not allow the assessment of a broadly based competency. Thus, we suggest a systematic expansion of MCS in the direction of the original microworlds approach (e.g., Dörner 1997; Vester 2007), making use of complex systems for assessment purposes. By slightly increasing the number of variables and/or relations (especially feedback loops), as well as including nonlinear relations, it should be possible to approximate some of the most interesting aspects of traditional CPS simulations. These include, for instance, limits-to-growth problems (e.g., EcoPolicy; Vester 2007), unstable homeostasis (e.g., Moroland; Strohschneider and Güss 1999), and interactions between variables (e.g., Tailorshop; Danner et al. 2011). Systems with such a moderate degree of complexity make it possible to obtain information about a broad palette of a person’s competencies in dealing with complexity (handling information, using different strategies, adopting a proactive or reactive style of dealing with system events, etc.) within an acceptable amount of testing time. In this way, it may be possible to combine the variety of demands inherent in different microworld simulations with the psychometric benefits of minimally complex systems.
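As an illustration of the kind of moderate extension we have in mind, the following sketch adds a single nonlinear relation and one feedback loop, which is already enough to produce limits-to-growth dynamics. It is our own toy example with invented parameters; it is not the EcoPolicy, Moroland, or Tailorshop model.

```python
# Toy limits-to-growth system: one nonlinear (logistic) relation plus a
# feedback loop from resource consumption back to carrying capacity.
# All parameters are invented for illustration.

def step(population, resources, harvest_rate, growth=0.3):
    capacity = 10.0 * resources                                       # capacity depends on remaining resources
    population += growth * population * (1 - population / capacity)   # logistic growth towards capacity
    population = max(population, 0.0)                                 # population cannot become negative
    resources = max(resources - harvest_rate * population, 0.01)      # consumption depletes resources
    return population, resources

# Overshoot-and-collapse: the population first grows, then crashes as its
# own consumption erodes the carrying capacity.
population, resources = 1.0, 10.0
for t in range(20):
    population, resources = step(population, resources, harvest_rate=0.05)
    print(f"t={t:2d}  population={population:6.2f}  resources={resources:6.2f}")
```

Even this small system confronts the problem solver with delayed feedback and a nonlinear relation, demands that two or three linear equations cannot create.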

A second route of development focuses on the role of communication and collaboration in problem solving. Many complex problems in the real world are tackled not by individuals but by people collaborating in teams. Collaboration brings certain benefits, e.g., sharing knowledge, combining specialist skills, or distributing work, but it also introduces difficulties through miscommunication, coordination losses, and potential goal conflicts. Given that collaborative problem solving (CoPS) is a key skill for the twenty-first century, as identified for example by the World Economic Forum and the OECD, the question arises of how it can be measured appropriately. CoPS is a complex activity with closely intertwined cognitive, social and self-regulatory aspects, which may require new approaches to measurement and the development of a suitable theoretical framework.

Currently, several approaches are being piloted by different research groups, and it is still too early to say what is likely to work in the long run. We suggest building on existing approaches using computer-simulated microworlds, which can be systematically expanded towards interactive CoPS assessments. One advantage of these microworlds is that they are already implemented on computers, which makes it easy to integrate electronic communication mechanisms into the scenarios. Furthermore, in many of these scenarios social interactions and communication can be made a natural part of the simulation. Finally, there are established procedures for administering and scoring microworld scenarios in assessment, which makes them a convenient starting point for further developments.

The scoring of CoPS performance could be conducted along an “individual problem solving” axis (e.g., information search or strategic activities), drawing on existing scoring criteria for simulation scenarios, and a “collaboration” axis (e.g., effective communication, organization, or knowledge sharing). The framework proposed by the OECD for the Programme for International Student Assessment 2015 (PISA) provides a good illustration of what a competency matrix with these two main dimensions could look like. One of the main challenges in constructing CoPS assessments will be to devise problem solving situations with suitable communication demands and to define appropriate scoring criteria for analysing communication behavior. Another challenge will be the validation of the derived scores against relevant external criteria, such as performance in real-world collaborative problem solving activities. Standardized computer-based CoPS tests would fill an important niche in assessing systems competency, particularly with respect to one of the central non-cognitive factors – communication – which has so far been largely neglected in problem solving assessment.
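Purely as an illustration of what such a two-dimensional scoring scheme might look like, the following sketch represents the two axes as a small data structure. The indicator names and weights are hypothetical and are not taken from the PISA 2015 framework.

```python
# Hypothetical two-dimensional scoring scheme for CoPS performance.
# Indicator names and weights are invented for illustration only.
scoring_scheme = {
    "individual problem solving": {
        "information search": 0.25,       # e.g., proportion of informative interventions
        "strategic activities": 0.25,     # e.g., systematic exploration of the system
    },
    "collaboration": {
        "effective communication": 0.25,  # e.g., task-relevant messages sent to partners
        "knowledge sharing": 0.25,        # e.g., results reported back to the team
    },
}

def total_score(indicator_scores):
    """Weighted sum over both axes; indicator_scores maps indicator -> score in [0, 1]."""
    return sum(weight * indicator_scores.get(indicator, 0.0)
               for axis in scoring_scheme.values()
               for indicator, weight in axis.items())

print(total_score({"information search": 0.8, "effective communication": 0.6}))
```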

Conclusion

In the twentieth century, skilled routine behaviour was a key to success. The challenges of the twenty-first century require humans’ problem solving competency more than ever. To assess this competency, we need knowledge-rich test situations that represent the full range of complexity in a diversity of domains. To measure systems competency, the interaction between these diverse environments on the one hand and the person with his or her abilities and skills on the other needs to be carefully analyzed. Assessment of transversal (in educational contexts, cross-curricular) competencies cannot be achieved with one or two types of assessment. The plurality of skills and abilities requires a plurality of assessment instruments. Think of a good orchestra: if only violins are playing, you will not hear the full sound. And even if the triangle is played for only 1 min in a 60-min performance, we do not want to miss it. To reduce a complete orchestra to the most frequently used instruments might be a proposal made by business consultants, but it would hopefully never be realized. For an assessment of problem solving competency that offers a realistic valuation of persons, we too need tools that tap a wide range of contributors – relevant cognitive and non-cognitive components alike. Systems competency may be fundamental for successfully dealing with the uncertainties of the twenty-first century – we have to be able to assess it!

Notes

Acknowledgment

The authors wish to thank Bruce Beswick, Esther Care, and Mark Wilson for extensive and helpful comments on earlier versions of this manuscript. The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. The research was supported by a grant from the German Research Foundation (DFG) to the first author (Az. Fu 173/14).

References

  1. Alison, L., van den Heuvel, C., Waring, S., Power, N., Long, A., O’Hara, T., & Crego, J. (2013). Immersive simulated learning environments for researching critical incidents: A knowledge synthesis of the literature and experiences of studying high-risk strategic decision making. Journal of Cognitive Engineering and Decision Making, 7(3), 255–272. doi:10.1177/1555343412468113.
  2. Barth, C. M., & Funke, J. (2010). Negative affective environments improve complex solving performance. Cognition & Emotion, 24, 1259–1268. doi:10.1080/02699930903223766.
  3. Betsch, T., & Haberstroh, S. (Eds.). (2005). The routines of decision making. Mahwah: Erlbaum.
  4. Brehmer, B. (1996). Man as a stabiliser of systems: From static snapshots of judgment processes to dynamic decision making. Thinking and Reasoning, 2, 225–238. doi:10.1080/135467896394528.
  5. Brehmer, B. (2005). Micro-worlds and the circular relation between people and their environment. Theoretical Issues in Ergonomics Science, 6(1), 73–93. doi:10.1080/14639220512331311580.
  6. Broadbent, D. E., & Aston, B. (1978). Human control of a simulated economic system. Ergonomics, 21, 1035–1043.
  7. Chang, E. C., D’Zurilla, T. J., & Sanna, L. J. (Eds.). (2004). Social problem solving: Theory, research, and training. Washington, DC: APA.
  8. Csapó, B., & Funke, J. (Eds.). (2017). The nature of problem solving: Using research to inspire 21st century learning. Paris: OECD Publishing. doi:10.1787/9789264273955-en.
  9. Danner, D., Hagemann, D., Holt, D. V., Hager, M., Schankin, A., Wüstenberg, S., & Funke, J. (2011). Measuring performance in a complex problem solving task: Reliability and validity of the Tailorshop simulation. Journal of Individual Differences, 32, 225–233. doi:10.1027/1614-0001/a000055.
  10. Dörner, D. (1975). Wie Menschen eine Welt verbessern wollten [How people wanted to improve a world]. Bild der Wissenschaft, 12, 48–53.
  11. Dörner, D. (1997). The logic of failure: Recognizing and avoiding error in complex situations. New York: Basic Books.
  12. Dörner, D., & Reither, F. (1978). Über das Problemlösen in sehr komplexen Realitätsbereichen [On problem solving in very complex domains of reality]. Zeitschrift für Experimentelle und Angewandte Psychologie, 25, 527–551.
  13. Dörner, D., & Funke, J. (2017). Complex problem solving: What it is and what it is not. Frontiers in Psychology, 8, 1153. doi:10.3389/fpsyg.2017.01153.
  14. Duncker, K. (1935). Zur Psychologie des produktiven Denkens [On the psychology of productive thinking]. Berlin: Julius Springer.
  15. Edwards, W. (1962). Dynamic decision theory and probabilistic information processing. Human Factors, 4, 59–73.
  16. Fiedler, K. (2001). Affective states trigger processes of assimilation and accommodation. In L. L. Martin & G. L. Clore (Eds.), Theories of mood and cognition: A user’s guidebook (pp. 85–98). Mahwah: Erlbaum.
  17. Fischer, A. (2015). Assessment of problem solving skills by means of multiple complex systems – Validity of finite automata and linear dynamic systems. Dissertation, Heidelberg University.
  18. Fischer, A., & Neubert, J. C. (2015). The multiple faces of complex problems: A model of problem solving competency and its implications for training and assessment. Journal of Dynamic Decision Making, 1(6), 1–14. doi:10.11588/jddm.2015.1.23945.
  19. Fischer, A., Greiff, S., Wuestenberg, S., Fleischer, J., Buchwald, F., & Funke, J. (2015). Assessing analytic and interactive aspects of problem solving competency. Learning and Individual Differences, 39, 172–179. doi:10.1016/j.lindif.2015.02.008.
  20. Fleck, J. I., & Weisberg, R. W. (2013). Insight versus analysis: Evidence for diverse methods in problem solving. Journal of Cognitive Psychology, 25, 436–463. doi:10.1080/20445911.2013.779248.
  21. Forgas, J. P. (2007). When sad is better than happy: Negative affect can improve the quality and effectiveness of persuasive messages and social influence strategies. Journal of Experimental Social Psychology, 43, 513–528. doi:10.1016/j.jesp.2006.05.006.
  22. Frensch, P. A., & Funke, J. (1995a). Definitions, traditions, and a general framework for understanding complex problem solving. In P. A. Frensch & J. Funke (Eds.), Complex problem solving: The European perspective (pp. 3–25). Hillsdale: Erlbaum.
  23. Frensch, P. A., & Funke, J. (Eds.). (1995b). Complex problem solving: The European perspective. Hillsdale: Erlbaum.
  24. Funke, J. (2010). Complex problem solving: A case for complex cognition? Cognitive Processing, 11(2), 133–142. doi:10.1007/s10339-009-0345-0.
  25. Funke, J. (2014a). Analysis of minimal complex systems and complex problem solving require different forms of causal cognition. Frontiers in Psychology, 5, 739. doi:10.3389/fpsyg.2014.00739.
  26. Funke, J. (2014b). Problem solving: What are the important questions? In P. Bello, M. Guarini, M. McShane, & B. Scassellati (Eds.), Proceedings of the 36th annual conference of the Cognitive Science Society (pp. 493–498). Austin: Cognitive Science Society.
  27. Funke, J., Fischer, A., & Holt, D. V. (2017). When less is less: Solving multiple simple problems is not complex problem solving – A comment on Greiff et al. (2015). Journal of Intelligence, 5(1), 5. doi:10.3390/jintelligence5010005.
  28. Greiff, S., & Neubert, J. C. (2014). On the relation of complex problem solving, personality, fluid intelligence, and academic achievement. Learning and Individual Differences, 36, 37–48. doi:10.1016/j.lindif.2014.08.003.
  29. Greiff, S., Fischer, A., Wüstenberg, S., Sonnleitner, P., Brunner, M., & Martin, R. (2013a). A multitrait-multimethod study of assessment instruments for complex problem solving. Intelligence, 41, 579–596. doi:10.1016/j.intell.2013.07.012.
  30. Greiff, S., Holt, D. V., & Funke, J. (2013b). Perspectives on problem solving in educational assessment: Analytical, interactive, and collaborative problem solving. Journal of Problem Solving, 6, 71–91. doi:10.7771/1932-6246.1153.
  31. Greiff, S., Fischer, A., Stadler, M., & Wüstenberg, S. (2015). Assessing complex problem-solving skills with multiple complex systems. Thinking & Reasoning, 21(3), 356–382. doi:10.1080/13546783.2014.989263.
  32. Griffin, P., & Care, E. (2015). The ATC21S method. In P. Griffin & E. Care (Eds.), Assessment and teaching of 21st century skills: Methods and approach (pp. 3–33). Dordrecht: Springer.
  33. Heckman, J. J., & Rubinstein, Y. (2001). The importance of noncognitive skills: Lessons from the GED testing program. American Economic Review, 91(2), 145–149.
  34. Heppner, P. P. (2008). Expanding the conceptualization and measurement of applied problem solving and coping: From stages to dimensions to the almost forgotten cultural context. American Psychologist, 63, 803–805. doi:10.1037/0003-066X.63.8.805.
  35. Huber, O. (2012). Risky decisions: Active risk management. Current Directions in Psychological Science, 21(1), 26–30. doi:10.1177/0963721411422055.
  36. Isen, A. M., Daubman, K. A., & Nowicki, G. P. (1987). Positive affect facilitates creative problem solving. Journal of Personality and Social Psychology, 52, 1122–1131.
  37. Klein, G. (2008). Naturalistic decision making. Human Factors, 50, 456–460. doi:10.1518/001872008X288385.
  38. Kriz, W. C. (2003). Creating effective learning environments and learning organizations through gaming simulation design. Simulation & Gaming, 34(4), 495–511. doi:10.1177/1046878103258201.
  39. Kriz, W. C. (2008). A systemic-constructivist approach to the facilitation and debriefing of simulations and games. Simulation & Gaming, 41, 663–680. doi:10.1177/1046878108319867.
  40. Leutner, D., Fleischer, J., Wirth, J., Greiff, S., & Funke, J. (2012). Analytische und dynamische Problemlösekompetenz im Lichte internationaler Schulleistungsvergleichsstudien: Untersuchungen zur Dimensionalität [Analytical and dynamic problem-solving competence from an international educational studies perspective: Analysis of dimensionality]. Psychologische Rundschau, 63, 34–42. doi:10.1026/0033-3042/a000108.
  41. MacKinnon, A. J., & Wearing, A. J. (1980). Complexity and decision making. Behavioral Science, 25, 285–296.
  42. McCarthy, J. (1956). The inversion of functions defined by Turing machines. In C. E. Shannon & J. McCarthy (Eds.), Automata studies (AM-34). Princeton: Princeton University Press.
  43. Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood Cliffs: Prentice-Hall.
  44. O’Neil, H. F., Chuang, S., & Chung, G. K. W. K. (2003). Issues in the computer-based assessment of collaborative problem solving. Assessment in Education: Principles, Policy & Practice, 10(3), 361–373. doi:10.1080/0969594032000148190.
  45. OECD. (2013). PISA 2012 assessment and analytical framework: Mathematics, reading, science, problem solving and financial literacy. Paris: OECD.
  46. OECD. (2014). PISA 2012 results: Creative problem solving: Students’ skills in tackling real-life problems (Vol. V). Paris: OECD Publishing.
  47. Rittel, H. W. J., & Webber, M. M. (1973). Dilemmas in a general theory of planning. Policy Sciences, 4, 155–169.
  48. Schmidt, F. L., & Hunter, J. E. (1998). The validity and utility of selection methods in personnel psychology: Practical and theoretical implications of 85 years of research findings. Psychological Bulletin, 124(2), 262–274. doi:10.1037/0033-2909.124.2.262.
  49. Schoppek, W., & Fischer, A. (2015). Complex problem solving – single ability or complex phenomenon? Frontiers in Psychology, 6(1669), 1–4. doi:10.3389/fpsyg.2015.01669.
  50. Selten, R., Pittnauer, S., & Hohnisch, M. (2012). Dealing with dynamic decision problems when knowledge of the environment is limited: An approach based on goal systems. Journal of Behavioral Decision Making, 25(5), 443–457. doi:10.1002/bdm.738.
  51. Spering, M., Wagener, D., & Funke, J. (2005). The role of emotions in complex problem-solving. Cognition & Emotion, 19(8), 1252–1261. doi:10.1080/02699930500304886.
  52. Sternberg, R. J., & Frensch, P. A. (Eds.). (1991). Complex problem solving: Principles and mechanisms. Hillsdale: Erlbaum.
  53. Strohschneider, S., & Güss, D. (1999). The fate of the Moros: A cross-cultural exploration of strategies in complex and dynamic decision making. International Journal of Psychology, 34, 235–252.
  54. Treffinger, D. J., Isaksen, S. G., & Dorval, K. B. (1994). Creative problem solving: An overview. In M. A. Runco (Ed.), Problem finding, problem solving and creativity (pp. 223–236). Norwood: Ablex.
  55. Vester, F. (2007). The art of interconnected thinking: Ideas and tools for tackling complexity. Munich: MCB-Verlag.
  56. Vollmeyer, R., & Rheinberg, F. (2000). Does motivation affect performance via persistence? Learning and Instruction, 10, 293–309. doi:10.1016/S0959-4752(99)00031-6.
  57. Willis, S. L. (1996). Everyday problem solving. In J. E. Birren, K. W. Schaie, et al. (Eds.), Handbook of the psychology of aging (4th ed., pp. 287–307). San Diego: Academic Press.
  58. Wood, C. (2006). The development of creative problem solving in chemistry. Chemistry Education Research and Practice, 7(2), 96–113. doi:10.1039/b6rp90003h.
  59. Zelazo, P. D., Carter, A., Reznick, J. S., & Frye, D. (1997). Early development of executive function: A problem-solving framework. Review of General Psychology, 1(2), 198–226.

Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  • Joachim Funke (1)
  • Andreas Fischer (2)
  • Daniel V. Holt (1)

  1. Department of Psychology, Heidelberg University, Heidelberg, Germany
  2. Research Institute for Vocational Education and Training, Nürnberg, Germany
