# Development of a Framework to Characterise the Openness of Mathematical Tasks


## Mathematical Problem, Investigation and Real-Life Task

An extensive review of relevant literature has revealed two different viewpoints on what a mathematical problem is. The first viewpoint is that whether a situation is a problem depends on the individual. One of the earliest references to this view was from Henderson & Pingry (1953), who defined a situation to be a problem for a person if ‘blocking of the path toward the goal occurs, and the individual’s fixed patterns of behavior or habitual responses are not sufficient for removing the block’ (p. 230). Consider a typical textbook question:

Solve the quadratic equation x² + 2x − 3 = 0.

This task may just be routine practice of procedural skills that students have learnt earlier in class (Lester, 1980), so they may know immediately what to do to solve it. However, the task may be a problem to students who have not been taught the procedure, or to low-ability students who have just learnt the procedure but do not know how to apply it properly. Nevertheless, with enough practice, the task can become a routine exercise to these students. Some educators call tasks of this type ‘routine problems’ (Orton & Frobisher, 1996, p. 27), but such tasks may not be problems to some students. Moreover, for students who have not practised these ‘routine’ tasks found in the textbook, the tasks are not even routine. Cockcroft (1982) used the term ‘familiar or unfamiliar tasks’ to indicate whether a task is familiar or unfamiliar to a student. Let us contrast task 1a with another example:
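The routine procedure referred to above can be sketched in code. The following Python snippet is not part of the original task; it is only an illustration of the kind of procedural skill involved, here the quadratic formula applied to task 1a:

```python
import math

def solve_quadratic(a, b, c):
    """Routine procedure: apply the quadratic formula to ax^2 + bx + c = 0."""
    disc = b * b - 4 * a * c  # discriminant
    if disc < 0:
        return []  # no real roots
    root = math.sqrt(disc)
    # Use a set so a repeated root is listed once
    return sorted({(-b - root) / (2 * a), (-b + root) / (2 * a)})

# Task 1a: x^2 + 2x - 3 = 0 factorises as (x + 3)(x - 1) = 0
print(solve_quadratic(1, 2, -3))  # → [-3.0, 1.0]
```

A student applying the taught procedure by hand would obtain the same roots, x = 1 or x = −3, for instance by factorising the equation as (x + 3)(x − 1) = 0.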

At a workshop, each of the 100 participants shakes hands once with each of the other participants. Find the total number of handshakes.

The main purpose of this task is for students to make use of some problem-solving heuristics, such as looking for patterns, to solve it. But this task may not pose a problem to students who have been exposed to such tasks before, or to high-ability students who have not encountered such problems before but are able to solve it without much difficulty. Therefore, from the first viewpoint, both tasks 1a and 2a can be problems to students who are ‘unable to proceed directly to a solution’ (Lester, 1980, p. 30). Schoenfeld (1985) believed that ‘being a “problem” is not a property inherent in a mathematical task’ (p. 74). Hence, the first viewpoint holds that whether or not a task is a problem depends on the individual.

However, there is another criterion in the second viewpoint to decide whether a mathematical task is a problem: the existence of ‘a clearly defined goal’ (Henderson & Pingry, 1953, p. 230). For example, task 2a has a clearly defined goal in its task statement: find the total number of handshakes. Contrast this with the next example:

Powers of 3 are 3¹, 3², 3³, 3⁴, … Investigate.

Orton & Frobisher (1996) believed that mathematical tasks, such as task 3a, do not specify a goal in their task statements. They claimed that few mathematics educators would classify investigations of this kind as problems. Since ‘investigation’ is a process (Ernest, 1991), task 3a will instead be called an ‘investigative task’ in this article. The purpose of investigative tasks is for students to investigate and discover the underlying patterns or mathematical structures. If we subscribe to the first viewpoint, an investigative task can still pose a problem to any student who does not know how or what to investigate (Evans, 1987).

There is another kind of open task that is rather different from the open investigative tasks. Consider the following example:

Design a playground for the school.

Find a solution of the quadratic equation x² + 2x − 3 = 0.

Table 1


## Open Goal

The problem with a general goal, such as ‘investigate’, is that it is ill-defined in the sense that it does not clearly define what students are supposed to investigate. A research study by the author revealed that secondary school students who had not encountered such open investigative tasks before were unable to start investigating because they did not understand what it means to investigate (Yeo, 2008). So, the question is whether the ill-defined goal of an investigative task can be defined more clearly to help students understand the task requirement while still keeping the goal open. Let us consider a paraphrase of task 3a:

Powers of 3 are 3¹, 3², 3³, 3⁴, … Find as many patterns as possible.

The goal of task 3b is to find as many patterns as possible. It is a general goal, since students can still choose their own specific goals to pursue, i.e. they can choose to investigate different patterns. Thus the goal is still open, but it is now better defined because the students know that their goal is to find as many patterns as possible. Whether or not they can find any pattern is a different issue altogether: at least they can start trying, since they understand what the task expects them to do. Therefore, an open goal can be ill-defined or well-defined.
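As an illustration of what ‘finding patterns’ in task 3b might look like, here is a small Python sketch (my addition, not part of the task) that surfaces one well-known pattern: the last digits of the powers of 3 repeat with period 4.

```python
# Last digits of 3^1, 3^2, 3^3, ... — one pattern students might discover in task 3b
last_digits = [pow(3, n, 10) for n in range(1, 13)]
print(last_digits)  # → [3, 9, 7, 1, 3, 9, 7, 1, 3, 9, 7, 1]

# The last digit cycles with period 4: 3, 9, 7, 1, 3, 9, ...
cycle = last_digits[:4]
assert all(last_digits[n] == cycle[n % 4] for n in range(len(last_digits)))
```

Students pursuing other specific goals might instead examine digit sums, differences between consecutive powers, or divisibility patterns; the open goal admits all of these.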

Choose any mathematics project to do. Submit a report at the end of the year.

Choose any mathematician and find out more about his life and mathematical discoveries. Write a two-page biography of the mathematician and submit it at the end of the year.

To summarise, a task has a closed goal if there is a specific goal in the task statement. Otherwise, the task has an open goal, of which there are two types: an ill-defined goal, such as ‘investigate’, is vague, while a well-defined goal, such as ‘find as many patterns as possible’, is more clearly defined. Table 2 provides a summary of the different types of goal. Since a closed goal is always well-defined, the last cell is left blank.
Table 2

Types of goal

|                   | Closed goal | Open goal |
|-------------------|-------------|-----------|
| Well-defined goal | ✓           | ✓         |
| Ill-defined goal  | N.A.        | ✓         |

## Open Method

In task 2a, the method is also indeterminate, i.e. it is impossible to determine all the methods of solution. For example, students may try to solve task 2a by starting with a smaller number of participants in order to find a pattern and then generalise, or they may use logical reasoning to deduce that the total number of handshakes is 99 + 98 + 97 + … + 1. But this does not mean that there are no other methods. In fact, there is a method using combinatorics that few students will think of: since every different pair of participants gives rise to one distinct handshake, the total number of handshakes is ¹⁰⁰C₂. However, task 1a also has an indeterminate method, as there may be methods other than the three mentioned in the preceding paragraph. For example, the answer to task 1a can be found by trial and error, or by using graphing software. Thus, the idea of determinateness is not helpful here.
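The two methods named above can be checked against one another. A short Python sketch (added here for illustration; it is not itself one of the students’ methods) confirms that the pattern-based sum 99 + 98 + … + 1 and the combinatorial count ¹⁰⁰C₂ agree:

```python
import math

n = 100  # number of participants in task 2a

# Method 1: logical reasoning — the 1st person shakes 99 hands, the 2nd 98 more, ...
by_sum = sum(range(n - 1, 0, -1))  # 99 + 98 + ... + 1

# Method 2: combinatorics — each unordered pair of participants gives one handshake
by_comb = math.comb(n, 2)          # 100C2

print(by_sum, by_comb)  # → 4950 4950
assert by_sum == by_comb == n * (n - 1) // 2
```

Both methods give 4950 handshakes, which also matches the closed form n(n − 1)/2.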

Another option is to look at the nature of the methods. The methods for solving task 1a are procedural skills that students have learnt earlier in class, while the methods for solving task 2a are useful problem-solving heuristics that stimulate mathematical thinking (Orton & Frobisher, 1996). Hence, a more viable definition is that a task has an open method if there are multiple methods of solution involving problem-solving heuristics rather than the mere application of known procedures, and a closed method otherwise. Similarly, investigative tasks, such as task 3a (powers of 3), have an open method since there are multiple methods of solution involving problem-solving heuristics.

However, there is another characteristic of an open method: whether the openness is subject-dependent or task-inherent. The issue with the open method in task 2a, a problem-solving task, is that most students will usually use only one method to solve it, and so the method is closed to them. Some teachers may also teach their students only one method of solution, thus closing the method. One way to open up the task is for the teacher to ask the students to find alternative methods of solution. Another way is to rephrase the task as follows, although technically what is open in task 2b is the answer, which includes all the different methods of solution:

At a workshop, each of the 100 participants shakes hands once with each of the other participants. Find the total number of handshakes using as many methods as possible. Discuss which methods are ‘better’ and in what ways they are ‘better’.

On the other hand, the open method in investigative tasks, such as task 3a, is inherent in the task because it is not possible to use only one method to generate all the correct answers. Similarly, the open method in real-life tasks, such as task 4a, is also task-inherent because the method is ill-defined: it does not depend on the subjects to make the method open.

To summarise, a task has a closed method if there is only one method or if the method involves only a routine application of known procedures. Otherwise, the task has an open method, which can be well-defined or ill-defined: a well-defined method means that it is possible to teach students a method that will lead to the same correct answer, while an ill-defined method means that the same method taught to different students will produce different answers. An open method can also be task-inherent or subject-dependent: a task-inherent method means that the openness of the method does not depend on the subjects (e.g. teachers and students), while a subject-dependent method means that it does. Table 3 provides a summary of the different types of method. Since a closed method is always well-defined and task-inherent, two of the cells are left blank.
Table 3

Types of method

|                     | Closed method | Open method |
|---------------------|---------------|-------------|
| Well-defined method | ✓             | ✓           |
| Ill-defined method  | N.A.          | ✓           |
| Subject-dependent   | N.A.          | ✓           |

## Complexity and Scaffolding

Closely related to the method of solution is scaffolding: how teachers can structure the method of solution into the task statement to guide students if the task is too complex.

Make an open box using a given vanguard sheet so that it has the biggest possible volume.

This is a problem-solving task since it requires the use of some problem-solving heuristics to solve, and it has a closed goal as there is a specific goal in the task statement. The task also has a closed answer because there is only one correct answer and so it is determinate. Just like task 2a (handshake), task 6 has an open method, but there is a sense that something else is open: the task is too complex and there is no scaffolding in the task statement, so the task is open to the students in the sense that they may not even know how to start.

Complexity is a continuum: at both ends, educators may agree that the tasks are either very simple or very complex; the problem always lies in the middle and so there will be grey areas. Moreover, complexity depends on the students: what is complex for primary school students may be simple enough for secondary school students. But educators are more likely to agree that, for the same student, task 6 is more open than task 2a because task 6 is more complex. To make task 6 less open, the teacher can rephrase the task statement to include more guidance on how to solve the task. In fact, it is possible to provide enough scaffolding, in terms of the method of solution, to close the task because the method is well-defined, i.e. task 6 is not inherently open. In other words, this first type of openness is subject-dependent: it depends on the teacher whether to provide enough scaffolding to close the task.
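To illustrate why the method for task 6 is well-defined, here is a Python sketch under an assumed model (my assumption, since the task statement leaves the construction open): squares of side x are cut from the corners of a hypothetical 60 cm square sheet and the flaps folded up, so the volume is V(x) = x(60 − 2x)². A simple grid search then locates the maximum:

```python
# Task 6 modelled as: cut squares of side x from the corners of a square sheet
# of side L, fold up the flaps, and maximise V(x) = x * (L - 2x)^2.
# The 60 cm sheet size is a hypothetical choice for illustration.
L = 60.0

def volume(x):
    return x * (L - 2 * x) ** 2

# Search the feasible range 0 < x < L/2 on a fine grid
xs = [i / 100 for i in range(1, int(L / 2 * 100))]
best_x = max(xs, key=volume)

print(best_x, volume(best_x))  # → 10.0 16000.0
```

Setting dV/dx = (L − 2x)(L − 6x) = 0 gives x = L/6, which for the assumed 60 cm sheet agrees with the numerical answer of x = 10 cm; a teacher’s scaffolding could lead students through exactly these steps, which is why the method can be closed.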

However, if the method is ill-defined, it is not easy to structure the method of solution inside the task statement to close the task. For example, task 4a (playground) is also open, since it is complex and there is no scaffolding in the task statement, but how does one specify all the methods of solution to help the students design a playground? Instead, there is another type of scaffolding that makes the task less open by providing more information, as follows:

Design a playground for the school. The playground must be 20 m by 20 m and it must contain at least one slide and two swings. The budget is \$10,000.

Some authors (e.g. Frederiksen, 1984; Kilpatrick, 1987) referred to tasks such as 4a and 4b as ill-structured because the tasks lack structure. Judging from Simon’s (1978) three criteria of ill-structuredness and his examples of ill-structured tasks, which are all real-life tasks, his idea of an ill-structured task is one with an ill-defined method and an ill-defined answer. Therefore, Simon’s ill-structuredness is an inherent property of the task, i.e. it is not possible to provide enough scaffolding to close the task.

To summarise, a task is closed if it is simple enough for the students. Otherwise, it is open if it is too complex for the students and there is not enough scaffolding in the task statement. There are two types of openness in terms of complexity: the first type is subject-dependent because the teacher can provide enough scaffolding to close the task; the second type is task-inherent because it is inherently not possible to provide enough scaffolding to close the task. Table 4 shows a summary of the different types of complexity and scaffolding. Since a closed task is always task-inherent, the last cell is left blank.
Table 4

Types of complexity and scaffolding

|                   | Task is closed (in terms of complexity) | Task is open (in terms of complexity) |
|-------------------|-----------------------------------------|---------------------------------------|
| Task-inherent     | ✓                                       | ✓                                     |
| Subject-dependent | N.A.                                    | ✓                                     |

## Extension

There are two types of extension: subject-dependent and task-inherent. The issue with extending problem-solving tasks such as task 2a is that most teachers will usually not expect their students to extend, and most students will usually not extend the tasks themselves, thus the tasks are closed to them. So, this kind of extension is subject-dependent. However, for investigative tasks such as task 3a, students are expected to find patterns not only for powers of 3, but also for powers of other numbers. In other words, students are expected to extend investigative tasks by changing the given numbers or conditions. Therefore, extension is task-inherent in investigative tasks: it does not depend on the subjects whether to extend or not.

To summarise, a task is closed if it cannot or should not be extended, i.e. ‘extending’ the task would only lead to new unrelated tasks; otherwise, a task is open if it can be extended. There are two types of extension: task-inherent and subject-dependent. Table 5 shows a summary of the different types of extension. Since a closed task in terms of extension is always task-inherent, the last cell is left blank.
Table 5

Types of extension

|                   | Task is closed (in terms of extension) | Task is open (in terms of extension) |
|-------------------|----------------------------------------|--------------------------------------|
| Task-inherent     | ✓                                      | ✓                                    |
| Subject-dependent | N.A.                                   | ✓                                    |

## Discussion

Table 6

Framework characterising task openness (with some examples)

| Task variable | Types of openness (with some examples)                                                                                                   |
|---------------|------------------------------------------------------------------------------------------------------------------------------------------|
| Goal          | Closed; open and well-defined (e.g. task 3b: find patterns for powers of 3); open and ill-defined (e.g. task 3a: investigate powers of 3) |
| Method        | Closed; open and well-defined; open and ill-defined; open and subject-dependent                                                            |
| Complexity    | Closed; open and subject-dependent                                                                                                         |
| Answer        | Closed; open and well-defined; open and ill-defined                                                                                        |
| Extension     | Closed; open and subject-dependent                                                                                                         |

Refer to the relevant sections for the definitions of the task variables and types of openness.

## Conclusion

A task can be open in various aspects, and the aspect in which it is open may affect the types of thinking processes required to solve it. Doyle (1983) contended that the types of tasks given to students will affect what they learn, while Christiansen & Walther (1986) went further to distinguish between a task, as set up and implemented by the teacher, and the mathematical activity that students engage in when they attempt the task. ‘The purpose of a task is to initiate activity by learners.’ (Mason & Johnston-Wilder, 2006, p. 5)

> The activity itself is not learning: it is just doing tasks. But it is in the course of this activity that learners encounter mathematical ideas and themes, develop and practise techniques, and use their mathematical powers. The activity provides the basis for learning to take place. (Mason & Johnston-Wilder, 2006, p. 69)

The distinction between a task and an activity is important because the original purpose of a task may be lost during its implementation (Stein et al., 1996). Therefore, there is a need to develop a framework to characterise the openness of a task based on different task variables so that teachers can design or choose appropriate tasks to develop in their students different kinds of mathematical processes. Being aware that the openness of a task may be subject-dependent or task-inherent may also help teachers to decide how best to implement the task so as not to close it. However, the impact of the various task variables that are open on different types of students’ learning is still not very clear, so more research needs to be done in this area in order to inform teaching.

### References

1. Becker, J. P. & Shimada, S. (1997). The open-ended approach: A new proposal for teaching mathematics. Reston, VA: National Council of Teachers of Mathematics.
2. Boaler, J. (1998). Open and closed mathematics: Student experiences and understandings. Journal for Research in Mathematics Education, 29, 41–62.
3. Brown, S. I. & Walter, M. I. (2005). The art of problem posing (3rd ed.). Mahwah, NJ: Erlbaum.
4. Cai, J. & Cifarelli, V. (2005). Exploring mathematical exploration: How two college students formulated and solved their own mathematical problems. Focus on Learning Problems in Mathematics, 27(3), 43–72.
5. Christiansen, B. & Walther, G. (1986). Task and activity. In B. Christiansen, A. G. Howson & M. Otte (Eds.), Perspectives on mathematics education: Papers submitted by members of the Bacomet Group (pp. 243–307). Dordrecht, The Netherlands: Reidel.
6. Cockcroft, W. H. (1982). Mathematics counts: Report of the committee of inquiry into the teaching of mathematics in schools under the chairmanship of Dr W H Cockcroft. London, England: Her Majesty’s Stationery Office (HMSO).
7. Doyle, W. (1983). Academic work. Review of Educational Research, 53, 159–199.
8. Ernest, P. (1991). The philosophy of mathematics education. London, England: Falmer Press.
9. Evans, J. (1987). Investigations: The state of the art. Mathematics in School, 16(1), 27–30.
10. Frederiksen, N. (1984). Implications of cognitive theory for instruction in problem solving. Review of Educational Research, 54, 363–407.
11. Frobisher, L. (1994). Problems, investigations and an investigative approach. In A. Orton & G. Wain (Eds.), Issues in teaching mathematics (pp. 150–173). London, England: Cassell.
12. Henderson, K. B. & Pingry, R. E. (1953). Problem solving in mathematics. In H. F. Fehr (Ed.), The learning of mathematics: Its theory and practice (pp. 228–270). Washington, DC: National Council of Teachers of Mathematics.
13. Henningsen, M. & Stein, M. K. (1997). Mathematical tasks and student cognition: Classroom-based factors that support and inhibit high-level mathematical thinking and reasoning. Journal for Research in Mathematics Education, 28, 524–549.
14. Hiebert, J. & Wearne, D. (1993). Instructional tasks, classroom discourse, and students’ learning in second-grade arithmetic. American Educational Research Journal, 30, 393–425.
15. Jaworski, B. (1994). Investigating mathematics teaching: A constructivist enquiry. London, England: Falmer Press.
16. Kaiser, G. & Sriraman, B. (2006). A global survey of international perspectives on modelling in mathematics education. Zentralblatt für Didaktik der Mathematik, 38, 302–310.
17. Kilpatrick, J. (1987). Problem formulating: Where do good problems come from? In A. H. Schoenfeld (Ed.), Cognitive science and mathematics education (pp. 123–147). Hillsdale, NJ: Erlbaum.
18. Klavir, R. & Hershkovitz, S. (2008). Teaching and evaluating ‘open-ended’ problems. International Journal for Mathematics Teaching and Learning. Retrieved from http://www.cimt.plymouth.ac.uk/journal.
19. Lampert, M. (1990). When the problem is not the question and the solution is not the answer: Mathematical knowing and teaching. American Educational Research Journal, 27, 29–63.
20. Lerman, S. (1989). Investigations: Where to now? In P. Ernest (Ed.), Mathematics teaching: The state of the art (pp. 73–80). London, England: Falmer.
21. Lester, F. K., Jr. (1980). Problem solving: Is it a problem? In M. M. Lindquist (Ed.), Selected issues in mathematics education (pp. 29–45). Berkeley, CA: McCutchan.
22. Lingefjärd, T. & Meier, S. (2010). Teachers as managers of the modelling process. Mathematics Education Research Journal, 22, 92–107.
23. Mason, J., Burton, L. & Stacey, K. (1985). Thinking mathematically (Rev. ed.). Wokingham, England: Addison-Wesley.
24. Mason, J. & Johnston-Wilder, S. (2006). Designing and using mathematical tasks. St Albans, England: Tarquin Publications.
25. Moschkovich, J. N. (2002). Bringing together workplace and academic mathematical practices during classroom assessments. In E. Yackel, M. E. Brenner & J. N. Moschkovich (Eds.), Everyday and academic mathematics in the classroom (pp. 93–110). Reston, VA: National Council of Teachers of Mathematics.
26. National Council of Teachers of Mathematics (1991). Professional standards for teaching mathematics. Reston, VA: National Council of Teachers of Mathematics.
27. Orton, A. & Frobisher, L. (1996). Insights into teaching mathematics. London, England: Cassell.
28. Pirie, S. (1987). Mathematical investigations in your classroom: A guide for teachers. Basingstoke, England: Macmillan.
29. Reys, R. E., Lindquist, M. M., Lambdin, D. V., Smith, N. L. & Suydam, M. N. (2012). Helping children learn mathematics (10th ed.). Hoboken, NJ: Wiley.
30. Ronis, D. (2001). Problem-based learning for math and science: Integrating inquiry and the Internet. Arlington Heights, IL: SkyLight.
31. Schoenfeld, A. H. (1985). Mathematical problem solving. Orlando, FL: Academic.
32. Schoenfeld, A. H. (1988). When good teaching leads to bad results: The disasters of “well-taught” mathematics courses. Educational Psychologist, 23, 145–166.
33. Sheffield, L. J., Meissner, H. & Foong, P. Y. (2004). Developing mathematical creativity in young children. Paper presented at the Tenth International Congress on Mathematical Education, Copenhagen, Denmark.
34. Silver, E. A. (1994). On mathematical problem posing. For the Learning of Mathematics, 14(1), 19–28.
35. Simon, H. A. (1978). Information-processing theory of human problem solving. In W. K. Estes (Ed.), Handbook of learning and cognitive processes (Vol. 5, pp. 271–295). Hillsdale, NJ: Erlbaum.
36. Skovsmose, O. (2002). Landscapes of investigation. In L. Haggarty (Ed.), Teaching mathematics in secondary schools (pp. 115–128). London, England: RoutledgeFalmer.
37. Stein, M. K., Grover, B. W. & Henningsen, M. (1996). Building student capacity for mathematical thinking and reasoning: An analysis of mathematical tasks used in reform classrooms. American Educational Research Journal, 33, 455–488.
38. Wolf, A. (1990). Testing investigations. In P. Dowling & R. Noss (Eds.), Mathematics versus the national curriculum (pp. 137–153). London, England: Falmer Press.
39. Yeo, J. B. W. (2008). Secondary school students investigating mathematics. In M. Goos, R. Brown & K. Makar (Eds.), Proceedings of the 31st annual conference of the Mathematics Education Research Group of Australasia (MERGA): Navigating currents and charting directions (Vol. 2, pp. 613–619). Brisbane, Australia: MERGA.
40. Yeo, J. B. W. & Yeap, B. H. (2010). Characterising the cognitive processes in mathematical investigation. International Journal for Mathematics Teaching and Learning. Retrieved from http://www.cimt.plymouth.ac.uk/journal.