# Development of a Framework to Characterise the Openness of Mathematical Tasks

## Abstract

Educators usually mean different constructs when they speak of open tasks: some may refer to pure-mathematics investigative tasks while others may have authentic real-life tasks in mind; some may think of the answer being open while others may refer to an open method. On the other hand, some educators use different terms, e.g. open and open-ended, to mean the same construct, while others distinguish between these terms. It is difficult to hold a meaningful discussion or to define clearly an area of research on open tasks if the idea of what constitutes the construct of openness is vague. Moreover, what students learn depends on the types of tasks that they are given, and different kinds of tasks place differing cognitive demands on students. Thus, the objectives of this article are to clarify the types of mathematical tasks and develop a framework to characterise their openness based on five task variables: goal, method, task complexity, answer and extension; and to discuss how different types of tasks and openness may affect student learning. The openness framework can help teachers to design or select more appropriate tasks to cater to students with different abilities in order to develop in them various kinds of mathematical thinking processes, and it can also make it easier for researchers to study the interaction between different types of openness and student learning.

### Keywords

Mathematical investigation · Open tasks · Open-ended tasks · Problem solving · Real-life tasks

## Introduction

A closed mathematical task has only one correct answer (Becker & Shimada, 1997). Examples of closed tasks are most procedural tasks found commonly in textbooks whose main purpose is for students to practise procedural skills taught earlier in the class (Lester, 1980). However, many educators (e.g. Lampert, 1990; Schoenfeld, 1988) believe that these skills are of limited use in new situations and so, there is growing support for other types of tasks, such as mathematical problems that stimulate thinking (Mason, Burton, & Stacey, 1985), open problems (Frobisher, 1994), open-ended problems (Klavir & Hershkovitz, 2008), mathematical investigation (Pirie, 1987) and mathematical modelling (Lingefjärd & Meier, 2010). But how are these tasks different from one another? One problem is that different people usually use the same term to mean different constructs, or different terms to refer to the same construct. For example, Boaler (1998) used the terms ‘open’ and ‘open-ended’ interchangeably, but Orton & Frobisher (1996) distinguished between them. Becker & Shimada (1997) advocated the use of open-ended tasks which look like pure-mathematics investigative tasks while Wolf (1990) used the same term to refer to both pure-mathematics investigative tasks and practical tasks. But these practical tasks also resemble Cockcroft’s (1982) project work and Skovsmose’s (2002) real-life landscapes of investigation. Some educators (e.g. Frederiksen, 1984) also call this kind of authentic real-life task ‘ill-structured’. The failure to define some of these constructs clearly can lead to a lot of confusion in any discussion, research or teaching.

Doyle (1983) argued that different tasks require different strategies to solve them and what students learn depends to a large extent on the tasks that are given to them. Henningsen & Stein (1997) also asserted that ‘the nature of tasks can potentially influence and structure the way students think’ (p. 525). Thus, it is important for teachers to understand the types of tasks and their features so that they can choose suitable tasks to elicit appropriate student learning (Hiebert & Wearne, 1993). But how a task is set up by the teacher can be very different from how the task is implemented in the classroom (Stein, Grover & Henningsen, 1996). Some researchers are worried that teachers may teach mathematical investigation in an algorithmic manner by stereotyping certain mathematical processes as a set of procedures to be learnt by students (Jaworski, 1994). For example, Lerman (1989) observed a lesson by an experienced teacher who taught mathematical investigation by telling his students what to do to arrive at an answer when they were stuck, instead of asking guiding questions to stimulate thinking. Hence, a task that is intended to be open can be closed by the teacher in its implementation.

Therefore, the purpose of this article is to clarify the nature of mathematical tasks and the construct of openness so that teachers can better understand the different types of tasks and researchers can clearly define the boundary of their research. The following questions will be addressed: ‘What are the differences between a mathematical problem, an investigation and a real-life task? How are open tasks, open-ended tasks and ill-structured tasks similar to or different from one another? Can we develop a framework to characterise the openness of mathematical tasks using criteria such as the goal, method, task complexity, answer and extension?’ The article will also discuss how different types of openness may affect different kinds of learning so that teachers can design or select more appropriate tasks for their students, and researchers can investigate the effects of the openness of different task variables on students’ thinking processes.

## Mathematical Problem, Investigation and Real-Life Task

There are two main viewpoints of what constitutes a mathematical problem. The first viewpoint is that whether a task is a problem depends on the *individual*. One of the earliest references to this view was from Henderson & Pingry (1953), who defined a situation to be a problem for a person if ‘blocking of the path toward the goal occurs, and the individual’s fixed patterns of behavior or habitual responses are not sufficient for removing the block’ (p. 230). Consider a typical textbook question:

Task 1a: Standard Textbook Task (Quadratic)

Solve the quadratic equation *x*^{2} + 2*x* − 3 = 0.

Solving this equation may pose a *problem* to students who have not been taught the procedure, or to low-ability students who have just learnt the procedure but do not know how to apply it properly. Nevertheless, with enough practice, this task can become a routine exercise to the students. Some educators call this type of task a ‘routine problem’ (Orton & Frobisher, 1996, p. 27), but these tasks may not be problems to some students. Moreover, for students who have not practised these ‘routine’ tasks found in the textbook, the tasks are not even routine. Cockcroft (1982) used the term ‘familiar or unfamiliar tasks’ to indicate whether the tasks are familiar or unfamiliar to a student. Let us contrast task 1a with another example:

Task 2a: Problem-Solving Task (Handshake)

At a workshop, each of the 100 participants shakes hands once with each of the other participants. Find the total number of handshakes.

The main purpose of this task is for students to make use of some problem-solving heuristics, such as looking for patterns, to solve it. But this task may not pose a problem to students who have been exposed to such tasks before, or to high-ability students who have not encountered such problems before but are able to solve it without much difficulty. Therefore, from the first viewpoint, both tasks 1a and 2a can be problems to students who are ‘*unable to proceed directly* to a solution’ (Lester, 1980, p. 30). Schoenfeld (1985) believed that ‘being a “problem” is not a property inherent in a mathematical task’ (p. 74). Hence, the first viewpoint holds that whether or not a task is a problem depends on the individual.

The second viewpoint of what constitutes a mathematical problem involves the *nature* and *purpose* of the task. Task 1a is not a mathematical problem because it requires only a procedure to solve it and the purpose of such a task is to ‘provide students with practice in using standard mathematical procedures, for example, computational algorithms, algebraic manipulations, and use of formulas’ (Lester, 1980, p. 31). On the other hand, task 2a requires ‘some creative effort and higher-level thinking’ (Reys, Lindquist, Lambdin, Smith & Suydam, 2012, p. 115) to solve, not just a direct application of a procedure. Thus, task 2a is a mathematical problem even though it may not pose a problem to some high-ability students. Since tasks 1a and 2a are inherently different and they may or may not be a problem to an individual, some educators have used the phrase ‘mathematical tasks’ (e.g. National Council of Teachers of Mathematics (NCTM), 1991, p. 25; Schoenfeld, 1985, p. 74) instead of ‘mathematical problems’. In this article, task 1a will be called a ‘procedural task’ (Doyle, 1983) since it involves the practice of procedures, while task 2a will be called a ‘problem-solving task’ (Yeo & Yeap, 2010) since it requires the use of some problem-solving heuristics to solve.

Task 3a: Investigative Task (Powers of 3)

Powers of 3 are 3^{1}, 3^{2}, 3^{3}, 3^{4}, … Investigate.

Orton & Frobisher (1996) believed that mathematical tasks, such as task 3a, do not specify a goal in their task statements. They claimed that few mathematics educators would classify investigations of this kind as problems. Since ‘investigation’ is a process (Ernest, 1991), task 3a will instead be called an ‘investigative task’ in this article. The purpose of investigative tasks is for students to investigate and discover the underlying patterns or mathematical structures. If we subscribe to the first viewpoint, an investigative task can still pose a problem to any student who does not know how or what to investigate (Evans, 1987).

Task 4a: Real-Life Task (Playground)

Design a playground for the school.

Moschkovich (2002) and Skovsmose (2002) advocated using real-life tasks or projects, such as designing a scientific research station on the Antarctic coast and planning a major road, where students can learn and utilise mathematics such as measurement, geometry, costing and spatial visualisation. Orton & Frobisher (1996) called them ‘environment problems’ or ‘real-world problems’. The purpose of this kind of task is for students to learn and apply mathematics in real-life situations. Unlike investigative tasks such as task 3a, students usually cannot just depend on their brain power to think of how to design a playground: they will have to do some *research* to find the dimensions of a slide, a swing, etc., and the cost of building these. In a way, this is doing some kind of investigation, but it is different from the type of investigation in task 3a. Task 4a seems to be open, but unlike task 3a, it is closed in its goal since it has a clearly defined goal: design a playground. Therefore, task 4a must be open in some other aspects, such as the answer. The development of a framework to characterise the openness of mathematical tasks will begin with the openness of the answer, as the nature of the answer seems to have a great influence on the characteristics of the other task variables.

## Open Answer

As noted in the ‘Introduction’, Becker & Shimada (1997) characterised a closed task as one that has only one ‘*correct* [emphasis mine] answer’ (p. 1). The emphasis is important. For example, task 1a (quadratic) has two solutions (or answers), but giving both solutions is the *only* correct answer. Similarly, problem-solving tasks, such as task 2a (handshake), are also closed in their answer because there is only one correct answer. A task that does not have a closed answer is said to have an open answer. For example, task 3a (powers of 3) is open in its answer because there are multiple correct answers. Becker and Shimada called this kind of task ‘open-ended’ because the answer, which is at the ‘end’ of the task, is open. However, this definition of a closed and an open answer has a problem. For example, what happens if we rephrase task 1a, which has a closed answer, as follows?

Task 1b: Paraphrase of Standard Textbook Task (Quadratic)

Find a solution of the quadratic equation *x*^{2} + 2*x* − 3 = 0.

Does this mean that the answer for task 1b is now open because it has more than one correct answer? But for what purpose do teachers want to ‘open up’ task 1a by rephrasing it as task 1b? The main issue with task 1b is that all its multiple correct answers can be *determined*, but no one can say for certain that they have found all the correct answers to task 3a. In other words, if we modify the definition of a closed answer to mean that the answer is *determinate*, task 1b will still be closed in its answer. Similarly, tasks 1a and 2a will still be closed in their answer while task 3a will still have an open answer since its answer is indeterminate, i.e. it is not possible to find all the possible correct answers for task 3a.
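To see concretely that the multiple correct answers of task 1b are *determinate*, a minimal sketch can enumerate the complete solution set (Python is used here purely for illustration; the function name is ours, not the article's):

```python
import math

def quadratic_roots(a, b, c):
    """Return the complete set of real roots of ax^2 + bx + c = 0 (a != 0)."""
    disc = b * b - 4 * a * c
    if disc < 0:
        return set()                      # no real roots
    root = math.sqrt(disc)
    return {(-b - root) / (2 * a), (-b + root) / (2 * a)}

# Task 1b: any one element of this set is a correct answer,
# but the set itself is finite and fully determinate.
solutions = quadratic_roots(1, 2, -3)
print(sorted(solutions))  # [-3.0, 1.0]
```

Because the whole set can be written down, rephrasing task 1a as task 1b does not make its answer open in any meaningful sense.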

There is another type of open answer. For example, task 4a (playground) has an open answer in the sense that the students can produce different designs, and the product (i.e. the answer) is indeterminate; but can we judge the product by saying that it is correct or wrong? As long as the students design a playground and not something else, their product is *valid*, but this kind of answer is *subjective*, unlike the multiple *correct* answers in task 3a. Although a student may formulate a conjecture for task 3a that no one in the world knows how to prove or refute, the answer is still either correct or wrong, except that no one knows the actual answer. Another difference between tasks 3a and 4a is that each student is expected to find as many correct answers as possible for task 3a, but each student is required to produce only one product for task 4a. One other characteristic of the type of open answer for task 4a is that the students cannot say for sure when they have accomplished the task since the product can always be modified or improved upon whenever the students have a new insight or a change of mind (Simon, 1978). In this article, the two types of open answer will be called ‘well-defined’ (i.e. objective) and ‘ill-defined’ (i.e. subjective). So tasks 1a, 2a, and 3a have a *well*-*defined* answer in the sense that the answers are clearly defined to be either correct or wrong, but task 4a has an *ill*-*defined* answer as there is no right or wrong answer.

Types of answer

|  | Closed answer | Open answer |
|---|---|---|
| Well-defined answer | Procedural tasks; problem-solving tasks | Investigative tasks |
| Ill-defined answer | N.A. | Real-life tasks |

## Open Goal

As explained in the literature review, Orton & Frobisher (1996) believed that investigative tasks do not have a goal in their task statement, and so investigative tasks are different from problem-solving tasks which contain a goal in their task statement. They defined a task ‘to be “open” when no goal is specified’ (p. 32). But they also described investigative tasks as having a ‘general goal’ (p. 26), which is to investigate. Thus students can choose any *specific* goal to investigate (Cai & Cifarelli, 2005), e.g. two possible specific goals for task 3a (powers of 3) are to find whether there is a pattern in the last digit, and a pattern in the sum of all the digits, of consecutive powers of 3. Therefore, a task is open in its goal if there is no specific goal in the task statement and students are expected to choose their own specific goals to pursue. Hence, investigative tasks, such as task 3a, are open in their goal; while procedural tasks such as task 1a (quadratic), and problem-solving tasks such as task 2a (handshake), are closed in their goal because there is a specific goal in their task statement.
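The two specific goals mentioned above for task 3a can be explored with a minimal sketch (Python, purely illustrative) that lists the last digits and the digit sums of the first few powers of 3:

```python
# Two specific goals a student might choose for task 3a:
# (1) the last digit of consecutive powers of 3,
# (2) the sum of all the digits of consecutive powers of 3.
powers = [3 ** n for n in range(1, 13)]

last_digits = [p % 10 for p in powers]
digit_sums = [sum(int(d) for d in str(p)) for p in powers]

print(last_digits)  # the last digits repeat in a cycle of length 4: 3, 9, 7, 1, ...
print(digit_sums)   # from 3^2 onwards every digit sum is divisible by 9
```

Each printed list invites a conjecture (a repeating cycle; divisibility by 9), which is exactly the kind of specific goal a student can choose to pursue within the open general goal.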

One problem with a *general* goal, such as ‘investigate’, is that it is *ill-defined* in the sense that it does not clearly define what students are supposed to investigate. A research study by the author revealed that secondary school students who had not encountered such open investigative tasks before were unable to start investigating because they did not understand what it means to investigate (Yeo, 2008). So, the question is whether the ill-defined goal of an investigative task can be defined more clearly to help the students understand the task requirement, and yet keep the goal open. Let us consider a paraphrase of task 3a:

Task 3b: Paraphrase of Investigative Task (Powers of 3)

Powers of 3 are 3^{1}, 3^{2}, 3^{3}, 3^{4}, … Find as many patterns as possible.

The goal of task 3b is to find as many patterns as possible. It is a general goal since students can still choose their own specific goals to pursue, i.e., they can choose to investigate different patterns. Thus the goal is still open, but it is now more *well-defined* because the students know that their goal is to find as many patterns as possible. Whether or not they can find any pattern is a different issue altogether: at least they can start trying since they can understand what the task expects them to do. Therefore, an open goal can be ill-defined or well-defined.

An open answer, however, need *not* be paired with an open goal: task 4a (playground) is closed in its goal as there is a specific goal in its task statement: design a playground. But students pursuing the same goal will end up with different designs, i.e. the answer is open. Nevertheless, there is still a difference between the closed goal in task 1a and in task 4a: in the former, the students will know that they have achieved the goal if they have found an answer because the answer is well-defined; but in the latter, it ‘may not even be obvious when a solution has been reached’ (Kilpatrick, 1987, p. 134) since the ‘criterion that determines whether the goal has been attained is both more complex and less definite’ (Simon, 1978, p. 286) because the answer is ill-defined. The next question is whether it is possible for a task with an ill-defined answer to have an open goal, instead of a closed goal like in task 4a. Let us consider the following real-life tasks:

Task 5a: Real-Life Task (Project)

Choose any mathematics project to do. Submit a report at the end of the year.

Task 5b: Real-Life Task (Project)

Choose any mathematician and find out more about his life and mathematical discoveries. Write a two-page biography of the mathematician and submit it at the end of the year.

Both tasks have an ill-defined answer because the answer is subjective, but the goal is open because students can choose any project to do or any mathematician to write a biography on. Thus, it is possible for a task with an ill-defined answer to have an open goal. The next question is whether the goals in tasks 5a and 5b are well-defined or ill-defined. In any classification where the attributes are not distinct but exist in a continuum, there will always be grey areas. Readers may not agree on whether the goals in tasks 5a and 5b are well-defined or ill-defined, but they are more likely to agree that the goal in task 5b is more well-defined than the goal in task 5a because it helps to narrow the scope of the mathematics project. Therefore, it is possible for a task with an ill-defined open answer to have either a closed goal (as in task 4a), a well-defined open goal (as in task 5b), or an ill-defined open goal (as in task 5a). On the other hand, a task with a well-defined open answer, such as tasks 3a and 3b, cannot have a closed goal. Hence, we see how the nature of the answer affects the openness of the goal.

Types of goal

|  | Closed goal | Open goal |
|---|---|---|
| Well-defined goal | Procedural tasks; problem-solving tasks; some real-life tasks such as task 4a | Some investigative tasks such as task 3b; some real-life tasks such as task 5b |
| Ill-defined goal | N.A. | Some investigative tasks such as task 3a; some real-life tasks such as task 5a |

## Open Method

Frobisher (1994) attributed ‘the openness [of problem-solving tasks] to the method of solution, not to the solution’ (p. 158). Sheffield, Meissner & Foong (2004) used the term ‘open-middle’ to describe this kind of task since the method, which lies in the middle between the goal and the answer, is open. For example, there are multiple methods to solve problem-solving tasks such as task 2a (handshake). However, procedural tasks, such as task 1a (quadratic), also have multiple methods of solution, e.g. factorisation, completing the square and the quadratic formula. But do we really want to say that procedural tasks are also open in their method? In the section on open answer, the initial definition of an open answer as consisting of multiple correct answers had to be modified to one whose answer is indeterminate because of a counterexample from task 1b. So, the question is whether we can use the idea of determinateness to refine the definition of an open method.

In task 2a, the method is also indeterminate, i.e. it is impossible to determine all the methods of solution. For example, students may try to solve task 2a by starting with a smaller number of participants in order to find a pattern so as to generalise, or they may use logical reasoning to deduce that the total number of handshakes is 99 + 98 + 97 + … + 1. But this does not mean that there are no other methods. In fact, there is a method using combinatorics that few students will think of: since every different pair of participants will give rise to one distinct handshake, so the total number of handshakes is ^{100}C_{2}. However, task 1a also has an indeterminate method as there may be more methods other than the three methods mentioned in the preceding paragraph. For example, the answer to task 1a can be found by trial and error, or by using a graphing software. Thus, the idea of determinateness is not helpful here.
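A short sketch (Python, purely illustrative) can confirm that the methods mentioned above for task 2a all agree: the pattern-based sum 99 + 98 + … + 1, its closed form, and the combinatorial count of pairs:

```python
from math import comb

n = 100  # number of participants in task 2a

# Method 1: sum 99 + 98 + ... + 1 (each new arrival shakes hands with all those present)
by_sum = sum(range(1, n))

# Method 2: the closed form of that sum, n(n - 1)/2
by_formula = n * (n - 1) // 2

# Method 3: combinatorics -- every distinct pair of participants gives one handshake
by_pairs = comb(n, 2)

print(by_sum, by_formula, by_pairs)  # 4950 4950 4950
```

That three genuinely different lines of reasoning reach the same well-defined answer is what makes the open method of task 2a ‘well-defined’ in the sense developed below.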

Another option is to look at the *nature* of the methods. The methods for solving task 1a are procedural skills that students have learnt earlier in class, while the methods for solving task 2a are useful problem-solving heuristics that stimulate mathematical thinking (Orton & Frobisher, 1996). Hence, a more viable definition is that a task has an open method if there are multiple methods of solution involving problem-solving heuristics rather than the mere application of known procedures, and a closed method otherwise. Similarly, investigative tasks, such as task 3a (powers of 3), have an open method since there are multiple methods of solution involving problem-solving heuristics.

But there is a second type of open method. For real-life tasks such as task 4a (playground), ‘there is no *simple* [emphasis mine] “legal move generator” for finding all of the alternative possibilities at each step’ (Simon, 1978, p. 286). Doyle (1983), when describing writing a composition whose answer can also be classified as ill-defined in our context, believed that ‘*specific* [emphasis mine] formulas for generating paragraphs may not exist’ (p. 165). But Doyle also conceded that it is still possible to teach students how to write simple sentences and then combine them into more complex sentences when writing a composition. Similarly for task 4a, it is still possible to teach students how to find the dimensions and costs of the various components of a playground, how to decide which components to choose if there is a budget, and how to draw these components inside a playground using an appropriate scale. However, it is impossible to teach students a method that will lead to the *same* product (or answer) since different students *will* produce different products even when they are taught the same method: no two compositions or playground designs will be exactly the same. In fact, if two products are exactly the same, the students will be suspected of plagiarism. This kind of open method is unlike the first type of open method in tasks 2a and 3a where it is possible to teach students a method that will lead to the same correct answer because the answer is well-defined. In this article, the first type of open method will be called ‘well-defined’ while the second type will be called ‘ill-defined’.
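The budget decision mentioned above for task 4a can itself be taught as a small procedure, e.g. enumerating which combinations of components are affordable. The sketch below illustrates this; the component names and prices are hypothetical assumptions made for illustration, not data from the article:

```python
from itertools import combinations

# Hypothetical playground components and costs in dollars (illustrative assumptions only).
components = {"slide": 3500, "swing": 1200, "seesaw": 900, "climbing frame": 2800}
budget = 6000

# One teachable step: enumerate every combination of components that fits the budget.
feasible = []
for r in range(1, len(components) + 1):
    for combo in combinations(components, r):
        cost = sum(components[c] for c in combo)
        if cost <= budget:
            feasible.append((combo, cost))

for combo, cost in feasible:
    print(combo, cost)
```

Note that even if every student is taught this same procedure, each can legitimately pick a different feasible combination and arrange it differently, which is precisely why the method for task 4a remains ill-defined: no taught procedure determines the final product.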

Task 2b: Paraphrase of Problem-Solving Task (Handshake)

At a workshop, each of the 100 participants shakes hands once with each of the other participants. Find the total number of handshakes using as many methods as possible. Discuss which methods are ‘better’ and in what ways they are ‘better’.

For problem-solving tasks, this kind of open method is subject-dependent: most students will use only one method to solve task 2a unless the task is rephrased, as in task 2b, to require as many methods as possible. On the other hand, the open method in investigative tasks, such as task 3a, is inherent in the task because it is not possible to use only one method to generate all the correct answers. Similarly, the open method in real-life tasks, such as task 4a, is also task-inherent because the method is ill-defined: it does not depend on the subjects to make the method open.

Types of method

|  | Closed method | Open method |
|---|---|---|
| Well-defined method | Procedural tasks | Problem-solving tasks; investigative tasks |
| Ill-defined method | N.A. | Real-life tasks |
| Task-inherent | Procedural tasks | Investigative tasks; real-life tasks |
| Subject-dependent | N.A. | Problem-solving tasks |

Closely related to the method of solution is scaffolding: how teachers can structure the method of solution into the task statement to guide students if the task is too complex.

## Task Complexity

Task 6: Problem-Solving Task (Box)

Make an open box using a given vanguard sheet so that it has the biggest possible volume.

This is a problem-solving task since it requires the use of some problem-solving heuristics to solve, and it has a closed goal as there is a specific goal in the task statement. The task also has a closed answer because there is only one correct answer and so it is determinate. Just like task 2a (handshake), task 6 has an open method, but there is a sense that something else is open: the task is too complex and there is no scaffolding in the task statement, so the task is open to the students in the sense that they may not even know how to start.
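One well-defined method that could be taught for task 6 is a systematic search over possible designs. The sketch below assumes a square 20 cm sheet with equal squares cut from the corners and the sides folded up, details which the task statement itself does not fix:

```python
# Assumed design (not specified in task 6): cut equal squares of side `cut`
# from the corners of a 20 cm x 20 cm sheet, then fold up the sides.
def box_volume(cut, width=20.0, height=20.0):
    """Volume of the open box made with corner cuts of side `cut`."""
    return cut * (width - 2 * cut) * (height - 2 * cut)

# One problem-solving heuristic: systematically try many cut sizes.
cuts = [i / 1000 for i in range(1, 10000)]   # 0.001 cm .. 9.999 cm
best_cut = max(cuts, key=box_volume)

print(round(best_cut, 3))  # close to the exact optimum 20/6 = 3.333...
```

Providing this much structure in the task statement would be exactly the kind of scaffolding, discussed below, that can close an otherwise complex task.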

Complexity is a continuum: at both ends, educators may agree that the tasks are either very simple or very complex; the problem always lies in the middle and so there will be grey areas. Moreover, complexity depends on the students: what is complex for primary school students may be simple enough for secondary school students. But educators are more likely to agree that, for the *same* student, task 6 is more open than task 2a because task 6 is more complex. To make task 6 less open, the teacher can rephrase the task statement to include more guidance on how to solve the task. In fact, it is possible to provide enough scaffolding, in terms of the method of solution, to *close* the task because the method is well-defined, i.e. task 6 is not inherently open. In other words, this first type of openness is subject-dependent: it depends on the teacher whether to provide enough scaffolding to close the task.

Task 4b: Paraphrase of Real-Life Task (Playground)

Design a playground for the school. The playground must be 20 m by 20 m and it must contain at least one slide and two swings. The budget is $10,000.

Nevertheless, it is still not possible to provide all the information necessary to solve this task because the answer is ill-defined. For example, students can choose any slide that they like, but how do you provide the size and cost of every slide available in the world? Thus, the students have to do their own research to gather all the necessary information that they need. In fact, it is impossible to decide where the boundary of the information necessary to solve this kind of real-life task lies (Frederiksen, 1984; Simon, 1978), so there may be a limit to how much scaffolding can be given, thus making the task open. Therefore, this second type of openness is task-inherent since it is *inherently* not possible to provide enough scaffolding to close the task. Contrast this with task 3a (powers of 3), an investigative task with an open but well-defined answer: although not much information is given in the task statement, it can still be solved without further information or research. But if a task is simple enough for the students, e.g. task 1a (quadratic) and task 2a, then the task is said to be closed in terms of complexity.

Some authors (e.g. Frederiksen, 1984; Kilpatrick, 1987) referred to tasks 4a and 4b as *ill*-*structured* because the tasks lack any structure. From Simon’s (1978) three criteria of ill-structuredness and his examples of ill-structured tasks which are all real-life tasks, his idea of an ill-structured task is one with an ill-defined method and an ill-defined answer. Therefore, Simon’s idea of ill-structuredness is an inherent property of the task, i.e. it is not possible to provide enough scaffolding to close the task.

Types of task complexity

|  | Task is closed (in terms of complexity) | Task is open (in terms of complexity) |
|---|---|---|
| Task-inherent | Procedural tasks; some problem-solving tasks such as task 2a | Real-life tasks |
| Subject-dependent | N.A. | Investigative tasks; some problem-solving tasks such as task 6 |

## Extension

As stated in the ‘Introduction’, Orton & Frobisher (1996) distinguished between the terms ‘open’ and ‘open-ended’. They classified investigative tasks as ‘open’ because these tasks have an open goal. However, they used the term ‘open-ended’ to describe problem-solving tasks because what are open are not the tasks themselves but the end of the tasks since they can always be extended. For example, students can extend task 2a (handshake) by asking, ‘What if there are 200 participants? Or *n* participants?’ This method of extending a task is to change the given numbers or conditions by asking, ‘What if?’ (Brown & Walter, 2005; Kilpatrick, 1987) Thus, problem-solving tasks are open in the sense that they can be extended. On the other hand, Becker & Shimada (1997) used the term ‘open-ended’ differently to refer to tasks with an open answer because the answer, which is at the ‘end’ of the solution of a task, is open. Therefore, there is a need to be aware of the different meanings of constructs that look similar (e.g. open and open-ended) and different meanings of the same terminology (e.g. open-ended) as used by different people. Hence, it is important to specify which task variable is open.

The purpose for extending problem-solving tasks is to discover more underlying patterns or mathematical structures, and to generalise if possible (Mason et al., 1985). But not all tasks can be or should be extended. For example, after students have solved the quadratic equation in task 1a, finding another quadratic equation to solve is not an extension of task 1a but another *new* task. Similarly, after students have designed a playground in task 4a, designing a swimming pool for the school is not an extension of task 4a but a different task. Therefore, there is a need to distinguish between these two types of situations: a task is open if it can be extended, i.e. ‘extending’ the task would lead to the discovery of more underlying structures; otherwise, a task is closed if it cannot or should not be extended, i.e. ‘extending’ the task would only lead to new unrelated tasks.

There are two types of extension: subject-dependent and task-inherent. The issue with extending problem-solving tasks such as task 2a is that most teachers will usually not expect their students to extend, and most students will usually not extend the tasks themselves, thus the tasks are closed to them. So, this kind of extension is subject-dependent. However, for investigative tasks such as task 3a, students are expected to find patterns not only for powers of 3, but also for powers of other numbers. In other words, students are expected to extend investigative tasks by changing the given numbers or conditions. Therefore, extension is task-inherent in investigative tasks: it does not depend on the subjects whether to extend or not.

Types of extension

|  | Task is closed (in terms of extension) | Task is open (in terms of extension) |
|---|---|---|
| Task-inherent | Procedural tasks; real-life tasks | Investigative tasks |
| Subject-dependent | N.A. | Problem-solving tasks |

## Discussion

The types of tasks discussed in this article are *not* meant to be exhaustive. It is beyond the scope of this article to discuss other types of tasks such as mathematical modelling tasks (Kaiser & Sriraman, 2006), problem-posing tasks (Silver, 1994) and tasks used in problem-based learning (Ronis, 2001). The reader is invited to evaluate the openness of these tasks using the openness framework developed in this article. It is also beyond the scope of this article to discuss whether there are other task variables, e.g. context, which might affect the openness of mathematical tasks.

Framework characterising task openness (with some examples)

| Task variable | Type of openness | Task 1a: Quadratic (procedural) | Task 2a: Handshake (problem-solving) | Task 6: Box (problem-solving) | Task 3a: Investigate powers of 3 (investigative) | Task 3b: Find patterns for powers of 3 (investigative) | Task 4a: Playground (real-life) | Task 5a: Project (real-life) |
|---|---|---|---|---|---|---|---|---|
| Goal | Closed | ✓ | ✓ | ✓ | | ✓ | | |
| | Open; well-defined | | | | ✓ | | | |
| | Open; ill-defined | | | | | | ✓ | ✓ |
| Method | Closed | ✓ | | | | | | |
| | Open; well-defined | | ✓ | ✓ | ✓ | ✓ | | |
| | Open; ill-defined | | | | | | ✓ | ✓ |
| | Open; task-inherent | | | | ✓ | ✓ | ✓ | ✓ |
| | Open; subject-dependent | | ✓ | ✓ | | | | |
| Complexity | Closed | ✓ | ✓ | | | | | |
| | Open; task-inherent | | | | | | ✓ | ✓ |
| | Open; subject-dependent | | | ✓ | ✓ | ✓ | | |
| Answer | Closed | ✓ | ✓ | ✓ | | | | |
| | Open; well-defined | | | | ✓ | ✓ | | |
| | Open; ill-defined | | | | | | ✓ | ✓ |
| Extension | Closed | ✓ | | | | | ✓ | ✓ |
| | Open; task-inherent | | | | ✓ | ✓ | | |
| | Open; subject-dependent | | ✓ | ✓ | | | | |

The purpose of the framework is to establish a common platform for analysing the openness of a mathematical task. If teachers are unclear about how the features of different types of tasks can affect student learning, it will affect how they teach (Doyle, 1983; Frobisher, 1994). For example, a teacher may select open tasks that happen to be open in only one aspect, such as a well-defined open answer (e.g. task 3a: powers of 3) but not an ill-defined open answer (e.g. task 4a: playground), and this will develop in students only a limited range of skills and processes; or the teacher may use the same pedagogy for both closed and open tasks, such as telling students to use a particular method to solve both procedural tasks and investigative tasks, which will defeat the intended purpose of the openness of investigative tasks. But the impact of the openness of a task on student learning is not so clear because many task variables can affect the degree of openness. Thus, there is a need to analyse the effect of each task variable on student learning while keeping the other variables constant. Researchers can examine the relationship between each task variable and the types of thinking processes that the tasks can elicit from students. They can also study the interaction between the task variables and student variables such as age group, achievement level and belief system. This may then help teachers to understand which types of tasks, with which combinations of open task variables, they should design or choose in order to develop in students a wider range of mathematical skills and processes.

Whether the openness of a task is inherent or subject-dependent can also affect how the task is implemented. For example, problem-solving tasks are supposed to be open in terms of method and extension, but if the teachers do not teach their students to find alternative methods of solution or to extend the tasks, then the tasks are closed to the students. To help the students develop other types of processes, the teachers can make the tasks more open by telling their students to discuss other methods of solution or to pose more problems to solve by changing the given conditions. On the other hand, the openness of investigative tasks in terms of method and extension is task-inherent, so there is no need for the teacher to open up the tasks further. However, what is not so clear is whether students will acquire the full range of problem-posing processes if they are only taught to extend problem-solving tasks, or whether they need to be given investigative tasks because the latter may develop in students different kinds of problem-posing processes. For example, students may use the original problem in problem-solving tasks as a springboard to pose more problems to solve or to generalise, but when given an investigative task without any problem in the task statement, the students may not know how to pose any problem to solve or investigate.

Another example is how much scaffolding teachers can build into a task to help lower-ability students. For tasks with an open but ill-defined method, e.g. task 4a (playground), there is a limit to how much the teacher can structure the task statement to include the method of solution or enough information to solve the task, although it is still possible to provide more information to make the task less open for weaker students. On the other hand, tasks with an open but well-defined method, such as task 6 (box), can be closed by structuring the method of solution into the task statement, because this kind of openness is subject-dependent. However, the impact of the different types of scaffolding on student learning is not so clear: scaffolding may help weaker students to solve the task, but it may also cost them the chance to develop other types of processes. For example, researchers can investigate whether providing a method of solution to make complex problem-solving tasks, such as task 6, less open helps students gain more confidence and competence in solving other complex problem-solving tasks, or whether it leaves them stuck when they encounter more open problem-solving tasks with no scaffolding. These are important issues that will inform how and what teachers teach their students.

## Conclusion

'The purpose of a *task* is to initiate *activity* by learners.' (Mason & Johnston-Wilder, 2006, p. 5)

The activity itself is not learning: it is just doing tasks. But it is in the course of this activity that learners encounter mathematical ideas and themes, develop and practise techniques, and use their mathematical powers. The activity provides the basis for learning to take place. (Mason & Johnston-Wilder, 2006, p. 69)

The distinction between a task and an activity is important because the original purpose of a task may be lost during its implementation (Stein et al., 1996). Therefore, there is a need to develop a framework to characterise the openness of a task based on different task variables so that teachers can design or choose appropriate tasks to develop in their students different kinds of mathematical processes. Being aware that the openness of a task may be subject-dependent or task-inherent may also help teachers to decide how best to implement the task so as not to close it. However, the impact of the various task variables that are open on different types of students’ learning is still not very clear, so more research needs to be done in this area in order to inform teaching.

### References

- Becker, J. P. & Shimada, S. (1997). *The open-ended approach: A new proposal for teaching mathematics*. Reston, VA: National Council of Teachers of Mathematics.
- Boaler, J. (1998). Open and closed mathematics: Student experiences and understandings. *Journal for Research in Mathematics Education, 29*, 41–62.
- Brown, S. I. & Walter, M. I. (2005). *The art of problem posing* (3rd ed.). Mahwah, NJ: Erlbaum.
- Cai, J. & Cifarelli, V. (2005). Exploring mathematical exploration: How two college students formulated and solved their own mathematical problems. *Focus on Learning Problems in Mathematics, 27*(3), 43–72.
- Christiansen, B. & Walther, G. (1986). Task and activity. In B. Christiansen, A. G. Howson & M. Otte (Eds.), *Perspectives on mathematics education: Papers submitted by members of the Bacomet Group* (pp. 243–307). Dordrecht, The Netherlands: Reidel.
- Cockcroft, W. H. (1982). *Mathematics counts: Report of the committee of inquiry into the teaching of mathematics in schools under the chairmanship of Dr W H Cockcroft*. London, England: Her Majesty's Stationery Office (HMSO).
- Doyle, W. (1983). Academic work. *Review of Educational Research, 53*, 159–199.
- Ernest, P. (1991). *The philosophy of mathematics education*. London, England: Falmer Press.
- Evans, J. (1987). Investigations: The state of the art. *Mathematics in School, 16*(1), 27–30.
- Frederiksen, N. (1984). Implications of cognitive theory for instruction in problem solving. *Review of Educational Research, 54*, 363–407.
- Frobisher, L. (1994). Problems, investigations and an investigative approach. In A. Orton & G. Wain (Eds.), *Issues in teaching mathematics* (pp. 150–173). London, England: Cassell.
- Henderson, K. B. & Pingry, R. E. (1953). Problem solving in mathematics. In H. F. Fehr (Ed.), *The learning of mathematics: Its theory and practice* (pp. 228–270). Washington, DC: National Council of Teachers of Mathematics.
- Henningsen, M. & Stein, M. K. (1997). Mathematical tasks and student cognition: Classroom-based factors that support and inhibit high-level mathematical thinking and reasoning. *Journal for Research in Mathematics Education, 28*, 524–549.
- Hiebert, J. & Wearne, D. (1993). Instructional tasks, classroom discourse, and students' learning in second-grade arithmetic. *American Educational Research Journal, 30*, 393–425.
- Jaworski, B. (1994). *Investigating mathematics teaching: A constructivist enquiry*. London, England: Falmer Press.
- Kaiser, G. & Sriraman, B. (2006). A global survey of international perspectives on modelling in mathematics education. *Zentralblatt für Didaktik der Mathematik, 38*, 302–310.
- Kilpatrick, J. (1987). Problem formulating: Where do good problems come from? In A. H. Schoenfeld (Ed.), *Cognitive science and mathematics education* (pp. 123–147). Hillsdale, NJ: Erlbaum.
- Klavir, R. & Hershkovitz, S. (2008). Teaching and evaluating 'open-ended' problems. *International Journal for Mathematics Teaching and Learning*. Retrieved from http://www.cimt.plymouth.ac.uk/journal
- Lampert, M. (1990). When the problem is not the question and the solution is not the answer: Mathematical knowing and teaching. *American Educational Research Journal, 27*, 29–63.
- Lerman, S. (1989). Investigations: Where to now? In P. Ernest (Ed.), *Mathematics teaching: The state of the art* (pp. 73–80). London, England: Falmer.
- Lester, F. K., Jr. (1980). Problem solving: Is it a problem? In M. M. Lindquist (Ed.), *Selected issues in mathematics education* (pp. 29–45). Berkeley, CA: McCutchan.
- Lingefjärd, T. & Meier, S. (2010). Teachers as managers of the modelling process. *Mathematics Education Research Journal, 22*, 92–107.
- Mason, J., Burton, L. & Stacey, K. (1985). *Thinking mathematically* (Rev. ed.). Wokingham, England: Addison-Wesley.
- Mason, J. & Johnston-Wilder, S. (2006). *Designing and using mathematical tasks*. St Albans, England: Tarquin Publications.
- Moschkovich, J. N. (2002). Bringing together workplace and academic mathematical practices during classroom assessments. In E. Yackel, M. E. Brenner & J. N. Moschkovich (Eds.), *Everyday and academic mathematics in the classroom* (pp. 93–110). Reston, VA: National Council of Teachers of Mathematics.
- National Council of Teachers of Mathematics (1991). *Professional standards for teaching mathematics*. Reston, VA: National Council of Teachers of Mathematics.
- Orton, A. & Frobisher, L. (1996). *Insights into teaching mathematics*. London, England: Cassell.
- Pirie, S. (1987). *Mathematical investigations in your classroom: A guide for teachers*. Basingstoke, England: Macmillan.
- Reys, R. E., Lindquist, M. M., Lambdin, D. V., Smith, N. L. & Suydam, M. N. (2012). *Helping children learn mathematics* (10th ed.). Hoboken, NJ: Wiley.
- Ronis, D. (2001). *Problem-based learning for math and science: Integrating inquiry and the Internet*. Arlington Heights, IL: SkyLight.
- Schoenfeld, A. H. (1985). *Mathematical problem solving*. Orlando, FL: Academic.
- Schoenfeld, A. H. (1988). When good teaching leads to bad results: The disasters of "well-taught" mathematics courses. *Educational Psychologist, 23*, 145–166.
- Sheffield, L. J., Meissner, H. & Foong, P. Y. (2004). *Developing mathematical creativity in young children.* Paper presented at the Tenth International Congress on Mathematical Education, Copenhagen, Denmark.
- Silver, E. A. (1994). On mathematical problem posing. *For the Learning of Mathematics, 14*(1), 19–28.
- Simon, H. A. (1978). Information-processing theory of human problem solving. In W. K. Estes (Ed.), *Handbook of learning and cognitive processes* (Vol. 5, pp. 271–295). Hillsdale, NJ: Erlbaum.
- Skovsmose, O. (2002). Landscapes of investigation. In L. Haggarty (Ed.), *Teaching mathematics in secondary schools* (pp. 115–128). London, England: Routledge Falmer.
- Stein, M. K., Grover, B. W. & Henningsen, M. (1996). Building student capacity for mathematical thinking and reasoning: An analysis of mathematical tasks used in reform classrooms. *American Educational Research Journal, 33*, 455–488.
- Wolf, A. (1990). Testing investigations. In P. Dowling & R. Noss (Eds.), *Mathematics versus the national curriculum* (pp. 137–153). London, England: Falmer Press.
- Yeo, J. B. W. (2008). Secondary school students investigating mathematics. In M. Goos, R. Brown & K. Makar (Eds.), *Proceedings of the 31st Annual Conference of the Mathematics Education Research Group of Australasia (MERGA): Navigating Currents and Charting Directions* (Vol. 2, pp. 613–619). Brisbane, Australia: MERGA.
- Yeo, J. B. W. & Yeap, B. H. (2010). Characterising the cognitive processes in mathematical investigation. *International Journal for Mathematics Teaching and Learning*. Retrieved from http://www.cimt.plymouth.ac.uk/journal