Introduction

Intellectual curiosity, referring to students’ motivation to comprehend and engage in cognitively demanding tasks, has been identified as a third major predictor of academic performance, next to intelligence and effort (Von Stumm, Hell, & Chamorro-Premuzic 2011). Adults, such as parents and teachers, can play a pivotal role in supporting or inhibiting intellectual curiosity in students (Chak 2002). One strategy to arouse intellectual curiosity is to encourage students to ask Sincere Information Seeking (SIS) questions (Graesser & Wisher 2001). SIS questions are raised by students with the aim of enlarging their knowledge base or resolving cognitive conflicts (Van der Meij 1994). SIS questions express a genuine interest and intrinsic motivation of students to inquire into a topic (Jirout & Klahr 2011).

Although student SIS questioning is deemed important, it is rarely observed in classrooms (Engel & Randall 2009). Many teachers acknowledge the importance of intellectual curiosity, yet in practice they struggle to balance freedom for students’ SIS questioning with curricular pressures (Engel & Randall 2009). In education, therefore, the challenge emerges to build a bridge between the intellectual curiosity and personal interests of students (the student perspective) and the responsibility for coverage of the curriculum and attainment of learning goals (the teacher perspective). Because teachers have a pivotal role in building this bridge, they need support to guide effective student questioning, defined as the degree to which student questions emerging from intellectual curiosity contribute to learning curriculum objectives as set by the teacher, handbook, or national standards.

To support teachers in guiding effective student questioning, a principle-based scenario was developed in a design-based research project (Stokhof et al. 2017a, b). A principle-based scenario provides a sequence of pedagogical activities, which supports teachers in translating design principles into concrete classroom teaching and in making adaptive decisions to accommodate activities to local contexts, needs, and possibilities (cf. Wen et al. 2012; Zhang et al. 2011). The aims of the scenario were to encourage students to generate and investigate SIS questions, to align student questioning with the curriculum objectives, and to support and monitor student learning outcomes. In the scenario, mind mapping was selected as the visual tool that supports teachers to (a) define and visualize curriculum objectives, (b) elicit prior student knowledge, (c) generate and discuss student questions, (d) guide collective knowledge construction, and (e) monitor and evaluate the development of both individual and collective knowledge.

First, the design principles for the scenario were identified in a review study (Stokhof et al. 2017a). Then, a follow-up study was conducted that focused on the development and refinement of a prototype of the scenario for teacher guidance in several iterations of design, implementation, evaluation, and redesign (Stokhof et al. 2017b). The evaluation focused on the relevance, practicality, and process effectiveness of the scenario. The next and final step in the development process of the scenario was to identify the impact of the intervention on student learning outcomes (cf. Nieveen 2009). Therefore, this study explores whether and to what extent students can attain curricular goals by raising and exploring SIS questions when guided by the scenario. In the next section, the design principles of the scenario and the rationale for the use of mind mapping as a tool for measuring learning outcomes are described and explained.

Theoretical Framework

Design Principles for Effective Student Questioning

To design a scenario that could support teachers in guiding effective student questioning, the literature on the role and effects of student questioning was first reviewed in a previous study (Stokhof et al. 2017a). Four design principles emerged that promote and support a classroom culture in which student questioning can effectively occur: (a) define a core curriculum, (b) support question generation, (c) establish a shared responsibility, and (d) visualize collective knowledge construction.

The first design principle claims that teachers should identify a core curriculum for the topic under study. The challenge for teachers is to define a conceptual focus that allows both freedom for students’ intellectual curiosity and structure for aligning their personal questioning to curriculum objectives. Applebee (1996) suggests that a few core concepts could form the basis for such a curriculum. By limiting a curriculum to its core, students have the opportunity to explore and elaborate upon these core concepts. Similarly, Scardamalia (2002) considers Big Ideas to be a conceptual structure which allows for student inquiry. Mitchell et al. (2017) demonstrate that Big Ideas allow teachers to (a) introduce and organize content, (b) connect the topic to student experience, and (c) provide the basis for restructuring existing ideas. Moreover, identifying and discussing a core curriculum in preparation for their lessons is expected to deepen teachers’ domain knowledge and to provide them with a conceptual focus for guiding their students’ questions (cf. Mitchell et al. 2017; Zhang et al. 2009; Zeegers 2002).

The second design principle states that teachers should support question generation by making students aware of their prior knowledge and by encouraging and acknowledging all questions. When teachers guide students to activate, structure, and exchange their prior knowledge, this raises students’ awareness of possible gaps in their knowledge (Van Tassel 2001). This awareness of the “not-yet known” is expected to elicit students’ perplexity and questioning (Graesser & Wisher 2001). Therefore, teachers have a pivotal role in supporting the actual generation of questions (Stokhof et al. 2017a). When teachers value all types and levels of student questions as potential contributions to learning, a classroom culture is established in which more student questions emerge that contribute to exploring the curriculum (Beck 1998).

A third design principle is to establish a sense of shared responsibility for collective knowledge construction. If students only answer their own questions, they most likely will not learn the core curriculum because their questions often focus only on a subtopic and not on the big picture (Keys 1998; Polman & Pea 2001). By sharing questions and answers, students’ learning might go beyond their individual questions because they will collectively explore the whole topic in the classroom. Collective knowledge construction allows students to develop an overview of the key concepts of the topic and allows them to contribute their specific expertise to the benefit of all (e.g., Zhang et al. 2009).

Finally, a fourth design principle that is found to support effective student questioning is to visualize progressive inquiry and the process of collective knowledge development. Research shows that student questioning is not static but can progress gradually from basic fact-seeking questioning to more sophisticated wonderment questioning (Hakkarainen 2003). To support the progressive nature of inquiry, it is essential to make students aware of their learning progress by visualizing how answers raise new questions and how collective knowledge thus gradually evolves (Zhang et al. 2009). A collective visual platform seems effective for visualizing students’ prior knowledge as a starting point for collective and progressive knowledge construction. Visualizing collective knowledge also supports teachers in monitoring and assessing student learning outcomes.

Based on these four design principles, a principle-based scenario was developed to support teachers in guiding student questioning effectively. The scenario was developed in close collaboration with practitioners in a 4-year study consisting of multiple iterations of design, implementation, and evaluation (Stokhof et al. 2017b).

Reasons for Selecting Mind Mapping as the Visual Tool

Mind mapping was selected as the visual tool in the scenario to guide effective student questioning. Mind maps can be defined as radial structures which allow concepts to be visually organized in organically formed, colored branches (Davies 2010). Mind mapping was selected because it was expected to visually support all four design principles of the scenario.

First, mind mapping supports identifying a core curriculum within the topic under study because the structure of a mind map facilitates a hierarchical categorization of domain content into core concepts, subordinate concepts, and details or examples (Brinkmann 2003). Core concepts are placed on the head branches of the mind map because they represent the top level in the hierarchical structure. Subordinate concepts are placed on sub-branches, representing the next level in the conceptual hierarchy. Concepts representing details and examples are placed on subsequent levels in the mind map. The core of the curriculum can thus be identified as the concepts at, or close to, head- and sub-branch level. When teachers construct an expert mind map of a topic, they need to consider which concepts are “core” and could represent the conceptual focus, and which are subordinate concepts, details, or examples that elaborate and illustrate the core curriculum.
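
To make this level structure concrete, the short sketch below represents a mind map as nested dictionaries and extracts the core curriculum as the concepts at head- and sub-branch level. It is an illustration only; the branch labels and the `core_curriculum` helper are hypothetical, not part of the scenario materials.

```python
# Hypothetical sketch: a mind map as nested dicts, one key per branch label.
# Nesting depth corresponds to the level in the radial hierarchy:
# level 1 = head branches (core concepts), level 2 = sub-branches
# (subordinate concepts), level 3 and deeper = details and examples.
expert_mind_map = {
    "Danger": {"Flooding": {"dikes": {}}},
    "Water Cycle": {"Evaporation": {}, "Condensation": {}},
    "Technology": {"Tap": {"water pressure": {}}},
}

def core_curriculum(mind_map, max_level=2):
    """Collect the concepts at head-branch and sub-branch level (the 'core')."""
    def walk(branches, level):
        for concept, children in branches.items():
            if level <= max_level:
                yield concept
            yield from walk(children, level + 1)
    return list(walk(mind_map, 1))

print(core_curriculum(expert_mind_map))
# ['Danger', 'Flooding', 'Water Cycle', 'Evaporation', 'Condensation', 'Technology', 'Tap']
```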

Second, mind mapping supports question generation by visualizing prior knowledge and allowing for divergent questioning. Mind mapping is suitable for visualizing prior knowledge because it supports brainstorming and exchanging information (Shih, Nguyen, Hirano, Redmiles, & Hayes 2009). Divergent questioning can be evoked because multiple key concepts on the head branches visually support the idea that a topic can be explored from multiple perspectives and interests (Eppler 2006). Moreover, when questions are linked to the concepts they address, the mind map structure visualizes the variety of questions both in level and in interest. The hierarchical structure of a mind map visualizes whether a question is fundamental, addressing a key concept on a head branch, or very specific, exploring a minor detail on a sub-sub-branch. Therefore, the structure of a mind map visually supports teachers and students in raising a variety of questions and in valuing the potential contribution of a question to learning the curriculum.

Third, mind mapping supports a sense of shared responsibility by visualizing students’ collective knowledge development (Zhang et al. 2009). All student questions can be visualized in a collective mind map, either as a branch or as a hypertext note in a digital mind map (Tergan 2005). Student answers can be integrated in the mind map by adding new information on branches and by restructuring branches. Collective knowledge construction becomes visible when students collaborate to construct a classroom mind map, which is elaborated in size and refined in structure. By collaborating on this collective visual platform, students become aware that each and every question contributes to a collective result (e.g., Zhang et al. 2009).

Finally, mind maps can visualize whether and to what extent student questioning has been effective and the core curriculum has been attained. Teachers can use mind maps to monitor and assess individual knowledge development because constructing a mind map requires both recall of concepts and spatial organization of students’ knowledge about the topic in a visual structure (D’Antoni et al. 2009).

In the process of selecting the most appropriate visual tool for the scenario, concept maps had also been considered because they have many features as a visual tool that support collective knowledge construction. However, mind mapping was found to be more suitable than concept mapping for several reasons. First, mind maps were found to be more accessible for the target group (children aged 8 to 12 years) because the procedures of concept mapping are relatively complex for novice learners compared to the procedures for constructing mind maps (Eppler 2006; Merchie & Van Keer 2012). Second, although concept maps allow teachers to visualize a wide range of relations of different natures, the needs of novice learners, who are just starting to mobilize their prior knowledge of the topic under study, seem well supported by the associative and structuring relations that can be visualized in mind maps (Eppler 2006; Wetzels et al. 2011). Third, although Davies (2010) suggested that mind maps are idiosyncratic and hard to understand for outsiders, Shih et al. (2009) showed that the collective use of mind maps was effective for sharing and extending knowledge. Fourth, because of the expected cognitive load of constructing concept maps for primary school children, mind maps were expected to be more valid for guiding and assessing their emerging knowledge structures.

Measuring Curricular Objectives in Mind Maps

Having developed a principle-based scenario for teacher guidance in a previous study (Stokhof et al. 2017b), the next step in the design-based research project was to identify the impact of the scenario on the learning outcomes for students. The assumption was that the scenario supports effective student questioning. More specifically, the expectation was that by investigating self-raised questions, exchanging answers, and constructing collective knowledge, students would attain and elaborate upon the core curriculum. The curricular objectives of the scenario were thus that students would (a) assimilate and accommodate a core curriculum as a conceptual framework of understanding, (b) assimilate and accommodate new knowledge generated by all classroom questions in this conceptual framework of understanding, and (c) refine the structure of their conceptual framework of understanding as an indicator of developing expertise (Chi 2006).

To determine if students attained curricular objectives, three indicators of quality in the student mind maps were operationalized: similarity to the core curriculum, elaboration of the core curriculum, and quality of structure. First, an expert mind map represents the conceptual framework of the intended core curriculum which teachers had in mind during preparation and as such is the point of reference for assessment. Students learn this core curriculum by exploring their prior knowledge and raising SIS questions about it. When student mind maps are compared to an expert mind map, students’ recall of the core curriculum can be assessed by counting the number of similar words (McClure et al. 1999). Second, teachers also intended students to elaborate upon the core curriculum but chose not to define how this elaboration was supposed to take shape. Teachers rather chose to allow freedom for students to explore and extend the core curriculum by means of their questions. Therefore, added words, which represent new knowledge generated by student questions, should also be taken into account as learning outcomes. Added words that were related to the topic at hand and logically placed in the conceptual structure of the mind map were considered to represent the elaboration of the core curriculum. Third, the degree of hierarchy in the mind maps was expected to represent the ability of students to (re)organize existing and new knowledge in several layers (Chi 2006). Therefore, the degree to which students were able to use multiple levels to structure their knowledge in the mind maps was considered to indicate their degree of mastery of the conceptual structure of the topic.

The hypothesis in this study was that students would gradually internalize the conceptual structure of the core curriculum and could use it to assimilate and accommodate new knowledge, acquired through student questioning, in their personal knowledge schemes. Collectively raising, exchanging, and discussing questions and answers, and visualizing this process of collective knowledge construction in a classroom mind map, were expected to help students learn the whole of the curriculum. Therefore, the following research question was formulated: To what degree do students attain curricular objectives, operationalized as (1) learning a core curriculum, (2) elaborating on this core curriculum, and (3) refining the conceptual structure of their knowledge, when teachers guide student questioning by means of a mind map supported scenario?

Method

This study is part of a 4-year design-based research project. Design-based research aims to develop a practical solution for a practitioners’ problem, as well as theoretical understanding of the effectiveness of the design principles (Design-Based Research Collective 2003). In design-based research, solutions are often developed in a series of studies (Schoenfeld & Conner 2009). First, design principles are identified in a review study. Then, a prototype is developed in multiple cycles of design, implementation, evaluation, and redesign, in close collaboration with practitioners (McKenney & Reeves 2012). The focus of evaluation in these cycles is the perceived relevance, practicality, and effectiveness from the perspective of the practitioners (Nieveen 2009). In other words, the effectiveness of the intervention for the experienced curriculum is evaluated (Van den Akker 2003). Finally, when practitioners perceive the prototype as sufficiently effective, its realized effects on learning outcomes for students can be assessed. Then, the effectiveness of the intervention for the attained curriculum is evaluated (cf. Van den Akker 2003).

Having studied the effectiveness on the experienced curriculum in a previous study (Stokhof et al. 2017b), the focus in this study was on collecting the first evidence of effectiveness in terms of student learning outcomes. Therefore, this study was set up as a single-group pre-posttest design, because its aim was to find out whether guidance of student questioning by means of mind mapping would support this group of students in attaining the curricular objectives. Comparison to non-treatment groups was still beyond the scope of the objectives at this stage of the development of the scenario.

The use of mind mapping to test students’ knowledge of a curriculum is a relatively new approach, and only a few studies have explored mind maps as assessment instruments (e.g., D’Antoni et al. 2009). Therefore, to triangulate results, multiple choice knowledge tests about the same curriculum topics were developed in close collaboration with the participating teachers of each school. For each of the key concepts on the head branch level of the expert mind map, two to three items for the questionnaire were formulated. Similar but different items were constructed for pre- and posttests. For example, in the knowledge test about Water, two items were related to the key concept Danger, addressing the sub-concept Flooding. In the pretest, item A asked: “What factor is most likely to cause flooding in the Netherlands?” In the posttest, the corresponding item B was included: “Which part of the Netherlands would be flooded if the dikes broke?” In the development of the pre- and posttests, teachers were consulted on whether the items addressed their intended curriculum. On the basis of their feedback, several items were dismissed or reformulated, while some other items were added. The knowledge tests about Water are included as an example in Appendix A (Table 8).

To check for possible distorting effects in the findings from differences in pupils’ grade or gender, these covariates were taken into account in the analysis. To check for adherence to the design principles of the scenario and to control for potential differences between the various cases, video recordings of classroom activity and student products were collected.

Population

In total, 276 students, aged between 8 and 12 years, participated. All students came from two primary schools in a suburban area in the Netherlands. Both schools were strongly committed to question-driven learning, and the teachers had previously volunteered to participate in the development and trial of the scenario. Students were thus a non-random sample with previous experience with the scenario.

Students were distributed over 10 classrooms, and each classroom was treated as a separate case. In school A, cases 1 and 2 consisted of combined grades 5 and 6. In school B, cases 3–10 were all combined grades 4–6. Students were evenly distributed over grades: 30.2% in grade 4, 37.1% in grade 5, and 32.8% in grade 6. The percentage of special care students was below the national average of 9% in each class. All students were moderately skilled mind mappers, being acquainted with the basic mind map rules and having applied them in at least one or two previous projects. In total, 231 students, 117 boys and 114 girls, completed all four tests, and only their data were used for analysis.

Treatment

In each school, groups of teachers collaboratively designed an expert mind map for their science projects in a preparation session. School A chose the topic The Solar System, and school B chose the topic Water. Teachers first prepared a mind map individually, before discussing with their colleagues which key concepts and subordinate concepts should be in their expert mind map. The collectively designed expert mind map was considered to represent the core curriculum for the chosen topic (Fig. 1).

Fig. 1 Example expert mind map “water”

In each school, students worked for 6 weeks on their projects. Teachers organized introductions to the projects with the aim of raising students’ interest and activating their prior knowledge about the various key concepts. For example, to raise interest in the key concept Water Cycle in the Water project, students conducted a small experiment with steam to make them aware of the processes of evaporation and condensation. The activated prior knowledge on the various key concepts was shared in a classroom discussion and in subsequent small group work and then visualized in an initial classroom mind map in each case (Fig. 2).

Fig. 2 Examples of initial and final classroom mind maps (CMM)

Having explored and recorded their collective prior knowledge, students were invited to raise questions. Teachers supported question generation by organizing small-group question brainstorms according to the Question Formulation Technique (Rothstein & Santana 2011). In these brainstorms, students used the classroom mind map as question focus to generate as many questions about the topic as possible, regardless of quality or formulation. By welcoming all questions in this phase, emerging student questioning was fostered and students were encouraged to explore their own wonderings and extend each other’s ideas (Stokhof et al. 2017b). The output of this phase was a large repository of initial student questions. Students raised a wide variety of questions about every key concept in the classroom mind map. For example, the question “How can water rise if you open the tap?” was related to the key concept Technology, and the question “How can salt water become sweet?” was related to the key concept Water Cycle. More example questions and their relation to the key concepts can be found in Appendix B Table 9.

In the next phase of formulating questions, teachers had an active role in discussing these initial questions with students. First, teachers discussed with students: “What is the relevance of the question to the topic? To which (key) concepts are the questions linked in the classroom mind map?” The identified links between questions and the classroom mind map were then visualized on the interactive whiteboard. Next, each question’s potential for learning was discussed: “Does the question’s formulation match the questioner’s intention? If not, how could it be rephrased more accurately? What kind of knowledge will this question possibly produce? Is the answer already known?” Then, possible strategies to investigate questions were discussed: “What kind of resources or actions are needed to find or construct an answer? Might a slightly different formulation make the question more feasible for investigation?” The teacher modeled this with a few examples of student questions. Then, students discussed the other questions in small groups, reformulating them when deemed appropriate. Finally, teachers and students discussed which questions were most interesting for investigation. When teachers and students had prioritized a selection of the most interesting questions, students could adopt any of those questions to their liking.

Subsequently, students investigated their questions in dyads or individually, using the internet and books, interviewing experts, or conducting small experiments. In all cases, students were expected to present the answers to their questions in short (2 to 5 min) presentations to the whole class. During these presentations, teachers supported students in relating their new information to the (key) concepts in the classroom mind map and discussed in class how the answer contributed to the collective knowledge. In most cases, teachers or students also discussed how to visualize the new information by adding new concepts to the classroom mind map. For example, the answer to the question “How can salt water become sweet?” was visualized in the classroom mind map by adding “salt to sweet” and “vapor” (Appendix B Table 9). By adding new concepts, the classroom mind map gradually expanded during the project (Fig. 2).

Data Collection

Teachers’ expert mind maps, classroom mind maps, students’ individual mind maps, and multiple choice knowledge tests were collected as primary data to measure the different stages of knowledge construction and individual learning outcomes. The expert mind maps were considered to measure the conceptual structure of the intended curriculum as perceived and constructed by the teachers. The classroom mind maps were considered to measure the collective knowledge construction starting from students’ prior knowledge to subsequent stages of added questions and answers. The individual mind maps were considered to measure the degree to which students can recollect, understand, and visually represent the conceptual structure of a subject under study.

The individual mind maps were constructed on empty, A3-size, landscape sheets of paper with colored markers and/or pencils. In addition, a multiple choice test, consisting of 18 items distributed over the various key concepts from the expert mind maps, was developed by the researchers to measure the individual learning outcomes in a way complementary to the mind maps. Participating teachers at each school were consulted before the knowledge tests were administered, to ensure the test items would address relevant topics within the intended curriculum, resulting in minor adjustments. Next to the primary data, video recordings were made of the classroom sessions and informal interviews were held with teachers to control for fidelity of implementation. Furthermore, the students’ question worksheets were collected to get an overview of how many questions were raised by the students and which topics were addressed. To comply with the ethical standards for the collection of video materials, all recordings were made only after informed consent of the participants; the data were securely stored in a protected location and were used only by the researchers for analysis.

To test individual learning outcomes, students made a pretest mind map and knowledge test just after the initial introduction and a posttest just after finishing the project. The pre- and posttest mind maps were made under similar conditions, and students were allowed 30 min to complete them. Pre- and post-knowledge tests were administered after the mind map tests, not in the same session but on the next day, to minimize test interference.

Analysis

Because the scenario had a principle-based character, it offered teachers the opportunity to adjust the scenario to specific classroom contexts and needs. Therefore, as a first step in the analysis, implementation fidelity was established (Mowbray et al. 2003). By checking video recordings and the product collection, it was determined whether teachers had adhered to the design principles: (a) constructing an expert mind map as representation of the intended core curriculum, (b) evoking and recording student prior knowledge about this core curriculum in a classroom mind map, (c) eliciting student questioning and aligning it to the classroom mind map, (d) encouraging students to investigate their questions and share their answers in class to build collective knowledge with the classroom mind map as a collective point of reference, and (e) evaluating knowledge development in pre- and posttest mind maps.

The video observations and product collection confirmed that all cases adhered to the design principles of the scenario. The only remarkable difference between cases was that cases 2, 6, 7, and 9 did not elaborate the classroom mind map with new concepts and did not visualize the progress of the collective knowledge construction. However, the videos showed that the underlying design principle of using the classroom mind map as a point of reference when sharing answers was adhered to.

To score the individual and classroom mind maps, an analysis instrument was developed. Mind maps were scored on three aspects: similarity to the core curriculum, elaboration, and quality of structure. First, similarity to the core curriculum was determined by counting the number of words which were identical or synonymous to words in the expert mind maps. Degree of similarity was assessed at three levels: head branch level, sub-branch level, and subsequent levels beyond sub-level, termed sub-sub-branches. Elaboration of the core curriculum was assessed by counting added words at these three levels. An extra check was made to ensure those words were relevant to the topic and logically placed in the mind map. Quality of structure in the mind maps was analyzed by identifying four levels of conceptual categorization: on the first level, words are merely loosely associated with the key concept on the head branch; on the second level, words are hierarchically structured on two consecutive branches; on the third level, words are hierarchically structured on three consecutive branches; and on the fourth level, words are organized on four (or more) hierarchically structured branches (Fig. 3).

Fig. 3 Levels of hierarchical structure in mind map
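
To make the scoring procedure concrete, the sketch below reimplements the three aspects in simplified form. It is an illustration under simplifying assumptions, not the raters’ actual instrument: synonym matching is reduced to exact word matching, the relevance check is omitted, and the mapping of branch depth to the four structure levels is one plausible reading of the instrument.

```python
# Simplified sketch of the mind map scoring instrument (hypothetical data).
# Similarity  = words shared with the expert mind map, counted per level.
# Elaboration = words added beyond the expert mind map, counted per level.
# Structure   = mean number of consecutive branch levels per head branch.

def words_per_level(branches, level=1, out=None):
    """Flatten a nested-dict mind map into {level: set_of_words}."""
    if out is None:
        out = {}
    for word, children in branches.items():
        out.setdefault(level, set()).add(word)
        words_per_level(children, level + 1, out)
    return out

def similarity_and_elaboration(student, expert):
    s_words, e_words = words_per_level(student), words_per_level(expert)
    similarity = {lvl: len(ws & e_words.get(lvl, set())) for lvl, ws in s_words.items()}
    elaboration = {lvl: len(ws - e_words.get(lvl, set())) for lvl, ws in s_words.items()}
    return similarity, elaboration

def mean_structure_level(mind_map):
    """Assumed mapping: a head branch's level is the number of consecutive
    branch levels below it (1 = loose associations, capped at 4); the mind
    map score is the mean over all head branches."""
    def depth(children):
        return 0 if not children else 1 + max(depth(c) for c in children.values())
    levels = [min(depth(children), 4) for children in mind_map.values()]
    return sum(levels) / len(levels)

expert = {"danger": {"flooding": {}}, "water cycle": {"evaporation": {}}}
student = {"danger": {"flooding": {"dikes": {"sandbags": {}}}},
           "water cycle": {"evaporation": {}}}

print(similarity_and_elaboration(student, expert))
# ({1: 2, 2: 2, 3: 0, 4: 0}, {1: 0, 2: 0, 3: 1, 4: 1})
print(mean_structure_level(student))  # (3 + 1) / 2 = 2.0
```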

First, two raters who were not part of the research team were trained to use the mind map analysis instrument. Then, the two external raters scored all mind maps. An interrater reliability analysis using the kappa statistic was performed on 20% of the data to determine the consistency of the mind map scoring instrument between the two raters. An average score of κ = .88 was calculated for all indicators in the instrument, indicating strong agreement between raters. Dependent paired-samples t tests were used to compare means of similar concepts at various levels in the individual mind maps. Because multiple gain scores were compared, the Bonferroni correction was applied to prevent Type I errors, lowering the significance level from p = .05 to p = .01. A linear regression analysis in SPSS was run to control for any distorting effects of the covariates “gender,” “grade,” and “case” on the difference between pre- and posttests, but no significant effects were found.
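
These reliability and comparison analyses can be reproduced with standard statistical libraries. The sketch below shows the equivalent steps in Python with SciPy and scikit-learn rather than SPSS; all data arrays are hypothetical placeholders, and the five-comparison Bonferroni divisor is an assumption inferred from the reported .05 to .01 adjustment.

```python
# Hypothetical sketch of the reported analyses (placeholder data, not study data).
import numpy as np
from scipy import stats
from sklearn.metrics import cohen_kappa_score

# Interrater reliability on the double-scored 20% sample.
rater1 = np.array([2, 3, 3, 1, 4, 2, 3, 1])
rater2 = np.array([2, 3, 2, 1, 4, 2, 3, 1])
kappa = cohen_kappa_score(rater1, rater2)

# Dependent (paired-samples) t test on one mind map indicator, pre vs. post.
pre = np.array([4, 6, 5, 7, 3, 5])
post = np.array([6, 8, 5, 9, 6, 7])
t, p = stats.ttest_rel(post, pre)

# Bonferroni correction: .05 / 5 = .01 (assuming five comparisons).
alpha = 0.05 / 5
print(f"kappa = {kappa:.2f}, t = {t:.2f}, p = {p:.4f}, significant: {p < alpha}")
```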

To analyze the quality of structure in the individual mind maps, first an average score over all branches for each mind map was calculated, before comparing means using a dependent paired-samples t test in SPSS. Similarly, the scores on the knowledge tests were compared using a dependent paired-samples t test. Because the observed variance in the t test findings was high, an additional analysis was conducted to determine which percentage of the students (a) improved between pre- and posttest, (b) remained the same, or (c) regressed between pre- and posttest. This was analyzed for the sum of all concepts, for similar words, for added words, and for quality of structure.
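
A minimal sketch of this follow-up analysis, with hypothetical gain scores: each student’s pre-to-post difference is classified as progression, status quo, or regression, and the percentages are reported.

```python
import numpy as np

# Hypothetical pre/post sums of all concepts for a handful of students.
pre = np.array([10, 12, 9, 15, 11, 8, 14, 10])
post = np.array([14, 12, 7, 18, 13, 10, 17, 12])
diff = post - pre

print(f"progressed: {np.mean(diff > 0):.0%}, "
      f"status quo: {np.mean(diff == 0):.0%}, "
      f"regressed: {np.mean(diff < 0):.0%}")
# progressed: 75%, status quo: 12%, regressed: 12%
```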

To identify which factors in guidance might have contributed to progress in student learning outcomes, both the questions and the classroom mind maps were analyzed. Starting from the assumption that all types of questions contribute to learning (design principle 2), the analysis of students’ SIS questions in the worksheets focused on how many questions were raised and whether they addressed the core or the elaborated curriculum. To determine if and how the number and focus of questions affected individual and collective learning outcomes, a multiple linear regression analysis was run in SPSS™ (version 23).
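
The same analysis can be sketched outside SPSS; below, statsmodels fits an ordinary least squares regression of mind map gains on the question variables. Variable names and values are hypothetical placeholders for the case-level data, not the study’s dataset.

```python
# Hypothetical sketch of the multiple linear regression (placeholder data).
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "core_q": [3, 5, 2, 4, 6, 3, 5, 2],         # questions addressing core concepts
    "elab_q": [12, 18, 9, 15, 20, 11, 16, 10],  # questions elaborating the curriculum
    "missing_q": [2, 0, 4, 1, 0, 3, 1, 2],      # worksheets that could not be retrieved
    "mm_gain": [5.2, 7.1, 3.0, 6.4, 8.0, 4.1, 6.8, 3.9],  # pre-post mind map gain
})

X = sm.add_constant(df[["core_q", "elab_q", "missing_q"]])
model = sm.OLS(df["mm_gain"], X).fit()
print(model.summary())  # coefficients, R-squared, p values, Durbin-Watson statistic
```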

To analyze the development of collective knowledge, the mind map scoring instrument was used to score both the initial and final versions of the classroom mind maps for similarity, elaboration, and quality of structure. Outcomes for each classroom mind map for each category were listed in a table together with the development of student mind maps for comparison. Multiple linear regression analysis was run in SPSS to determine if and how expanding the classroom mind maps affected development in student mind maps.

Results

What was the mean effect of the scenario as support for students to attain the curricular objectives?

When comparing the individual pre- and posttest mind maps, Table 1 shows that the mean of all similar words increased significantly. When zooming in on the distribution of similar words over the various levels, a large effect is observed at head branch level and medium effects are found at sub- and sub-sub-branch level. Students tended to use more similar words at all levels, but especially at head branch level.

Table 1 Similarity to core curriculum

Furthermore, what was the mean effect of the scenario on elaboration of the core curriculum? Table 2 shows that the mean of all added words, referring to those words which elaborate upon the core curriculum, increased significantly. When zooming in on the three levels in the mind maps, a significant decrease of added concepts at head branch level was observed, indicating that students tended to use more key concepts from the core curriculum. The increase of added words at sub-branch level bordered on non-significance, but elaboration at sub-sub-level was both significant and large in effect size. Students thus seem to have elaborated their mind maps primarily by adding words at the levels that represent details and examples of the key concepts from the core curriculum.

Table 2 Elaboration of core curriculum

When analyzing the structure of the individual mind maps, Table 3 shows that the mean level of hierarchy increased significantly in the posttest. A large effect size was observed, which indicates that students were able to organize their mind maps into more hierarchical structures.

Table 3 Level of hierarchy in individual mind maps

Because a relatively high standard deviation was found in the t test, an additional analysis was conducted to determine which percentage of the students progressed, regressed, or retained a status quo between pre- and posttests. Table 4 shows that approximately 80% of all students succeeded in making progress on all four major variables; however, a substantial percentage of 15 to 18% regressed between pre- and posttests.

Table 4 Overview on student progression, regression, or status quo on major mind map variables

Were the findings, as measured by the mind maps, repeated when a multiple choice knowledge test was used? In both schools, knowledge tests were administered addressing either the topic “Solar System” for school A or “Water” for school B. For school B (N = 195), a significant moderate effect size was observed (Table 5). However, for school A (N = 38), no significant effects could be reported.

Table 5 Multiple choice knowledge test

Having observed significant development in students’ individual mind maps in the number of core concepts, the detailed elaborations, and the increased conceptual structure, the collective classroom mind maps and student SIS questions were analyzed to see whether the significant knowledge gain could be correlated to the number and focus of questions and/or the collective use of the mind maps as supported by the scenario. Table 6 shows a summary of findings for each case.

Table 6 Overview on SIS questions, classroom mind maps, and student mind maps for each case

As can be observed in Table 6, the number of questions varied considerably between cases, ranging from 10 to 25 questions. However, in many cases, not all worksheets could be retrieved. Whether students did not use the worksheets or lost them somewhere in the process could not be determined. In all cases but one, the number of elaboration questions significantly exceeded the number of questions about core concepts.

The findings in Table 6 show that the classroom mind map was expanded in only six cases. In the interviews, teachers mentioned various explanations for not expanding the mind map. Two teachers (cases 2 and 9) indicated they had not been sufficiently aware that they could have expanded the mind map to visualize the growing collective knowledge. Other teachers felt either time-pressured (case 6) or had instructed their students to make personal notes instead of expanding the classroom mind map (case 7). In the cases where the classroom mind map was expanded, a significant increase in elaboration of the core as well as an enhanced quality of structure was observed, next to a slight increase in similarity. These findings suggest that expanding the classroom mind maps resulted primarily in elaboration of the core curriculum and refinement of its structure.

Multiple regression analysis showed that the question variables (core, elaboration, total number, or missing) did not significantly affect the development of the student mind maps. Therefore, we conclude that the focus or number of questions did not seem to have a direct effect on progress in student learning outcomes. This implies that factors other than the student SIS questions may have influenced students’ ability to construct their mind maps.

The effects of the question variables on the development of the classroom mind maps could not be calculated in a multiple regression analysis because the assumption of independent errors (Durbin-Watson test) was not met. However, a two-tailed Pearson’s correlation analysis of the cases in which the mind maps were expanded (n = 6) produced some interesting findings (Table 7).
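
A minimal sketch of this correlation analysis, assuming hypothetical case-level values for the six expanded cases; `scipy.stats.pearsonr` returns the two-tailed p value by default.

```python
# Hypothetical sketch: two-tailed Pearson correlation over the six expanded cases.
from scipy import stats

elab_questions = [14, 9, 18, 11, 16, 12]  # elaborating questions per case (n = 6)
cmm_core_gain = [2, 6, 1, 5, 2, 4]        # growth of core concepts in the classroom mind map

r, p = stats.pearsonr(elab_questions, cmm_core_gain)
print(f"r = {r:.2f}, p = {p:.3f}")        # a negative r would mirror the pattern in Table 7
```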

Table 7 Correlations between question variables and classroom mind maps

The only significant (and negative) correlation with the development of the core concepts in the classroom mind map was with the “elaborating” questions. This might be explained by the relatively large number of elaborating questions, which might have diverted students’ attention away from learning the core concepts. By contrast, the elaborated curriculum in the classroom mind maps was strongly correlated to all question variables. As expected, when questions were asked, this correlated positively, and when questions were missing, negative correlations were observed. More surprising was the finding that both “core” and “elaborating” questions were strongly correlated to the elaborated curriculum. This suggests that core questions also supported the exchange and learning of new concepts. Finally, the quality of structure of the classroom mind map seemed to be positively dependent on the number of elaborating questions and negatively influenced by core questions. Again, this finding might be explained by the relatively high number of elaborating questions in those cases where the quality of structure increased significantly (for example, cases 3 and 8). We conclude that asking and exchanging SIS questions was in general positively correlated to building and visualizing collective knowledge.

The effects of the classroom mind map on the development of the student mind maps were analyzed using multiple regression. The only significant variable that contributed to the progress of all words in the student mind maps was the increase of elaborated concepts in the classroom mind map (R2 = .045, β = .207, p = .029). In other words, the overall development of student mind maps was enhanced when new concepts beyond the core curriculum were added to the classroom mind maps. Significant variables for students’ progress in similarity to the core curriculum were both the increase of core concepts (R2 = .171, β = .181, p = .007) and the increase of elaborated concepts (R2 = .171, β = .279, p = .000). This finding suggests that expanding the classroom mind map with elaborated concepts has a larger effect on learning the core curriculum than expanding it with core concepts. Remarkably, none of the classroom mind map variables had any significant effect on the development of the elaborated curriculum in the student mind maps. Apparently, students were able to elaborate their mind maps beyond the core curriculum, regardless of whether the classroom mind map had been expanded. Finally, the quality of structure in the student mind maps was positively correlated to the increase of elaborated concepts in the classroom mind map (R2 = .079, β = .399, p = .000) but negatively influenced by the quality of structure in the classroom mind map (R2 = .079, β = − .199, p = .028). This somewhat unexpected finding suggests that it is not the quality of structure of the classroom mind map but the learning of new concepts that supports students in refining their knowledge structures.

Discussion

The aim of this study was to establish if and to what degree students were able to learn a core curriculum when supported by a scenario to guide effective student questioning. To measure student learning outcomes, both individual student mind maps and classroom mind maps were collected. Mind map tests were triangulated with multiple choice knowledge tests.

Findings on individual learning outcomes showed an increase in similarity and a decrease in elaboration in student mind maps at head branch level, which indicates that students tended to adhere more to the conceptual structure of the core curriculum as represented in the expert mind map. Elaboration was found especially at sub-sub-level, which means students were able to add more details, examples, and associations to the core curriculum. Also, a significantly higher level of knowledge organization in the mind maps was found. This might be interpreted as a development from novice to expert knowledge about the topic (Chi 2006). However, not all students progressed when comparing pre- and posttest mind maps: approximately 20% of the students regressed or remained in a status quo.

Another indicator of individual student knowledge advancement was the moderate increase measured by the knowledge test. However, this was significant only for school B. One possible explanation for the non-significance of the knowledge test in school A is the stage of the curriculum which was measured. The knowledge tests were developed prior to work on the projects in the classrooms. Therefore, the tests were based on the intended curriculum, measuring curriculum content which teachers were expecting to teach (cf. Van den Akker 2003). The mind map pre- and posttests were part of the operational curriculum, thus measuring aspects of the curriculum which were actually investigated, shared, and discussed in class. The development of the classroom mind maps showed that teachers chose to follow an emergent operational curriculum, in which the key concepts were given meaning by students’ questions and answers. In this process, teachers allowed the curriculum to develop somewhat differently in the classroom than originally conceived. The knowledge test in school A therefore seems to have been less aligned to the operational curriculum than previously conceived and thus did not accurately measure what students were actually learning.

Findings on collective learning outcomes showed some remarkable differences between cases. The results show that all teachers were able to use mind mapping as a collective platform for linking student questions to the core curriculum. However, not all teachers were aware, able, or willing to visualize the development of collective knowledge. In each case where the classroom mind map was expanded, a large number of new concepts were added to the conceptual structure of the core curriculum. This shows that classroom mind maps can be useful platforms to exchange and visualize new knowledge. When comparing the expanded classroom mind maps between cases, however, differences became apparent in the degree of similarity to the core curriculum and the quality of structure.

To explain the observed differences in individual and collective learning outcomes, student SIS questions were analyzed for number and focus in each case. No significant effects of SIS questions on development in student mind maps were found. However, SIS questioning was significantly correlated with the development of the expanded classroom mind maps. These findings suggest that inquiring into a single personal question will not be sufficient for students to learn the curriculum. Exchanging and discussing questions and answers, however, does contribute to building collective knowledge. This finding is congruent with the work of, for example, Brown and Campione (1994) and Scardamalia and Bereiter (2006).

The effects of the classroom mind maps on progress in student mind maps were also analyzed. The findings show that expanding the classroom mind map with core and elaborated concepts supported students in learning the core curriculum and refining their knowledge structures. We conclude that visualizing collective knowledge supports individual learning outcomes.

To correctly interpret our findings, we would like to point out some limitations of the sample. First, we did not have a randomly selected sample but a homogeneous group of students, who all had some previous experience with the scenario. Second, the participating classes were taught by motivated teachers who had contributed to the development of the scenario. Therefore, comparisons to non-experienced teachers or classrooms cannot be made at this point in time. An implementation study testing the robustness of the scenario in new contexts could contribute to a broader understanding of the effects of the scenario in the future. Finally, although many efforts were undertaken to optimize data collection, 17% of the data was incomplete because of student absence during the pre- or posttest. Data from these students could not be used for analysis, which might have influenced our findings.

Two major practical implications for teacher guidance seem to emerge from these findings. First, the findings show that a variety of questions contributed to collective knowledge building. This is congruent with the findings of Khanlari et al. (2017), who found no significant difference between the positive effects of fact-seeking and of exploratory questions on knowledge building. At the same time, the results demonstrate that the ratio of “missing questions” had strong negative effects on knowledge construction. This suggests that teachers should focus on involving all students in questioning rather than putting much effort into the formulation of the right type of questions. Furthermore, those students who were engaged in answering questions were more likely to learn the curriculum. This might be explained by the observation that the student questions raised in class seemed to motivate students to learn more about the topic and made the learning of new content more meaningful because of the connections to their own inquiries (cf. Hume 2001; Keys 1998; Van Tassel 2001). Therefore, the findings suggest that teachers should not only encourage students to collectively raise questions but also make sure that all students are engaged in answering those questions. A second implication concerns the exchange of answers. It seems beneficial for students’ individual learning outcomes to discuss and visualize the construction of collective knowledge, especially when teachers relate students’ answers to the core concepts of the topic under study. This implies that teachers might not only need to discuss answers with their students but also need to visualize the relations of these answers to the core curriculum.

A theoretical contribution of this study is the finding that visualizing a core curriculum in a mind map supports teachers in balancing freedom for student questioning with attainment of curricular objectives. Visualizing core concepts (or Big Ideas) in a mind map supports teachers to share intellectual control with their students (cf. Mitchell et al. 2017). Providing a conceptual focus by means of a core curriculum, as suggested by Applebee (1996), is operationalized in the scenario as teachers constructing an expert mind map. This focus allows teachers not only to identify the most central curriculum content but also to construct a conceptual framework that is generative, connecting various concepts with student experiences (Perkins 1992). When students’ prior knowledge is visualized in a classroom mind map, students can be prompted to raise relevant SIS questions about the core curriculum. Finding answers to these SIS questions can extend and direct development of an emergent curriculum which evolves from this core curriculum (Scardamalia 2002).

Another contribution to theory might be the development of the mind map research methodology. Originally, mind mapping was primarily intended to support teachers in guiding effective student questioning. In this study, however, a mind map research methodology was also developed to evaluate individual and collective learning outcomes. The mind map analysis instrument supported the researchers in measuring students’ knowledge construction in an emergent curriculum on several aspects and in comparing this to a core curriculum. Of course, mind mapping, like every research strategy, also has its limitations. Mind mapping requires students to recall, organize, and visualize their cognitive structures, and although mind mapping seemed suitable for the target group as a visual tool, for some students the cognitive load it imposes still seemed to exceed their capacity. This may have limited the validity of some of the findings. However, we consider mind map analysis to be a useful addition to the existing research instruments, since measuring an emergent curriculum with a more traditional multiple choice instrument proved problematic.

Finally, we would like to point out some future challenges for guiding effective student questioning. First, the results suggest that teachers need time and practice to learn to identify the key concepts of the core curriculum. Classroom mind maps showed that teachers only partly covered the core curriculum beyond head branch level. In interviews, teachers explained that, in retrospect, many of the selected concepts in their expert mind map at sub- and sub-sub-level turned out to be less relevant or too abstractly formulated for students. A more careful selection of the concepts in the expert mind map might have enhanced the degree of “coverage” of the core curriculum. This finding is congruent with the experiences of Mitchell et al. (2017), who suggested that framing Big Ideas is not simple and that teachers need more support in developing them. Second, it seems necessary to clarify which factors need to be taken into consideration to support the visualization of collective knowledge development in mind maps. Considering the positive effects on student learning outcomes, the challenge is to encourage all participating teachers to visualize collective knowledge construction. Finally, we conclude that guiding effective student questioning by a mind map supported scenario enhanced the learning outcomes of most students. However, not all students benefited, and teachers should be aware that some students might need additional support to internalize the collective knowledge construction. Moreover, although mind mapping as a visual tool seemed suitable for the target group, for some students it still imposed excessive cognitive load, and additional scaffolding seems necessary for them.