This paper presents a mixed methods study, involving 77 students and 3 teachers, that investigated the practice of Learning by Design (LBD). The study is part of a series of studies, funded by the Netherlands Organisation for Scientific Research, that aims to improve student learning, teaching skills and teacher training. LBD uses the context of design challenges to learn, among other things, science. Previous research showed that this approach to subject integration is quite successful but yields little gain in scientific concept learning. If the process of concept learning is better understood, LBD may yet prove a suitable method for integration. Through pre- and post-exams we measured, like others, a medium gain in the mastery of scientific concepts. Qualitative data revealed important focus-related issues that impede concept learning. As a result, mainly implicit learning of loose facts and incomplete concepts occurs. More transparency of the learning situation and a stronger focus on underlying concepts should make concept learning more explicit and coherent.
Science and technology play an important and increasing role in our modern world. However, international studies, e.g. ROSE (Sjöberg and Schreiner 2010), indicate this is not followed by an increasing interest in and understanding of science and technology among juveniles. To counter that, more meaningful and motivating teaching methods based on interdisciplinary teaching are necessary (Lustig et al. 2009; Osborne and Dillon 2008). In response to this the integration of science, technology, engineering and mathematics (STEM) has become a main topic within educational systems (Rennie et al. 2012) where (designing) technology, due to its wide contexts and informative activities, has the means of becoming the catalyst for integration (Clark and Ernst 2007).
Following Roth (2001), technology is described as the entire set of activities that leads initial vague ideas, through construction and testing of prototypes, to a final model. This model, including all the knowledge and skills its creation entails, solves a problem or improves a pre-existing solution. On this basis, the potential of teaching science through designing technology is that the design task provides the context for applying science knowledge, while science concepts provide content needed for design realisation. Many attempts to respond to this strong interplay of science and technology appear to be unsuccessful (Lustig et al. 2009; Osborne and Dillon 2008). Nevertheless, reasonably successful approaches like “Learning by Design” (LBD; Kolodner 2002b) show that proper integration can bring significantly better collaboration skills, meta-cognitive skills (e.g. checking work) and science and technology skills (e.g. fair testing). An essential prerequisite for success, however, is a teacher’s deep understanding of the design process. Science teachers, for example, often may not be able to adapt their science teaching methods to activities that involve designing technology (Sidawi 2009; Wendell 2008). In view of this it is obvious that technology and design teachers have to take a leading role. They therefore need to know that LBD suffers from the fact that students do not learn scientific concepts better within LBD (Kolodner 2002b; Kolodner et al. 2003a, b). This limitation is the main topic of this study, which seeks insight into why concept learning is limited and how to fortify it.
Importance for technology curricula
The introduction demonstrates that technology and design education are directive for LBD and STEM education in general. However, technology education curricula should also benefit from STEM research such as, for example, this LBD study. This is based on the Standards for Technological Literacy (International Technology Education Association 2007), where design activities are regarded as a core process of technology education. All too often, however, design is used as an instructional strategy in which product realisation receives the emphasis, often by using trial and error as strategy (Burghardt and Hacker 2004). To tackle this problem students have to notice that conceptual knowledge and design processes cannot be divorced (Jones 1997). The goal is to produce students with a more conceptual understanding of design technology (International Technology Education Association 2007). For example, students should focus on the concepts behind design realisations, such as properties of materials, construction techniques and knowledge of electric circuits, the latter of which concerns this study. Therefore, the ITEEA (formerly ITEA) emphasizes design challenges that rely on math and science knowledge to improve design performance. Against this background we should note that technology also has its own network of conceptual knowledge. It is only because of the focus of this study that this conceptual framework receives less attention; in practice it is just as important as the scientific knowledge domain. To conclude, knowledge about (the interplay of) concept learning and design processes, both important technological learning objectives, will strengthen technology curricula.
Foundations of LBD
LBD is a project-based inquiry approach in which students learn, besides skills and practices, scientific content through achieving design challenges (Kolodner 2002b). LBD is based on two educational pedagogies. First, problem-based learning (PBL): a cognitive apprenticeship approach that stimulates learning by collaboration, solving real-world problems and reflection (Norman and Schmidt 1992). Second, case-based reasoning (CBR): a constructivist model of learning that refers to solving new problems by adapting old solutions or interpreting new situations in light of similar situations (Kolodner et al. 1996). Combining both pedagogies, learning becomes problem-based, collaborative, reflective, context-related and task-related (doing and knowing): all fundamental elements in solving real-world, design-related problems. Therefore, design is a suitable context for learning, where science provides a part of the content needed for success (Sidawi 2009).
Figure 1 shows LBD is based on two cycles of activities: design and investigation. We will give a short description of how both cycles interact; a more detailed description can be found in Kolodner et al. (2003a). To achieve a design challenge students have to explore design-related skills and concepts they need to learn/know. By carrying out investigations they learn those things in order to apply them during prototyping and eventually in the final design. Investigation of this application may reveal other things they need to learn, and investigation starts again. Thus, students learn concepts and skills (science- and technology-based) that are needed for success by identifying a need to learn them, trying them out, questioning their handling and thinking, and acting again (iteration). That is how the practice of science (investigation) and technology (design) constantly interact. However, students will not necessarily identify all aspects they have to learn, and not all insights will be applied properly. Therefore, teacher-guided rituals (poster session, pin-up session and gallery walk) take place for sharing experiences and ideas among design groups, more or less similar to how engineers engage with peers and clients (Kolodner et al. 2003b). Complemented by whole-class discussions, students are assisted in understanding design-related principles and science. Table 3 shows how LBD activities take place in practice.
Because concept learning is the main focus of this study, it is necessary to illuminate how LBD aims for it. Generally, LBD points toward a constructivist mode of education, in which one learns by extracting wisdom from experiences (Kolodner et al. 1996). It fits the conceptual change model, which holds that learning is a process of personal construction and that students, in an appropriate environment, will construct a more scientific framework of knowledge if they notice that scientific conceptions are superior to their pre-task conceptions (Abdul Gafoor and Akhilesh 2013; Cobern 1994). For this, LBD challenges deliberately address cognitive conflicts where students’ existing ideas are no longer sufficient for succeeding. In line with Nussbaum and Novick (1982), and Cosgrove and Osborne (1985), LBD contains four main elements for conceptual change: (a) exploration of students’ preconceptions: preliminary phase, (b) sharpening students’ awareness of their own and others’ frameworks: focus phase, (c) investigation and explanation of the conceptual conflict: challenging phase, (d) accommodation of the new (context-free) conceptual model: application phase. Furthermore, according to literature on learning, e.g. Brandsford, Brown, Donovan, and Pellegrino (2003), LBD contains several elements that (should) promote concept learning: collaboration, reflection, contextual learning, applying what is learned, learning from failures and iteration, and connecting skills, practices and concepts.
Previous LBD research and objective of this study
From 1999 till 2003 over 3500 American middle school students (ages 12–14; grades 6–8) took part in studies that compared achievements of LBD classes to those of non-LBD classes (Kolodner et al. 2003a, b). Results show that LBD students learn scientific concepts as well as or slightly better (not significantly) than comparison students with respect to knowledge transfer (mastery outside the design context). However, LBD students performed significantly better at collaboration skills, metacognitive skills (e.g. checking work, reflection) and science skills (e.g. fair testing, using prior knowledge). So it seems LBD makes students more skilful but does not foster better concept learning. This is supported by a review of the literature on design-based science teaching (Sidawi 2009).
This is notable because, as the previous section shows, LBD theoretically provides a sound basis for concept learning. Thus, what factors impede concept learning? The results of previous LBD studies were based on a set of validated performance tasks and multiple-choice tests conducted before and after the learning intervention. A detailed analysis of the LBD practice itself received less attention, despite the fact that it could provide more insight into the process of concept learning. Therefore, this will be the main objective of this study.
To arrive at a hypothesis that states why concept learning is limited, and to provide the study with important points of interest, literature on design-based learning is helpful. Nearly all design-based science approaches are complex because many objects of integration (e.g. skills, practices, attitudes and content) are combined and remain under-exposed (Berlin and White 1994). Various studies give similar focus-related explanations for this. For example, expert designers focus on content because skills, practices and activities are familiar to them, whereas novices mainly focus on the process-related issues needed for success, in which content is largely overlooked (Popovic 2004). Wendell (2008) states that scientific content may not emerge because students focus on doing: they try to avoid unknown content areas, because of the complexity and diversity of the hands-on activities that dominate the process, and rather rely on prior knowledge and assumptions (trial and error). Thus, a lack of focus on content and a dominant process focus might cause limitations in concept learning. Therefore, this study investigates what students (senior general education) focus on during LBD and, more specifically, how and when scientific content is addressed and what students learn from it. Eventually, implications can be deduced for better concept learning and further research.
For this study the methodology of design-based mixed methods research was chosen. Besides quantitative data about learned science, qualitative data are necessary to investigate the learning process through a thorough analysis of events. The study took place in the second grade of senior general education (havo). Seventy-seven students (ages 13–14; 33 female, 44 male), spread over 3 adjacent classrooms, were involved, accompanied by 3 teachers. All students and teachers had prior experience with characteristic LBD components, but the students had no specific prior knowledge with respect to the scientific design-related content.
Design of the LBD challenge
The LBD task “Back to the Nineties” was related to the physics domain “direct current electric circuits”; design groups (3 students per group, randomly composed) were challenged to build a battery-operated dance pad that let them use their feet to sound a buzzer or flash lights. The dance pad had to consist of four operating floor pads and one main power switch. The entire activity took 5–6 class periods of 100 min and was guided by an instructive presentation and a student’s and teacher’s guide. To accomplish the task, design specifications were formulated, shown in Table 1, that stimulated the use of underlying science (A to D) and the process of decision-making and creative thinking (E to G).
Regarding specifications A to D, the most fundamental (scientific) design principles concerned proper wiring (combining series and parallel parts) and the proper use of conducting and insulating materials for floor pad creation. Figure 2 shows an example of a design outcome and wiring. To investigate and design electric circuits students used real experiments and an interactive simulation (PhET™ DC-circuit construction kit). Besides proper circuit creation, the design challenge set further scientific objectives. Table 2 shows all objectives and their initial appearance.
Furthermore, Table 3 shows which LBD stages and activities took place to guide the process and to help students to understand underlying concepts and phenomena. In addition, a few modifications, listed below, were implemented to enrich the original LBD approach. These modifications mainly concern the usage of modern learning resources.
A fully equipped (online) electronic learning environment (ELE) with guidance for each design stage, background materials regarding skills and practices and space to collect (requested) writings, pictures, sketches, simulations, etc.
The possibility of using tablets, laptops and smartphones to build a digital design diary and to access digital resources like internet and simulation software.
The obligation to build virtual simulations in addition to real experiments, based on Finkelstein et al. (2005).
Framework for learning
Because the students’ focus, and the way they use and learn science from it, is the main topic of this study, literature was studied to become more informed about elements related to concept learning. This resulted in three important, closely connected, elements that were helpful in collecting and analysing qualitative data. According to Horton (2006), as shown in Table 4, a learning activity has three essential types of interaction that should contribute to learning. Within these interactions five important intertwined activities can be specified. Perhaps not surprisingly, all elements in Table 4 are to a greater or lesser extent part of the LBD approach.
To get informed about (the advancement of) students’ mastery of content knowledge, pre- and post-exams (multiple choice) were used. The same exam was used for pre- and post-testing, and a control group (N = 26), not taught the task-related content, was used to rule out learning effects from taking the test itself. Questions were based on validated multiple-choice tests that have proved to uncover students’ (mis)conceptions (Engelhardt and Beichner 2004; Licht and Snoek 1986; Niedderer and Goldberg 1993). The exam consisted of 20 objective-linked questions. Each objective was served by pairs of similar conceptual and contextual questions to investigate differences in de- and recontextualisation (transfer). Figure 3 shows two examples of paired questions.
During the challenge direct non-participant observations took place to investigate students’ and teachers’ behaviours and actions. The event- and scan-based observations mainly focused on occurrence, frequency and (indirectly) absence of events. The observations were guided by observation forms to respond to the simultaneous occurrence or close temporal proximity of events. These forms included a list of behaviours and events, grouped by the learning-related key elements mentioned in Table 4, with space for describing the observation in detail.
During the learning task sound recordings were made of teacher instructions, teacher–student interaction, collaboration between students within design groups and class activities. Sound recordings provide authentic data (regarding the content of explication, reflection and feedback) and express students’ thoughts and use of science vocabulary, especially because students were encouraged to think aloud.
Questionnaires (mostly open-ended) were used to ask students to reflect on the learning process. Questions were based on the STARR-method that provides a framework for proper reflection (Verhagen 2011). Especially, students were asked to express their opinion on learning outcomes, disturbing elements and activities that stimulated learning. Questioning took place after the learning intervention and included all students.
For deeper understanding of students’ answers, retrospective interviews took place at the end. Stimulated-recall techniques were used to investigate the extent to which students used science consciously; according to literature this is a rich source of data (Popovic 2004; Rennie et al. 2012; Roth 2001). In preparation, student products were studied to become informed of the science used and the successfulness of design outcomes. Visible scientific elements were noted and served as stimuli during interviews. Sixteen students, the number at which the data appeared to be saturated (Mason 2010), and all teachers were interviewed.
The results of the pre- and post-exam scores will be represented by the total number of correct answers among all students and corresponding fractions. This is performed per question, for contextual and conceptual questions separately and for all questions. The fractions will be used to calculate the gain index 〈g〉. The latter is defined as the ratio of the actual average gain (%post − %pre) to the maximum possible average gain (100 − %pre) (Hake 1998). A paired samples t test and a Wilcoxon signed-rank test are used to determine the difference between pre- and post-scores. Both tests were used because frequency analysis showed the data was only approximately normally distributed. The internal consistency of the exam was tested by calculating Cronbach’s alpha for the items within the different objectives, resulting in an average correlation. Finally, a factor analysis was used to test the (amount of) assumed objectives the exam is based on.
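For illustration, the gain index and both significance tests described above can be sketched in Python with SciPy; the per-student scores below are hypothetical, not data from this study:

```python
from scipy import stats

def hake_gain(pre_pct, post_pct):
    """Normalised gain <g> = (%post - %pre) / (100 - %pre) (Hake 1998)."""
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# Hypothetical per-student exam scores (percentage of questions correct)
pre = [30, 45, 25, 50, 40, 35, 20, 55]
post = [55, 70, 40, 75, 60, 50, 45, 80]

pre_avg = sum(pre) / len(pre)
post_avg = sum(post) / len(post)
g = hake_gain(pre_avg, post_avg)

t_stat, p_t = stats.ttest_rel(pre, post)  # paired samples t test
w_stat, p_w = stats.wilcoxon(pre, post)   # non-parametric counterpart

print(f"<g> = {g:.2f}, t-test p = {p_t:.4f}, Wilcoxon p = {p_w:.4f}")
```

Running both tests, as done in this study, guards against mild departures from normality in the score differences.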
For the qualitative data we derived guidelines for analysis from methodological literature (Boeije 2005; Trochim 2006). Table 5 gives an overview of the qualitative data collection and a brief description of the analysis.
Because observation forms were based on the learning-related elements mentioned in Table 4, those elements were also used as labels (codes) for (re)grouping the observations (A: collaboration, B: reflection, C: feedback, D: explication, E: process-related). A sixth label was added (F: miscellaneous) for observations that seemed hard to define. In addition, the type of interaction, according to Table 4, was noted. In the context of methodological triangulation we used the same labelling method for analysing sound recordings, questionnaires and student interviews, where, in the case of questionnaires and student interviews, labelling took place per question. Sound recordings were first broken down into relevant fragments, whereupon labelling started. Next, per data collection, the data was first sorted by type of interaction, in order to specify the initial focus. Second, the data was sorted by learning-related element(s) to gain insight into the learning process. This resulted in sub-categories of common content, where each sub-category was accompanied by a short description. At this stage the two researchers, guaranteeing reliability by peer debriefing, compared their findings until agreement was reached. According to the literature, inter-rater agreement can be determined by dividing the number of agreements by the sum of agreements and disagreements (Bijou et al. 1968). In our case (NA = 54, ND = 8) inter-rater agreement was 0.87, which is sufficient.
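The agreement measure of Bijou et al. (1968) used above reduces to a single ratio; a minimal sketch with the counts reported in this paragraph:

```python
def percent_agreement(n_agree, n_disagree):
    """Inter-rater agreement as agreements / (agreements + disagreements)
    (Bijou et al. 1968)."""
    return n_agree / (n_agree + n_disagree)

# Counts reported in this study: NA = 54 agreements, ND = 8 disagreements
print(round(percent_agreement(54, 8), 2))  # → 0.87
```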
Furthermore, sound recordings and interviews were analysed to get informed about an increase in learned science. First, sound recordings of student collaboration were used to investigate possible changes in verbal use of scientific terms during the process. For this, the usage of 13 predefined scientific, design-related terms was counted for different stages. Second, student products were examined by simply writing down the used science that was visible in products. Then, student interviews made clear, by looking at the quality of scientific reasoning/underpinning, whether this science was understood and used consciously.
Also, design realisations were reviewed per design specification by two experts using three categories (successful, partially successful and unsuccessful). The percentage of successful scores indicates how successfully students met the design challenge. Inter-rater agreement was established by calculating the linear weighted Cohen’s kappa.
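The linear weighted Cohen’s kappa can be computed, for example, with scikit-learn; the expert ratings below are hypothetical, not the study’s review data:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical expert ratings per design specification:
# 2 = successful, 1 = partially successful, 0 = unsuccessful
expert_a = [2, 2, 1, 0, 2, 1, 2, 2, 1, 2]
expert_b = [2, 1, 1, 0, 2, 1, 2, 2, 2, 2]

# Linear weights penalise a 0-vs-2 disagreement twice as much as 0-vs-1
kappa_w = cohen_kappa_score(expert_a, expert_b, weights="linear")
print(f"linear weighted kappa = {kappa_w:.2f}")
```

Weighted kappa is preferred over simple percent agreement for ordered categories, because near-misses (successful vs. partially successful) count less heavily than full disagreements.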
For the qualitative part we ensured validity and reliability in several ways (Hake 2004; Niedderer and Goldberg 1993). First, all methods of qualitative data collection and analysis were based on scientific literature in order to guarantee test validity, resulting in well-founded results. Second, through coding, peer debriefing and member checking a coherent and explicit chain of analysis and reasoning was provided. Third, we used direct investigation techniques in a real-world educational setting to avoid restricted experimental settings that may produce quasi-valid results because important influences are ruled out. Fourth, methodological and investigator triangulation were used to check results and to interpret findings.
Table 6 shows the pre- and post-exam results (experimental and control group) listed by objective. Cronbach’s alpha, for each individual objective, indicates that the questions have sufficient internal consistency. Regarding objective 1–6 (α1 = 0.80, α2 = 0.83, α3 = 0.71, α4 = 0.79, α5 = 0.66, α6 = 0.76) we find an overall alpha of 0.76.
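As a sketch of this internal-consistency check, Cronbach’s alpha for the items belonging to one objective can be computed as follows; the 0/1 score matrix is hypothetical:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_students x n_items) matrix of item scores."""
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of students' totals
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical 0/1 scores of 4 students on the 3 items of one objective
scores = [[1, 1, 1],
          [0, 0, 0],
          [1, 1, 0],
          [0, 1, 1]]
print(round(cronbach_alpha(scores), 2))  # → 0.63
```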
A principal component analysis suggests, according to Kaiser’s criterion (eigenvalue > 1), that 7 factors are present. However, scree plot analysis indicates, based on where the eigenvalues level off, that the data should be analysed for 6 factors. Studying the (rotated) component matrix, 17 test items (questions) across the components match the distribution of the questions across the objectives, which gives an 85 % match.
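Kaiser’s criterion mentioned above amounts to counting the eigenvalues of the item correlation matrix that exceed 1; a minimal sketch with hypothetical eigenvalues for a 20-item exam:

```python
import numpy as np

# Hypothetical eigenvalues of the 20-item correlation matrix,
# sorted in descending order as in a scree plot
eigenvalues = np.array([4.1, 2.6, 1.9, 1.5, 1.3, 1.1, 1.05,
                        0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.45,
                        0.4, 0.35, 0.3, 0.25, 0.2, 0.15])

# Kaiser's criterion: retain components with eigenvalue > 1
n_kaiser = int(np.sum(eigenvalues > 1))
print(n_kaiser)  # → 7
```

The scree plot check then inspects where these sorted eigenvalues level off, which may suggest retaining fewer components than Kaiser’s count, as happened here.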
The control group, used to determine a possible learning effect from completing the test, showed no average gain (%pre = 29; %post = 30). For the experimental group, a paired samples t test indicates the overall gain is significant, t(76) = −18.18; p < 0.001. This is confirmed by the Wilcoxon signed-rank test, which gives the same p value. Although the experimental group made significant progress, substantially more gain should be possible, because the overall gain is just enough to be called “medium” (Hake 1998). This gain is comparable to gains found in various physics(-related) course studies, including LBD (Churukian 2002; Coletta and Phillips 2005; Hake 1998; Kolodner 2002b).
The exam results regarding objectives 1, 4, 5 and 6 are consistent, where objective 5 barely shows any gain, while objectives 2–3 show an anomaly. Analysing the questions with no or low gain, two things are noteworthy. First, question 2 shows a slight decline because students applied the concept of parallel current behaviour, which was mainly important here, to a series circuit. Second, the other questions appealed to concepts that were barely exposed during the challenge (potential difference and resistance). For example, objective 5 (resistance and current flow) was addressed by the increasing number of components students had to add to their design; however, a correlation with changing current was not investigated. Objectives 1 and 4 (highest gains) were strongly appealed to during the challenge. Thus, unravelling the requested design is important to predict learning outcomes, to set objectives and to notice possible shortcomings. Finally, differences between contextual and conceptual questions were not found.
Table 7 shows how students’ final designs were scored by two experts. For these results the linear weighted kappa κw is 0.70 (lower limit = 0.60; upper limit = 0.79), so inter-rater agreement can be characterised as good or substantial. The average relative number of successes (successful) based on all specifications and both experts is 73 %. For the specifications based on proper science (A–D) this percentage is even 84 %, which implies that a medium gain in learned science was sufficient for proper design realisation.
Despite the fact that students performed reasonably well and students’ talk, as illustrated by Fig. 4, showed increasingly more scientific terms, interviews made clear, as shown in Table 8, that students lacked proper scientific reasoning. This is supported by the observation that students continuously tended to apply trial and error to complete tasks. In summary, in terms of the knowledge dimensions of Bloom’s taxonomy, scientific insights are used as isolated facts, while the explicit interrelationships that enable them to function together remain underexposed (Krathwohl 2002).
Because the students’ focus is one of the main topics of this study, students were questioned about learning outcomes. Table 9 shows experienced learning outcomes were mainly task- and product-related. Only 9 % of all replies were related to a better mastery of electricity concepts. According to the questionnaires and interviews, concepts were seen as a tool for designing a dance pad, and the latter was, perhaps logically, regarded as the ultimate goal of the challenge. This also explains why the virtual simulation was a successful tool for circuit creation while circuit operation was not sufficiently understood. Overall, as suggested earlier, our novice design students indeed focused on the process-related issues needed for success.
To investigate science-related learning incitements, students were asked to rate activities based on Table 4. Rating took place, as shown by the results in Table 10, using a five point Likert scale (very poor, poor, fair, good, very good).
Student (to student) interaction
Table 10 shows that student (to student) interaction was, according to students, least helpful for learning about electricity. In particular, self-reflection was not appreciated as a useful learning activity. Collaboration with peers scored a better rating (fair), where spontaneous collaboration was mainly triggered by the presence of the construction materials/tools and the making of sketches and drawings. This was established by counting different triggers for student–student discussion based on the observation forms (Table 11).
Regarding information seeking, interviews made clear that gathered information is not shared spontaneously among peers. The major reason for this, also demonstrated by interviews, was the inability of students to properly estimate the value of the information. Furthermore, enthusiastic, highly involved students tended to dominate collaboration or, when this had no effect, to act alone from then on, in order to finish a task as quickly as possible and experience a sense of accomplishment.
Student to teacher and content interaction
These interactions are rated equally, with circuit simulation getting the highest score for learning about electricity. However, interviews made clear, as mentioned before, that students need considerable assistance to explicate scientific insights and design decisions adequately. For this, the teacher seems to be important: 12 of the 16 interviewed students mentioned the teacher as the most reliable and important source for this kind of reasoning, and fellow students were seen as incompetent in doing this. Nevertheless, all teachers described the (guidance of the) LBD task as intensive, time-consuming, complex and a real challenge for students and teachers. Especially the process of sensitive assistance, mentioned before, seemed to be difficult. The reasons for this were mainly time constraints and the tendency to be too helpful. Students’ reactions were more or less similar and included the complexity of the design diary and of the complete challenge (mainly due to its extent and openness), experienced time constraints and, sometimes, the scarcity of moments of accomplishment. Students often mentioned finding it difficult to stay focused and to make up their minds, but nearly 72 % also mentioned that they became more motivated than usual, which was also noted by teachers. Finally, nearly one fifth of the students indicated it would be desirable to enrich the challenge by adding non-dance-pad-related tasks or content. Table 12 provides an overview of the most important criticism expressed by students and teachers.
Discussion and implications
By studying the practice of LBD, rather than emphasising pre- and post-testing, this study reveals why concept learning has its limitations, despite the fact that LBD theoretically provides a rich learning environment. It clarifies why the medium average gain found (gain index 0.35) stayed relatively low, and it offers room for improvement. For example, a previous survey of pre-/post-test data for 62 introductory physics courses based on interactive engagement (IE) methods showed gain indices up to 0.60 (Hake 1998). Those IE methods are, similar to LBD, designed to promote conceptual understanding through heads- and hands-on activities supported by peer feedback and discussion and intensive teacher guidance (Hake 1998). A main difference between those IE methods and LBD is the number and extensiveness of objects of integration, where LBD seems to be more diverse: teachers and students described the LBD challenge as complex, mainly due to its extent and openness. Time constraints, the malfunctioning of the virtual simulation and network connection, and a disturbing emphasis on the (extensive) design diary were additional negative elements. Thus, the complexity and extensiveness forced students to focus on doing the right things and delivering the requested products. Therefore, in accordance with the hypothesis stated before, students were indeed strongly product- and process-focused (What to do and what to deliver?) and regarded scientific content (What to learn?) as a tool they needed for success.
The science students learned and used for producing their design mainly became available from activities that strongly determined a successful completion of the challenge: first, the virtual simulation, which provided insight into electrical wiring, and second, teacher-driven activities (e.g. student–teacher interaction and teacher-driven class discussions) in which concepts were discussed explicitly. Therefore, the more directly concepts determined a successful design outcome, the better they were understood, a relation also indicated by Jones (1997) for technological concepts. The students’ strong focus on acting and delivering successful products, according to students the main goal, obscured the fact that those processes can (and must) increase their concept-related knowledge. As a result, concepts, certainly when they were poorly design-related, were badly or only partially understood. This lack of focus on scientific objectives and associated concepts caused the learning of isolated facts that remained, more or less, implicit. Students used more scientific terms and symbols and designed proper electric circuits, but did not achieve deeper conceptual understanding. Thus, students learned incomplete concepts, just enough for design implementation, and too few of the explicit interrelationships that are essential to master the knowledge domain (Brandsford et al. 2003; Wiggins and McTighe 2006).
This problem of incidental, implicit, informal or unintentional learning has also been found in other, non-LBD studies (Baskett 1993; Kerka 2000; Marsick and Watkins 2001; Rogers 1997). Our findings, for example, correspond to the design-related issue raised in the run-up to this study: design is used as an instructional strategy in which product realisation receives the emphasis, while more (underlying) conceptual understanding is necessary to improve both design performance and conceptual understanding. The results of this study can therefore be understood more broadly: the practice of design offers a rich learning environment, but an overall reinforcement of conceptual awareness is required.
Possibilities for improvement and further research
In light of the above, there are mainly two (interrelated) problems for which solutions need to be found: (1) reducing the complexity of the challenge without diluting the potentially rich learning environment; and (2) creating a stronger focus on domain-specific objectives and the related (scientific) concepts, making the important interrelationships explicit.
In general, a detailed analysis of the concepts (technology and science) crucial for success is necessary: when they (have to) emerge and how they are related. Such an analysis also makes clear which concepts are poorly task-related and need to be addressed otherwise (e.g. through demonstrations, lectures, further readings or experiments). Previous studies offer further important insights. To discuss and explicate the concepts students used for their products and during their collaboration, the technique of guided discussion may be helpful (Brandsford et al. 2003). This teacher-led discussion technique encourages students to share (scientific) insights and develop a deeper understanding. To emphasise and explicate the important role of concepts for design purposes, (elements of) informed design (Burghardt and Hacker 2004) might be interesting. This strategy aims for thoughtful design decisions based on scientific and mathematical concepts, without reverting to the trial and error that the students in our study tended towards. Furthermore, applying explicit instruction (Archer and Hughes 2011) and using scaffolding strategies (Bamberger and Cahill 2013) are interesting options. Both strategies help students to understand and oversee the learning process: students are guided with clear instructions, proceeding in small steps, with checks for understanding, so that all students participate actively and successfully. This focus on successful participation could also address a problem we encountered: students within design groups were sometimes not equally involved.
To conclude, LBD activities are very teacher-dependent, both through content-related choices and through sensitive assistance (guidance). This is perhaps not surprising, because the teacher plays a significant role in the success of learning activities (Bamberger and Cahill 2013; Van der Veen and Van der Wal 2012). It will therefore be valuable to study concept learning and teacher behaviour, and their interplay, in detail, in order to extract important clues for appropriate teacher conduct.
Abdul Gafoor, K., & Akhilesh, P. T. (2013). Strategies for facilitating conceptual change in school physics. Researches and Innovations in Education, 3(1), 34–42.
Archer, A. L., & Hughes, C. A. (2011). Exploring the foundations of explicit instruction. In K. R. Harris & S. Graham (Eds.), Explicit instruction: Effective and efficient teaching (pp. 1–21). New York: The Guilford Press.
Bamberger, Y. M., & Cahill, C. S. (2013). Teaching design in middle-school: Instructors’ concerns and scaffolding strategies. Journal of Science Education and Technology, 22(2), 171–185.
Baskett, H. K. M. (1993). Workplace factors which enhance self-directed learning. Paper presented at the seventh international symposium on self-directed learning, West Palm Beach, FL.
Berlin, D. F., & White, A. L. (1994). The Berlin-White integrated science and mathematics model. School Science and Mathematics, 94(1), 2–4.
Bijou, S. W., Peterson, R. F., & Ault, M. H. (1968). A method to integrate descriptive and experimental field studies at the level of data and empirical concepts. Journal of Applied Behavior Analysis, 1(2), 175–191.
Boeije, H. (2005). Analyseren in kwalitatief onderzoek. Amsterdam: Boom Lemma Uitgevers.
Brandsford, J. D., Brown, A. L., Donovan, M. S., & Pellegrino, J. W. (2003). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.
Bruinsma, M. (2003). Effectiveness of higher education: Factors that determine outcomes of university education. Doctoral dissertation, Rijksuniversiteit Groningen, Groningen.
Burghardt, M., & Hacker, M. (2004). Informed design: A contemporary approach to design pedagogy as the core process in technology. Technology Teacher, 64(1), 6–8.
Churukian, A. D. (2002). Interactive engagement in an introductory university physics course: Learning gains and perceptions. Doctor of Philosophy Dissertation, Kansas State University, Manhattan, KS.
Clark, A., & Ernst, J. (2007). A model for the integration of science, technology, engineering and mathematics. The Technology Teacher, 66(4), 24–26.
Cobern, W. W. (1994). Worldview theory and conceptual change in science education. Paper presented at the National Association for Research in Science Teaching, Anaheim.
Coletta, V. P., & Phillips, J. A. (2005). Interpreting FCI scores: Normalized gain, preinstruction scores, and scientific reasoning ability. American Journal of Physics, 73(12), 1172–1182.
Cosgrove, M., & Osborne, R. (1985). Lesson frameworks for changing children's ideas. In R. Osborne & P. Freyberg (Eds.), Learning in science: The implications of children's science. London: Heinemann.
Engelhardt, P. V., & Beichner, R. J. (2004). Students’ understanding of direct current resistive electrical circuits. American Journal of Physics, 72(1), 98–115.
Finkelstein, N. D., Adams, W. K., Keller, C. J., Kohl, P. B., Perkins, K. K., Podolefsky, N. S., & Reid, S. (2005). When learning about the real world is better done virtually: A study of substituting computer simulations for laboratory equipment. Physics Education Research, 1(1), 8.
Fortus, D., Dershimer, R. C., Krajcik, J., Marx, R. W., & Mamlok-Naaman, R. (2004). Design-based science and student learning. Journal of Research in Science Teaching, 41(10), 1081–1110.
Hake, R. (1998). Interactive-engagement versus traditional methods: A six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66(1), 64–74.
Hake, R. (2004). Design-based research: A primer for physics-education researchers. American Journal of Physics, 1–35. http://www.physics.indiana.edu/~hake.
Hennessy, S., & McCormick, R. (1994). The general problem-solving capability process in technology education. Myth or reality? In F. Banks (Ed.), Teaching technology (pp. 94–108). London: Routledge.
Horton, W. K. (2006). E-learning by design. San Francisco, CA: Wiley.
International Technology Education Association. (2007). Standards for technological literacy: Content for the study of technology (3rd ed.). Reston, VA: ITEA.
Johnson, S. (1997). Learning technological concepts and developing intellectual skills. International Journal of Technology and Design Education, 7, 161–180.
Jones, A. (1997). Recent research in learning technological concepts and processes. International Journal of Technology and Design Education, 7, 83–96.
Kerka, S. (2000). Incidental learning. Trends and Issues Alert, 4. Retrieved from http://eric.ed.gov/?id=ED446234.
Kolodner, J. L. (2002a). Facilitating the learning of design practices: Lessons learned from an inquiry into science education. Journal of Industrial Teacher Education, 39(3), 9–40.
Kolodner, J. L. (2002b). Learning by design: Iterations of design challenges for better learning of science skills. Bulletin of the Japanese Cognitive Science Society, 9(3), 338–350.
Kolodner, J. L., Camp, P. J., Crismond, D., Fasse, B., Gray, J., Holbrook, J., & Ryan, M. (2003a). Problem-based learning meets case-based reasoning in the middle-school science classroom: Putting learning by design into practice. The Journal of the Learning Sciences, 12(4), 495–547.
Kolodner, J. L., Gray, J. T., & Fasse, B. B. (2003b). Promoting transfer through case-based reasoning: Rituals and practices in learning by design classrooms. Cognitive Science Quarterly, 3(2), 1–28.
Kolodner, J. L., Hmelo, C., & Narayanan, N. (1996). Problem-based learning meets case-based reasoning. The Journal of the Learning Sciences, 12(4), 495–547.
Krathwohl, D. R. (2002). A revision of bloom’s taxonomy: An overview. Theory into Practice, 41(4), 212–218.
Licht, P., & Snoek, M. (1986). Electriciteit in de onderbouw: Een inventarisatie van begrips- en redeneerproblemen bij leerlingen. NVON Maandblad, 11(11), 32–36.
Lustig, F., West, E., Martinez, B., Staszel, M., Borgato, M. T., Iosub, I., & Weber-Hüttenhoff, U. (2009). Experiences and results from the European project ‘Integrated Subject Science Understanding in Europe’. Paper presented at the ESERA conference, Istanbul.
Marsick, V. J., & Watkins, K. E. (2001). Informal and incidental learning. New Directions for Adult and Continuing Education, 2001(89), 25–34.
Mason, M. (2010). Sample size and saturation in PhD studies using qualitative interviews. FQS: Forum Qualitative Social Research, 11(3), Art. 8. http://nbnresolving.de/urn:nbn:de:0114-fqs100387.
McCormick, R. (1997). Conceptual and procedural knowledge. International Journal of Technology and Design Education, 7, 141–159.
Murphy, P., & Hennessy, S. (2001). Realising the potential—And lost opportunities—For peer collaboration in a D&T setting. International Journal of Technology and Design Education, 11, 203–237.
Niedderer, H., & Goldberg, F. (1993). Qualitative interpretation of a learning process in electric circuits. Paper presented at the Annual Meeting of the National Association for Research in Science Teaching, Atlanta.
Norman, G. R., & Schmidt, H. G. (1992). The psychological basis of problem based learning: A review of the evidence. Academic Medicine, 67(9), 557–565.
Nussbaum, J., & Novick, S. (1982). Alternative frameworks, conceptual conflict and accommodation: Toward a principled teaching strategy. Instructional Science, 11(3), 183–200.
Oorschot, F., Ottevanger, W., Spek, W., Boerwinkel, D. J., Eijkelhof, H., de Vries, M., et al. (2014) Enschede: SLO (Nationaal Expertisecentrum Leerplanontwikkeling).
Osborne, J., & Dillon, J. (2008). Science education in Europe: Critical reflections (p. 32). London: The Nuffield Foundation.
Parkinson, E. (2001). Teacher knowledge and understanding of design and technology for children in the 3–11 age group: A study focussing on aspects of structures. Journal of Technology Education, 13(1), 44–55.
Popovic, V. (2004). Expertise development in product design-strategic and domain-specific knowledge connection. Design Studies, 25(5), 527–545.
Rennie, L., Venville, G., & Wallace, J. (2012). Integrating science, technology, engineering and mathematics. New York: Routledge.
Rogers, A. (1997). Learning: Can we change the discourse? Adults Learning, 8(5), 116–117.
Roth, W.-M. (1995). Inventors, copycats, and everyone else: The emergence of shared resources and practices as defining aspects of classroom communities. Science Education, 79, 475–502.
Roth, W.-M. (2001). Learning science through technological design. Journal of Research in Science Teaching, 38(7), 768–790.
Sidawi, M. (2009). Teaching science through designing technology. International Journal of Technology and Design Education, 19(3), 269–287.
Sjöberg, S., & Schreiner, C. (2010). The ROSE project: An overview and key findings. http://roseproject.no/network/countries/norway/eng/nor-Sjoberg-Schreiner-overview-2010.pdf. Accessed 10 Nov 2015.
Trochim, W. (2006). Web Center for Social Research Methods. http://www.socialresearchmethods.net/.
Van der Veen, T., & Van der Wal, J. (2012). Van leertheorie naar onderwijspraktijk. Groningen: Noordhoff Uitgevers B.V.
Verhagen, P. (2011). Reflectie met de STARR-methode Kwaliteit met beleid. Bussum: Coutinho.
Wendell, K. B. (2008). The theoretical and empirical basis for design-based science instruction for children. Unpublished Qualifying Paper. Tufts University.
Wiggins, G. (2012). 7 keys to effective feedback. Educational Leadership, 70(1), 10–16.
Wiggins, G., & McTighe, J. (2006). Understanding by design. Upper Saddle River, NJ: Pearson Education Inc.
van Breukelen, D.H.J., de Vries, M.J. & Schure, F.A. Concept learning by direct current design challenges in secondary education. Int J Technol Des Educ 27, 407–430 (2017). https://doi.org/10.1007/s10798-016-9357-0