
Outcome analyses of educational interventions: a case study of the Swedish “Boost of Technology” intervention

  • Lena Gumaelius
  • Eva Hartell
  • Joakim Svärdh
  • Inga-Britt Skogh
  • Jeffrey Buckley
Open Access
Article

Abstract

In Sweden, there have been multiple large-scale interventions to support compulsory school teachers, both generally and within specific subjects. Due to the costs associated with such interventions, it is critical that interim evaluation measures exist which can indicate potential success. Additionally, evaluation measures which can capture the actual impact of interventions relative to their intended aims are needed as validation tools. The Swedish regional ‘Tekniklyftet’ or ‘Boost of Technology’ project, which ran from 2011 to 2013, is presented here as a case study exploring evaluation measures for educational interventions in technology education. Three different evaluation approaches were used as measures of the intended outcomes of the intervention: (1) analysing the preconditions which exist in schools for teachers of Technology, (2) analysing the use of local long-term technology education planning documents (school work plans) developed during the intervention, and (3) analysing the potential change over time in student performance in Technology based on national grades at the end of compulsory school. The findings gained from each approach indicate that the Boost of Technology project was a success. However, there were shortcomings associated with each approach. These are therefore discussed in the Swedish context with the intention of supporting future international stakeholders in the evaluation of interventions aspiring to develop technology education.

Keywords

Technology education · Educational interventions · Educational evaluation

Introduction

Insufficient teacher education, in this case relative to the quantity of its provision, is detrimental for any subject at primary and post-primary level. In recent years, Science, Technology, Engineering and Mathematics (STEM) education across the Swedish school system has suffered declining results in many areas. This is apparent from international studies such as the Trends in International Mathematics and Science Study (TIMSS) (IEA 2018) and the Programme for International Student Assessment (PISA) (OECD 2018), in which Sweden has shown a decline in performance since 1995 and 2000 respectively, with the exception of the most recent round of results. While student achievement is suffering in these areas, as illustrated by these studies, the importance of primary and post-primary STEM education is well acknowledged in Sweden. One way this is evidenced is through the numerous initiatives aiming at increasing achievement and interest in STEM subjects. Such initiatives have been undertaken by the government as well as by private actors, an approach which is reflective of international efforts (e.g., Hernandez et al. 2014; Mammes 2004). However, for both educational and economic reasons, determining whether the outcomes of these initiatives are in line with their stated aims needs to be addressed.

In-depth knowledge of teaching and learning in the STEM topics is based on, and depends on, knowledge of the individual subjects both individually and in combination. In this study we focus exclusively on the subject of Technology, which in Sweden is often seen as the weakest link in the STEM chain. We specifically examine a regional educational intervention, ‘Tekniklyftet’ (herein referred to by an English translation, ‘Boost of Technology’), which was conducted in the Stockholm area between 2011 and 2013. This intervention aimed at improving the provision of Technology in Swedish compulsory schools. The project was funded by the European Social Fund (ESF); approximately 15.5 million SEK (≈ 1.5 million EUR) was spent and nearly 750 unique teachers participated to varying degrees. A number of evaluation studies were performed during the intervention (e.g., Gumaelius and Skogh 2015; Hartell et al. 2015) and two further studies were carried out subsequent to its completion (Johannesson 2014; Norström 2014). Johannesson (2014) published the final external evaluation report, in which the possibilities and limitations, primarily regarding organisational issues, were highlighted. Norström (2014) analysed participating teachers’ views and understanding of the concept of technology and found considerable variation in how this concept was perceived. Norström’s (2014) results also point to these teachers being unaccustomed to reflecting on the complexity of the concept of technology. Based on these studies and the final report, the funding agency (ESF) indicated that it considered the intervention to be successful. However, to the organisers as well as to the involved educators, it is not yet completely apparent whether the intervention was successful in terms of reaching the overall goal of improving technology education in the participating schools.

Background

The need for the improvement of technology education in Sweden

In Sweden, Technology has been a mandatory subject in compulsory school for more than 30 years. However, the desired and necessary level of quality in the provision of the Technology subject at post-primary level in Sweden has not been reached (Skolinspektionen 2014; Teknikföretagen and CETIS 2012). The Swedish Technology curriculum is considered to be difficult to interpret, and the technological understanding and knowledge among those teaching the subject in Sweden varies greatly (Norström 2014). It appears as though the subject has not yet established a unique position within the Swedish educational curriculum.

In 2013, the Swedish School Inspectorate performed the first national technology education evaluation (Skolinspektionen 2014). The Inspectorate concluded that Technology often remains in the background to the benefit of the other STEM subjects, that teachers’ instruction often lacks logical progression, and that teaching is not sufficiently aligned with the current curricula. The report also highlighted the lack of access to teaching materials, including books and equipment, and suggested a lack of collegial learning and planning with other teachers, factors which were also reported in a study conducted by Hartell and Svärdh (2012). In line with Skolinspektionen (2014), Hartell et al. (2015) also identified that not enough hours of instruction are allocated to the subject. Two of the main stakeholders promoting compulsory school technology education in Sweden—the Centre for School Technology Education (CETIS) and the Association of Swedish Engineering Industry (Teknikföretagen)—undertook a similar evaluation study (Teknikföretagen and CETIS 2012) in which they report on deficient technology education provision. Additionally, according to this study there is a strong relationship between the quality of teaching in Technology and the preconditions for teaching it, i.e., the level of qualification of teachers, the adequacy and quality of the premises, and the scheduling for the subject.

The subject’s low status in Sweden (Skolinspektionen 2014; Teknikdelegationen 2010) and the fact that many Swedish teachers lack relevant training in teaching the subject (Teknikdelegationen 2009) undoubtedly create problems regarding the quality of the teaching. Reports suggest that as many as 70–93% of Technology teachers in Sweden are unqualified to teach the specific subject they are teaching (Skolinspektionen 2014; Teknikföretagen 2005). Teachers teaching Technology may have a teaching certificate, but the majority of them are trained in other subjects and not specifically in Technology. There has been, and still is, a large need for educational interventions targeting Technology teachers in Sweden, particularly at primary and post-primary level (Hartell et al. 2015). The ESF funding of the Boost of Technology intervention should be considered both within and as a response to these circumstances.

The Boost of Technology project

To address these reported deficiencies in the Technology subject in Sweden, Vetenskapens hus (English translation: House of Science), a science education centre owned by KTH Royal Institute of Technology and Stockholm University, together with its partners, secured ESF funding to launch the Boost of Technology project in 2011. The activities at Vetenskapens hus are varied in nature but always aim to increase students’ awareness of STEM subjects. Most often, students visit with their classmates for a few hours to attend a laboratory programme, but activities also include events such as Researchers’ Night or Physics in an Amusement Park, at which thousands of students can be present at once (for more information, see www.vetenskapenshus.se). Vetenskapens hus was the natural owner of this project as it serves as the link between university and school. The long-term goal was to contribute to a broad renewal of technology education in Swedish compulsory schools. Focusing on Technology teachers and their work situations, the project sought to strengthen teachers’ skills and raise the national status of the subject by including teachers who teach Technology, but also other teachers, school administrators and representatives of the local school authorities. In line with Skolinspektionen (2010) and the OECD (2009), teachers’ competence was seen as a critical factor in increasing students’ interest in the subject and encouraging more young people to pursue technology education and careers. Initially, the main focus was on the final years of compulsory school (Years 7–9); however, some schools included classes from preschool through Year 9. The improvement of technology education was defined as improvement towards teaching according to the national curriculum for Technology (Skolverket 2011a) in relation to content, resources, and pedagogy.

While many schools and teachers were involved to varying degrees in the Boost of Technology intervention, the measurable goal was to certify 20 invited pilot schools in the Stockholm region that agreed to develop technology education according to a set of goals. To receive the certification, schools had to:
  1. Account for at least one teacher having obtained 45 European Credit Transfer System (ECTS) credits in subject-specific training (i.e., in Technology and/or Technology pedagogy), to align with the level required since 2011 by teacher education programmes in Sweden for teacher certification in specific subjects (Sveriges Riksdag 2011a).

  2. Account for all Technology teachers having obtained at least 15 ECTS credits in Technology and/or Technology pedagogy (unlike the previous criterion, this requirement was decided internally within the Boost of Technology project).

  3. Submit a local work plan for the subject of Technology. The development of this plan had to be interdisciplinary; in other words, teachers from different subjects participated in the process to raise the status of the subject at the school. Both teachers and school management had to agree on the work plan.

  4. Develop collaboration with a nearby company/industry and a science centre to link technology education to reality.

Activities were led by Vetenskapens hus and involved university institutions, industry and science centres representing different areas of society, depending on the nature and aspect of technology education being focused on at a specific point in time. The project was considered a pilot study. If it proved efficient and effective, the aim was to undertake the project in other schools and areas in Sweden.

Educational interventions supporting technology education and their evaluation in Sweden

Interventions to support teachers are not uncommon in Sweden. The Boost of Technology intervention is one of multiple such efforts which have been or are being conducted and which either directly or indirectly support Technology teachers. Directly, a number of local and national educational interventions targeting schools have been undertaken since the turn of the millennium, such as ‘Teknikåttan’ (a competition in STEM for students in grade 8) (www.teknikattan.se), ‘Snilleblixtarna’ (for teachers and students in K–6) (www.snilleblixtarna.se), ‘Tekniken lyfter’ (English translations: ‘Boost of Technology’ or ‘Technology Lift’) (e.g., Lindahl and Jansson 2004), which ran from 1999 to 2003, and the currently ongoing ‘Lärarlyftet’ (English translations: ‘Boost of Teachers’ or ‘Teacher Lift’) (Skolverket 2018a). Indirectly, ‘Matematiklyftet’ (English translations: ‘Boost of Mathematics’ or ‘Mathematics Lift’) (Skolverket 2018b), which ran from 2012 to 2016 targeting Mathematics teachers, may benefit Technology teachers due to the relationship between Mathematics and Technology.

Considering the extent of the investment, both financially and in terms of time, there is a clear need to be able to evaluate these interventions. Such evaluation is needed early and continuously to decide on the merit of continued investment, and at the end to determine the effect of the intervention as a whole. Studying the effects of interventions is not a new area of research internationally or in Sweden. Lärarlyftet has been evaluated several times by the Swedish Agency of Education, mostly by interviewing involved teachers about how satisfied they are with the courses they have taken through the intervention. For example, Westlund (2012) found that teachers’ self-efficacy increased during the studied intervention. Regarding Matematiklyftet, an interim report (Karlsson et al. 2013) was produced based on a mixed-methods approach (teacher and management interviews, observations of classroom activities, and text analyses of formal documents). Additionally, the final report highlighted the success of the project (Österholm et al. 2016).

In line with these studies, there is now a need to reflect on and evaluate the Boost of Technology intervention. Many different approaches can be and have been used to determine whether technology education is of an acceptable quality. For example, Timperley (2011) shows that appropriate methods include measuring student achievement outcomes and attitudes. According to Teknikföretagen and CETIS (2012), there is a strong relationship between the conditions available to teach Technology and the way technology is taught. In their report, a ‘formula’ for technology teaching based on six criteria is suggested:
  1. Each school should have at least one teacher certified in Technology.

  2. Professional development should be available to all teachers teaching Technology.

  3. Each school should have a subject-specific plan (work plan) for Technology.

  4. There should be dedicated time set aside for all teachers to plan and develop the teaching of Technology.

  5. The subject of Technology should be included in all participating schools’ curricula and it should be continuously taught across Years 1–9.

  6. The subject of Technology should be included in all participating schools’ curricula and it should be continuously taught across Years 1–9.

In conclusion, the importance of developing and highlighting effective interventions that aim to increase knowledge of and interest in Technology and improve the quality of technology education cannot be overstated.

Aims and research question

There are two complementary aims of this study. The first is to examine the outcomes of the Boost of Technology intervention relative to its aim of improving the provision of technology education in Sweden, beyond the remit of the previously discussed associated studies and final report. The second, and arguably more important aim as it informs future practices, is to explore ways of gaining insight into the potential future success of an intervention by examining the outcomes of the Boost of Technology intervention. The main objective is to identify possible ways to design usable evaluation methods that allow for an early insight into the efficacy of technology education interventions, so as to inform the continuation of such projects. This study is particularly interested in the institutional level, i.e., how the schools have succeeded in changing education in line with the aims of the intervention. It is critical that the evaluation method (1) is relatively cost-efficient, with relevant data that are easy to collect, (2) is repeatable, with generalizable results, and (3) permits exploration of actual improvement in the schools’ teaching practice and resultant learning. As such, the research question guiding this study is:

To what extent are the following approaches appropriate and relevant, relatively cost-efficient, and methodologically suitable, when evaluating a technology education intervention?
  • Approach 1: Exploring if and how the preconditions for teaching technology change during an intervention.

  • Approach 2: Analysing the development of work plans devoted to the subject of Technology during an intervention.

  • Approach 3: Examining changes in student performance as a result of an intervention.

The three approaches

Approach 1: exploring changes in the preconditions for teaching technology

In line with Teknikföretagen and CETIS (2012), the first selected approach was to analyse how the preconditions for teaching Technology changed over time as a result of the Boost of Technology intervention. Three of the suggested criteria were considered: (1) the education level of Technology teachers, (2) how Technology as a subject was scheduled, and (3) what material standard the participating school offered for Technology. While this approach does not provide a direct measure of the ability of teachers or of students’ results, these preconditions have been suggested as vital for good technology education provision (Teknikföretagen and CETIS 2012).

Method

Data were collected through documents required within the Boost of Technology project. The collected documents consisted of local work plans, local timetables including information on when (in which school year) Technology was taught, budgets for all subjects, the number of certified Technology teachers per school year, and the books and other teaching materials used in teaching Technology. This information was gathered, where available, at the start of the project (2010–2011) and after the project finished (2012–2013). Where necessary, telephone interviews with schools were conducted to clarify information from within this documentation.

Findings

Precondition 1: teachers’ level of qualification

To be a certified Technology school as a result of the Boost of Technology intervention, each school needed to educate at least one teacher to have 45 ECTS credits in the subjects of Technology and/or technology pedagogy. All other teachers delivering Technology in the participating schools had to have at least 15 ECTS credits.

Throughout the entire project, approximately 1000 teachers and school leaders participated in the different professional development activities facilitated through the Boost of Technology intervention. This level of participation suggests that the intervention reached a large number of Technology teachers. During the project, teachers at 17 of the 20 selected pilot schools attended training courses. The remaining three schools either already met the criteria regarding ECTS credits or had teachers who were unable to attend. On average, 1.85 teachers per school participated, and 3.45 courses per teacher were undertaken. Nine schools had fulfilled the ECTS certification requirements when the project finished, and the remaining schools were working on the last component of their competency training. During the Boost of Technology intervention, the number of accredited teachers (15 ECTS credits) in the participating schools increased by 28%, with this number expected to rise.

Precondition 2: technology as a subject within school schedules

According to the steering documents, students must receive 800 h of instruction in science and technology during Years 1–9, and school leaders are free to distribute the allocated time among the different subjects to suit their local circumstances.

At the beginning of the Boost of Technology intervention, at least four schools did not mention Technology as a specific subject at all, but during the project they addressed and changed this practice. Figure 1 shows that at the end of the project, Technology was mentioned as a subject in its own right in all local syllabi for school Years 7–9. In school Years 1–6, it seems more natural to treat the subject as part of the Science and Technology block, as nine of the 20 schools had a syllabus combining those subjects. It should be noted that schools with missing formal information were likely not to offer Technology as an explicit subject, since if they had, the documentation would have been available.
Fig. 1

Status of the Technology subject in local school syllabi at the beginning (2011) and end (2014) of the Boost of the Technology intervention in the included pilot schools (n = 20)

CETIS (2014) recommends that 200 h should be explicitly devoted to the subject of Technology in post-primary education, with 100 of those hours taught in Years 7–9. At the beginning of the project, due to a lack of documentation or clarity within the documentation, it was difficult to obtain explicit information concerning how many hours were dedicated solely to Technology. Among the 14 schools that were able to provide information at both the beginning and the end of the project, eight increased the time set aside for the subject. At the end of the project, large variations were seen in the number of hours dedicated to Technology in Years 7–9 (Min = 54, Max = 144, M = 90.40, SD = 24.94).

Although most schools treated Technology as a subject in its own right with a time plan of its own, the work plans (discussed below) showed that most participating schools also approached it in an interdisciplinary way. This shows that an interdisciplinary approach does not necessarily mean that the subject receives less attention.

Following up on this criterion was relatively easy, as all schools had information about class schedules available. However, it was difficult to determine whether Technology was actually taught in classes where it was combined with other subjects.

Precondition 3: what educational conditions are present in the schools for Technology?

Data were collected verbally (interviews with teachers and principals), in written form (documentation or reports from participating schools’ economic accounting), and from inspections of the school premises during school visits. Due to the nature of the subject, in that it frequently involves practical elements, the basic condition assessed was the premises or classroom for Technology. Figure 2 reveals a development from using general purpose classrooms for Technology at the beginning of the intervention to using specially equipped Technology classrooms at the end. This is seen as a positive development and a result of the project. It should be noted that schools with missing formal information were likely to use general purpose classrooms for Technology, as specially equipped rooms would have been detailed in school documentation.
Fig. 2

Premises used for teaching Technology at the beginning (2011) and end (2014) of the Boost of Technology intervention in the included pilot schools (n = 20)

To show the importance of appropriately equipped classrooms, Fig. 3 illustrates two examples of how material and equipment were stored and used in schools without specific Technology facilities at the beginning of the project.
Fig. 3

Examples of storage facilities for Technology materials in schools at the beginning of the Boost of Technology intervention

Participating schools were also asked about the budget for teaching materials and equipment. At the beginning of the project, 14 schools submitted their budgets, and all schools provided this information at the end of the project. At least nine schools had increased their budgets for Technology, and at least 11 schools had specific budgets for Technology at the end of the project. This is also seen as a positive development and a result of the project.

Of the schools that did not have a budget specifically for Technology at the end of the project, most included it in that for Science subjects or had a common budget for teacher teams teaching all subjects to a specific group of students. Some schools did not have an allocated budget for Technology, and of those which did, some spent 10 SEK/pupil (equivalent to 1 EUR) annually, and others spent 500 SEK/pupil (equivalent to 50 EUR) annually.

Approach 2: analysing the development of work plans devoted to technology

In Sweden, teachers are advised to work in line with the General guidelines provided by the Swedish National Agency for Education (Skolverket 2011b). The General guidelines are recommendations for how schools and kindergartens should apply rules and regulations. These recommendations must be followed unless a school can ensure that the regulatory requirements are met in a different way. According to the General guidelines, a school’s mission is to create opportunities for all students to develop as far as possible in relation to the national targets. This requires well-planned and structured teaching based on the Education Act (Sveriges Riksdag 2011b) and the curriculum and syllabi (Skolverket 2011a). Teachers should also plan for evaluating students’ knowledge and their own practice, and for communicating with students, parents, head teachers and other teachers (Skolverket 2011b).

In order to become a certified Technology school within the Boost of Technology project, each participating school had to develop a strategic, local steering document (work plan) showing how Technology would be taught throughout the entire compulsory school curriculum (Years 1–9). Since the new Education Act came into force in 2011 (Sveriges Riksdag 2011b), no legislation governing the documentation of local steering documents has been created. However, many schools and municipalities use similar documents created at the local level (sometimes referred to as local work plans or local pedagogical planning) for each subject in school. It is often argued that this kind of documentation is good for quality assurance (Eriksson 1994; Teknikföretagen and CETIS 2012). Analysing schools’ work plans for Technology is the second approach investigated here in terms of determining a cost-efficient and effective approach to evaluating educational interventions.

Method

At the start of the project, only three of the 20 schools could submit a local work plan, reflecting previous research findings (Bjelksäter 2011; Skolinspektionen 2014; Teknikföretagen and CETIS 2012). At the end of the project, local work plans (approved by the local school administration and management) from 19 schools had been submitted. The 19 work plans were analysed according to a scheme developed by the research team. The two members of the research team who conducted the analysis worked in collaboration to achieve consensus for each judgment. The seven categories under which the work plans were evaluated were determined both from important qualities mentioned within steering documents and by the Boost of Technology research team. They included:
  a. Objectives of teaching and how they are linked to the curriculum.

  b. Evaluation and assessment.

  c. Distribution of selected teaching elements between years.

  d. Collaboration between the Technology subject and other school subjects.

  e. The four perspectives given in the national curriculum (the school values): (1) the historical, (2) the international, (3) the ethical, and (4) the sustainable development perspectives.

  f. Gender.

  g. Entrepreneurship.

The work plans were graded under each of the seven categories on a 5-point scale. However, at the time of analysis, it was determined that some schools required scoring in half points to more accurately reflect variances in quality. It can therefore be considered that a 9-point scale was utilised, although verbal descriptors were not created for the intermediate points.

The scale applied for the analysis was:
  1. Aspect is not mentioned in the document.

  2. Aspect is mentioned, but no description of it is included in the document.

  3. Aspect is mentioned with a brief description, but no commentary is provided as to what or how it will be taught.

  4. Aspect is mentioned with a brief description, and what and how it will be taught are partially described.

  5. Aspect is mentioned with a brief description, and what and how it will be taught are fully described.

Findings

Every school was given an overall assessment of their work plan, which corresponds to the total number of points in this evaluation.

The highest score a school could achieve was 35 points (i.e., five points in each of the seven categories). The average score for the participating schools was 23.7 points. The highest score achieved was 32.5 points and the lowest was 9 points. Eight schools achieved over 25 points. A full breakdown of the results from the analysis of the schools’ work plans is provided in Table 1.
Table 1

Evaluation results of school local work plans

School   a     b     c     d     e     f     g     Total   M      SD
SCH1     1.5   1     2     1     1     1     1.5    9.00   1.29   0.39
SCH2     3.5   1     3     3.5   1     3     3.5   18.50   2.64   1.14
SCH3     5     4     5     1     2     1     1.5   19.50   2.79   1.82
SCH4     3     1     5     4     2     1.5   4     20.50   2.93   1.48
SCH5     4     2.5   2     3.5   2     1.5   5     20.50   2.93   1.27
SCH6     3.5   2.5   5     3     3     2.5   2.5   22.00   3.14   0.90
SCH7     5     4.5   4.5   2     3     1     2     22.00   3.14   1.55
SCH8     2     3     4.5   4.5   3     2     4     23.00   3.29   1.07
SCH9     4     3.5   3.5   3     3     4     3     24.00   3.43   0.45
SCH10    4.5   2     5     4.5   3     3     2     24.00   3.43   1.24
SCH11    4     3.5   5     5     3     3     1     24.50   3.50   1.38
SCH12    4.5   3     5     5     3.5   1.5   2.5   25.00   3.57   1.34
SCH13    4.5   4     5     3.5   2.5   5     1     25.50   3.64   1.46
SCH14    4.5   2.5   5     5     1     4     3.5   25.50   3.64   1.46
SCH15    4.5   4     5     4     3     2     3.5   26.00   3.71   0.99
SCH16    4     4     5     4     4.5   4.5   2.5   28.50   4.07   0.79
SCH17    5     3.5   5     1.5   4.5   5     5     29.50   4.21   1.32
SCH18    4     4     4     3.5   5     5     5     30.50   4.36   0.63
SCH19    4     4     4.5   5     5     5     5     32.50   4.64   0.48
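As a check on the reproducibility of Table 1's summary columns, the total, mean (M) and standard deviation (SD) for any school can be recomputed from its seven category scores. A minimal sketch using SCH1's reported scores (the SD column appears to correspond to the sample, n − 1, standard deviation):

```python
import statistics

# Category scores (a-g) for SCH1, as reported in Table 1
sch1 = [1.5, 1, 2, 1, 1, 1, 1.5]

total = sum(sch1)             # Total column
mean = statistics.mean(sch1)  # M column
sd = statistics.stdev(sch1)   # SD column (sample standard deviation)

print(total, round(mean, 2), round(sd, 2))  # 9.0 1.29 0.39
```

The recomputed values (9.00, 1.29, 0.39) match the SCH1 row, confirming how the summary columns were derived.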

Further examination of the mean result for each category (Fig. 4) illustrates which categories schools considered most and least when planning for the provision of Technology. Schools performed best in relation to their distribution of Technology across school years (category c) and in linking their objectives to the curriculum (category a). In relation to evaluation and assessment (category b), the average score was 3.00. Only three schools specifically mentioned how and what should be assessed, although most schools did include some form of assessment criteria, most often directly copied from sample material written by the Swedish Agency of Education. Category d, which describes collaboration between the Technology subject and other subjects in schools, had an average score of 3.50. This suggests that many schools included the teaching of other subjects in projects/learning activities aimed at developing Technology education. Category e, the relationship between the work plans and the perspectives in the national curriculum, had the lowest average score of 2.89. Categories f and g (gender issues and entrepreneurship respectively), are important school values that should permeate all teaching in all subjects (Skolverket 2011b). The average score of approximately 3 for both aspects can therefore be seen as low. Six schools (not the same schools for both categories) out of 19 described these aspects in a clear and relevant way.
Fig. 4

Mean scores across the seven categories for the Technology work plans in the participating pilot schools (n = 19). Error bars represent 95% confidence intervals
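The per-school and per-category summary statistics discussed above can be recomputed directly from the category scores. A minimal sketch, using only the subset of pilot schools tabulated in this section (the full study scored 19 schools):

```python
from statistics import mean, stdev

# Work plan scores for categories a-g, transcribed from the table above
# (a subset of the 19 pilot schools).
scores = {
    "SCH10": [4.5, 2, 5, 4.5, 3, 3, 2],
    "SCH11": [4, 3.5, 5, 5, 3, 3, 1],
    "SCH12": [4.5, 3, 5, 5, 3.5, 1.5, 2.5],
    "SCH13": [4.5, 4, 5, 3.5, 2.5, 5, 1],
    "SCH14": [4.5, 2.5, 5, 5, 1, 4, 3.5],
    "SCH15": [4.5, 4, 5, 4, 3, 2, 3.5],
    "SCH16": [4, 4, 5, 4, 4.5, 4.5, 2.5],
    "SCH17": [5, 3.5, 5, 1.5, 4.5, 5, 5],
    "SCH18": [4, 4, 4, 3.5, 5, 5, 5],
    "SCH19": [4, 4, 4.5, 5, 5, 5, 5],
}

# Per-school summary: total, mean, and sample standard deviation
# over the seven categories (the table's last three columns).
summary = {
    school: (sum(row), round(mean(row), 2), round(stdev(row), 2))
    for school, row in scores.items()
}

# Per-category mean across the listed schools (column means, as in Fig. 4).
category_means = [
    round(mean(row[i] for row in scores.values()), 2) for i in range(7)
]
```

The standard deviations in the table match the sample (n − 1) form, which is why `statistics.stdev` rather than `pstdev` is used here.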

While all work plans referred to the policy documents (categories a and e), which is considered a positive result, this does not imply translation into practice. Many of the work plans contained quotes from the curriculum without any specification of how the aspects in question should be enacted (choice of activities and/or aims and learning goals). In the cases where concrete suggestions for themes and activities did occur, the learning objectives (i.e., what the students should or must achieve) were rarely described. As mentioned, the work plans demonstrated that the participating schools were able to distribute the provision of Technology across different years. While this could be interpreted as an easy task, given that many of the participating schools did not have Technology represented in each year prior to the start of the project, it is seen as evidence of improvement in the provision of the subject.

Approach 3: exploring changes in student performance at a group level

The third approach explored as an evaluation method for educational interventions is to investigate the potential change over time in participating Year 9 students' grades during the Boost of Technology intervention. The national Swedish compulsory school examinations in Year 9 are considered a high-stakes assessment as they serve as a matriculation system for entry into high school. These grades therefore function as evidence of how students succeed in each subject. However, caution is advised regarding this approach in Sweden, as the Swedish grading system has been shown to possess many flaws (Gustafsson et al. 2014). It has been found, for example, that neither school assessment systems nor individual teacher assessments are reliable across different Swedish schools (Nusche et al. 2011), meaning that students received different grades depending on which school they attended. Even though there are national criteria in the form of standardised learning outcomes specified for each subject (Skolverket 2011a), it is difficult for individual teachers to grade their students based on their performance. In some subjects national tests can help teachers define the level at which their students perform, but in Technology no such tests are available. The grades are the only formal documentation collected nationally in Sweden regarding performance in Technology.

Method

The grades from Year 9 for all students at the participating schools were retrieved from the National Agency database (SIRIS), where the pertinent available data collected and analysed included the number of students examined and the mean performance in Technology for each school. Data was analysed for three cohorts of Year 9 students (ntotal = 4634). The 2010 cohort (n2010 = 1610) was selected as this was the year before the intervention started, the 2013 cohort (n2013 = 1438) as this was the year the intervention ended, and the 2016 cohort (n2016 = 1586), examined 3 years after the project finished, to explore the potential longitudinal effects of the project in these schools. In 2010 merit points were awarded according to a 3-point scale, where the highest grade corresponded to 20 points, the middle grade to 15 points, and the lowest grade to 10 points. Subsequently a new grading system was implemented (Skolverket 2011a), meaning that for the 2013 and 2016 cohorts a 5-point scale for merit points was used. The highest grade still corresponded to 20 points, followed by 17.5, 15, 12.5 and finally 10 points.
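As a small illustration of the two merit-point systems described above, a cohort's mean merit score can be computed from its grade distribution. The point scales come from the text; the per-grade counts in the usage example are hypothetical, as the paper does not report the distributions:

```python
# Merit-point scales described above, highest grade first:
# the 3-point scale applied to the 2010 cohort, and the 5-point
# scale applied to the 2013 and 2016 cohorts.
OLD_SCALE = [20, 15, 10]
NEW_SCALE = [20, 17.5, 15, 12.5, 10]

def mean_merit(grade_counts, scale):
    """Mean merit points for a cohort, given counts per grade (highest grade first)."""
    if len(grade_counts) != len(scale):
        raise ValueError("one count is required per grade step")
    total = sum(grade_counts)
    return sum(n * points for n, points in zip(grade_counts, scale)) / total

# Hypothetical 2010-style distribution over the 1610 examined students:
# 400 top grades, 700 middle grades, 510 lowest passing grades.
example = mean_merit([400, 700, 510], OLD_SCALE)
```

Note that the change of scale between 2010 and 2013 leaves the endpoints (10 and 20 points) unchanged, which is what makes the cohort means broadly comparable across the two systems.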

Findings

Figure 5 illustrates the mean performance across the 20 pilot schools for each of the three cohorts and reveals a positive trend, with a mean difference of 1.87 merit points (9.35%) between 2010 and 2016. There was a statistically significant difference between cohorts as determined by a one-way ANOVA (F(2, 4631) = 215.507, p < .001, ηp² = .085), and a Tukey HSD post hoc test revealed statistically significant mean differences in performance (p < .001) between each pair of cohorts. This trend is also apparent in the percentage of students who achieved a passing grade in Technology, with 74.50% passing in 2010, 93.56% in 2013, and 95.77% in 2016.
Fig. 5

Mean merit scores in Technology, from students in the Boost of Technology intervention pilot schools. Error bars represent 95% confidence intervals
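The one-way ANOVA reported above partitions the variance in merit points into between-cohort and within-cohort components. The following sketch computes the F statistic using only the standard library; the three small "cohorts" in the usage example are synthetic, since the study's per-student scores are not reproduced in the paper, and in practice a library routine such as `scipy.stats.f_oneway` would also supply the p-value:

```python
from statistics import mean

def one_way_anova_f(*groups):
    """F statistic for a one-way ANOVA over two or more groups of scores."""
    observations = [x for g in groups for x in g]
    grand_mean = mean(observations)
    group_means = [mean(g) for g in groups]
    k, n = len(groups), len(observations)
    # Between-group sum of squares: size-weighted deviation of each
    # group mean from the grand mean.
    ss_between = sum(
        len(g) * (m - grand_mean) ** 2 for g, m in zip(groups, group_means)
    )
    # Within-group sum of squares: deviation of each score from its own group mean.
    ss_within = sum(
        (x - m) ** 2 for g, m in zip(groups, group_means) for x in g
    )
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Synthetic example with three small cohorts whose means rise over time;
# a larger spread between group means inflates F.
f_stat = one_way_anova_f([10, 12.5, 15], [12.5, 15, 17.5], [15, 17.5, 20])
```

With the very large cohort sizes reported above (n ≈ 1500 per cohort), even the modest mean differences between consecutive cohorts yield a large F and correspondingly small p-values.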

While the trend is positive and there are statistically significant mean differences across the years, the effects cannot be attributed purely to the Boost of Technology intervention due to potential confounding variables. Comparing the performance differences in the pilot schools with those in other schools, to determine the effect of the intervention more clearly, is not possible for two reasons. Firstly, during the intervention some teachers and principals in the pilot schools moved to other schools to teach Technology and, secondly, as discussed, approximately 750 teachers engaged with the project to some degree. While the pilot schools interacted with the project the most, there is no guaranteed uncontaminated sample for comparison. Despite this, the positive trend is considered to be at least partially a result of the Boost of Technology intervention, and these results are seen as a positive indication of the project's impact.

Discussion

As previously outlined, there were two complementary aims for this research. The first was to examine the outcomes of the Boost of Technology intervention relative to its aim of improving the provision of technology education in Sweden. The findings from each of the three approaches discussed in this paper (examining preconditions, work plans, and student performance) suggest that the intervention was successful in this regard. Furthermore, when combined with the results of previously published studies on the intervention (Gumaelius and Skogh 2015; Johannesson 2014), this conclusion is further supported.

However, there were shortcomings with each approach described in this paper which are important to note, as they most likely affected the final results. First, teachers were found to move to positions at nearby schools. Second, school principals also moved to other positions. Finally, very few schools had the same school management at the end of the Boost of Technology project as at the beginning. These circumstances affected the coherence and efficiency of data collection, may have negatively impacted potential change, and invalidated the use of nearby schools as reference schools, as these schools were most likely affected as well.

The second aim of this research was to explore ways of gaining insight into the potential future success of an intervention by examining outcome measures for the Boost of Technology intervention which are (1) relatively cost and time efficient, (2) repeatable, giving generalizable results, and (3) capable of revealing actual improvement in the schools' teaching practice. Each of the three approaches from this paper will now be discussed and compared relative to these criteria.

Time and cost efficiency

When comparing cost efficiency it is assumed that labour costs derived from data collection account for the majority of the total costs. In terms of time consumed, consideration is given to whether the material sought from schools was produced for the sake of the investigation, or whether this material was already produced for other purposes within the school.

For the first approach, examining the preconditions within a school relative to teaching Technology, data collection took longer and proved more difficult than expected, which increased the associated costs. School management needed to be contacted between one and four times (approximately twice on average) over a period of several months, as the data was not routinely compiled. At the same time, the data was obtainable by all schools; the introduction of a compulsory scheme or template for compiling such data yearly could therefore easily be achieved in future interventions. For the second approach, examining school work plans, the time required for collecting data depended entirely on whether the school had used work plans before this investigation. As a result, the labour cost of collecting the material at the end of the project was relatively low in comparison with the beginning of the project. Introducing the use of work plans created substantial work for the teachers who did not already use them, but as a method of collaboration between teachers it was quickly and positively received within schools. It may seem surprising that such planning documents were uncommon; however, Bjelksäter (2011) similarly found work plans or comparable documentation to be rare. This approach is therefore not considered cost efficient, especially where the use of work plans was not already part of the schools' regular activities. For the third approach, examining student performance grades, the data was collected easily through a national database, and this is therefore considered the most cost and time efficient method with regard to data collection.

Repeatability and generalizability of results

Examining school preconditions for teaching Technology is considered both repeatable and generalizable if the parameters sought in terms of preconditions are clear and decided upon beforehand. Most of the parameters sought in this project were available to some extent but, as previously stated, were difficult to obtain. Differences between schools were observed, for example in relation to subject budgets, which meant that this precondition was difficult to compare and thus not generalizable. Examining subject work plans is judged the least repeatable approach, as there are no national or regional standards for working with such documents. However, the Boost of Technology intervention showed that when standardised templates were provided to teachers and schools, the results could be generalizable. Analysing student performance through Year 9 grades was found to be the approach which was easiest to handle and most repeatable, and it provided generalizable results. This may seem intuitive, as assessment is often a national tool for quality assurance and provides a standardised structure and criteria that are equal across all schools throughout the system. To date, almost all efforts to establish a comparative measure of school quality in Sweden have focussed on the grading system. The biggest issue for this approach in the Swedish context is the variance in performance measures between schools. While this may not be a limitation internationally, in Sweden at least more work is needed to validate the tasks teachers use before Year 9 grades are employed as a performance variable in future studies.

Measurement of actual improvement

The last and perhaps most interesting issue to discuss is whether these three approaches can actually measure real and purposeful changes within schools. Examining school preconditions for teaching a subject does not provide any insight into meaningful changes in teaching practice, as it is only a measure of improvement in the prerequisites for improved education. Having better preconditions affords a teacher better opportunities for classroom teaching, but it does not necessarily mean these will be used, or used effectively, and it does not account for teachers' general pedagogical capacity. It is nonetheless important that teachers have the preconditions they need in order to teach effectively, as previous reports confirm (Teknikföretagen and CETIS 2012). Additionally, previous research (Hartell 2015) has found that preconditions and affordances for teachers' assessment practices must be enhanced in order to bridge teaching and learning, so supporting teachers in this respect is of great importance. Therefore, while not a measure of actual changes in teaching, examining preconditions is considered necessary to ensure that appropriate prerequisites are in place for teaching Technology. Examining subject work plans or other similar planning documents also does not directly measure real changes in educational practice, as it cannot be guaranteed that enacted practice will align with planned activities. However, this approach does give insight into intended classroom activity, and into teachers' competence and ability to structure their teaching activities. It is therefore arguable that analysing work plans or similar planning documents is a more useful and sophisticated tool than studying changes in general preconditions, and that assessing work plans could be more relevant as a predictor of potential learning. Examining performance grades appears to be the most direct measure of actual changes in the provision of Technology, as grades reflect student learning.
However, this is not always the case. Uncertainty arises because the Swedish grading system is difficult to interpret (Norström 2014) and because of the variability in teachers' assessments despite the standardised national criteria. In Sweden it is also difficult to measure a change specifically in the Technology subject, as the subject is often not taught explicitly but sporadically, in conjunction with the Science subjects. Consequently, a good grade in Technology does not necessarily mean that the student is performing well in technology, due to its conflation with Science. Furthermore, this approach does not allow quick access to evidence during an intervention to determine whether efforts are having a positive effect, as such examinations are not administered regularly, nor do interim assessments carry comparably high stakes.

Conclusion

This study was a follow-up study of the Boost of Technology intervention, which aimed at improving technology education in Swedish compulsory schools. Three different approaches for measuring the effect of the educational intervention were used, analysed and compared in order to find the most appropriate evaluation approach. When triangulating the results, it is clear that they aligned, as similar effects were measured with all three approaches. This suggests that the intervention was successful and had a positive impact on the provision of Technology within the included schools. However, identifying a single 'best' approach remains difficult. Table 2 provides a summary of each approach, evaluated against the three criteria previously established.
Table 2

Evaluation of the three approaches against the criteria for a useful evaluation approach

| Approach | Cost and time efficient | Repeatable and generalizable results | Measurement of actual improvement |
| --- | --- | --- | --- |
| 1. Examining school preconditions for teaching Technology | Very efficient, if the data to be collected is agreed upon in advance of the project | Can easily be repeated with generalizable results | Only provides evidence of improved preconditions, not actual practice or impact |
| 2. Examining work plans for Technology | Not efficient, as it takes time to establish the work plans, especially if they are not standard practice | Can be repeated, but a template is needed for generalizable results | Indicates potential change in teachers' intentions, but may not reflect enacted practice |
| 3. Examining student performance over time through a national assessment | Easy to collect, as grades are set at the end of the year, but can be slow depending on examination schedules | Very easy to repeat with generalizable results, provided the grading system is not changed at national level | Provided grades are reliable, this appears to give the best indication of learning and impact |

Examining preconditions for teaching Technology is seen as the most valuable approach where a fast and reliable measure is desired. Evaluating work plans is believed to be the approach that best predicts what is intended to happen in practice; where work plans are found to mirror enacted practice, they therefore provide the best insight into change in educational provision. Finally, performance grades provide a good measure of change as they can reveal an impact on learning; however, the examinations must be purposeful, and analysis of performance is restricted by the examination schedule. The alignment of the results across all three approaches suggests that enhancing preconditions and work plans can impact teaching and learning positively. Examining preconditions and work plans may therefore act as good interim predictors of the success of an educational intervention, although further work with a more controlled intervention is needed to confirm this, while examining student performance can act as a good predictor depending on the duration of an intervention, or as a good validation tool at its end.

Footnotes

  1. From Autumn 2018 the Technology subject is required to have an explicit timetable of its own in compulsory school (Sveriges Riksdag 2017).

References

  1. Bjelksäter, Y. (2011). Vad gör egentligen tekniklärarna? In S. O. Hansson, E. Nordlander, & I.-B. Skogh (Eds.), Teknikutbildning för Framtiden: Perspektiv på Teknikundervisningen i Grundskola och Gymnasium (pp. 49–61). Stockholm: Liber.
  2. CETIS. (2014). Teknik: Ge eleverna en chans till god teknikundervisning! Tänk 200!. Linköping: Linköpings Universitet.
  3. Eriksson, M. (1994). Lokal ämnesarbetsplan. Nämnaren, 2(1), 15–18.
  4. Gumaelius, L., & Skogh, I.-B. (2015). Work plans in technology: A study of technology education practice in Sweden. In M. Chatoney (Ed.), PATT2015: Plurality and complementary approaches in design and technology education (pp. 188–194). Marseille: PATT.
  5. Gustafsson, J.-E., Cliffordson, C., & Erickson, G. (2014). Likvärdig kunskapsbedömning i och av den svenska skolan – problem och möjligheter. Stockholm: SNS Förlag.
  6. Hartell, E. (2015). Assidere necesse est: Necessities and complexities regarding teachers' assessment practices in technology education. (Doctoral Dissertation). KTH Royal Institute of Technology.
  7. Hartell, E., Gumaelius, L., & Svärdh, J. (2015). Investigating technology teachers' self-efficacy on assessment. International Journal of Technology and Design Education, 25(3), 321–337.
  8. Hartell, E., & Svärdh, J. (2012). Unboxing technology education part I—Starting point. In T. Ginner, J. Hallström, & M. Hultén (Eds.), PATT2012: Technology education in the 21st century (pp. 211–222). Stockholm: PATT.
  9. Hernandez, P., Bodin, R., Elliott, J., Ibrahim, B., Rambo-Hernandez, K., Chen, T., et al. (2014). Connecting the STEM dots: Measuring the effect of an integrated engineering design intervention. International Journal of Technology and Design Education, 24(1), 107–120.
  10. IEA. (2018). TIMSS & PIRLS. Retrieved from www.timssandpirls.bc.edu.
  11. Johannesson, C. (2014). Tekniklyftet final report. Stockholm: Vetenskapens hus.
  12. Karlsson, M., Nilsson, I., Petersson, S., Stenman, T., & Wolf-Watz, O. (2013). Utvärdering matematiklyftets utprövningsomgång. Stockholm: Ramböll.
  13. Lindahl, B., & Jansson, C. (2004). Hur gör vi med tekniken? Teknikundervisning i grundskolans tidigare år. Kristianstad: Högskolan Kristianstad.
  14. Mammes, I. (2004). Promoting girls' interest in technology through technology education: A research study. International Journal of Technology and Design Education, 14(2), 89–100.
  15. Norström, P. (2014). How technology teachers understand technological knowledge. International Journal of Technology and Design Education, 24(1), 19–38.
  16. Nusche, D., Halász, G., Looney, J., Santiago, P., & Shewbridge, C. (2011). OECD reviews of evaluation and assessment in education: Sweden. Paris: Organisation for Economic Co-Operation and Development.
  17. OECD. (2009). Education at a glance. Paris: Organisation for Economic Co-Operation and Development.
  18. OECD. (2018). PISA: Programme for international student assessment. Retrieved from http://www.oecd.org/pisa/.
  19. Österholm, M., Bergqvist, T., Liljekvist, Y., & van Brommel, J. (2016). Utvärdering av Matematiklyftets resultat: Slutrapport. Umeå: Umeå forskningscentrum för matematikdidaktik.
  20. Skolinspektionen. (2010). Rätten till kunskap: En granskning av hur skolan kan lyfta alla elever. Stockholm: Skolinspektionen.
  21. Skolinspektionen. (2014). Teknik - gör det osynliga synligt: Om kvaliteten i grundskolans teknikundervisning. Stockholm: Skolinspektionen.
  22. Skolverket. (2011a). Curriculum for the compulsory school, preschool class and the leisure-time centre 2011. Stockholm: Skolverket.
  23. Skolverket. (2011b). Planering och genomförande av undervisningen - för grundskolan, grundsärskolan, specialskolan och sameskolan. Stockholm: Skolverket.
  24. Skolverket. (2018a). Lärarlyftet. Retrieved April 17, 2018, from https://www.skolverket.se/kompetens-och-fortbildning/larare/kurser-och-ansokan.
  25. Skolverket. (2018b). Matematiklyftet. Retrieved April 18, 2018, from https://www.skolverket.se/kompetens-och-fortbildning/larare/matematiklyftet.
  26. Sveriges Riksdag. (2011a). Förordning (2011:326) om behörighet och legitimation för lärare och förskollärare. Stockholm: Sveriges Riksdag.
  27. Sveriges Riksdag. (2011b). Svensk författningssamling (2011:185). Stockholm: Sveriges Riksdag.
  28. Sveriges Riksdag. (2017). En stadieindelad timplan i grundskolan och närliggande frågor: Utbildningsutskottets betänkande 2016/17:UbU23. Stockholm: Sveriges Riksdag.
  29. Teknikdelegationen. (2009). Samverkan mellan skola och arbetsliv - flaskhalsar och framgångsfaktorer. Stockholm: Teknikdelegationen.
  30. Teknikdelegationen. (2010). Vändpunkt Sverige - ett ökat intresse för matematik, naturvetenskap, teknik och IKT. Stockholm: Teknikdelegationen.
  31. Teknikföretagen. (2005). Alla barn har rätt till teknikundervisning!: En rapport om teknikämnet i dagens grundskola. Stockholm: Teknikföretagen.
  32. Teknikföretagen & CETIS. (2012). Teknikämnet i träda: Teknikföretagens och CETIS rapport om teknikundervisningen i grundskolan. Stockholm: Teknikföretagen.
  33. Timperley, H. (2011). Realizing the power of professional learning. Maidenhead: McGraw-Hill Education.
  34. Westlund, I. (2012). Lärarlyftet: En utvärdering av upplevda effekter och processer i samband med lärares deltagande i Lärarfortbildningssatsningen inom Lärarlyftet. Linköping: Linköpings Universitet.

Copyright information

© The Author(s) 2018

Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Authors and Affiliations

  1. KTH Royal Institute of Technology, Stockholm, Sweden
