
Reflecting on his long-time service in the Organization for Economic Cooperation and Development (OECD), the Norwegian social economist Kjell Eide (1925–2011) described the historical relations between the Nordic countries and the OECD in education. Writing about the 1960s, Eide (1990) contended that “the Nordic countries were in a period of strong expansion and reform, and there too, it was at times valuable to have the OECD’s blessing for the political directions underlying the reforms” (p. 20). One of Eide’s main points in his 1990 essay was that the Nordic countries and the OECD have had a close relationship in education for a long time. He even hinted that the OECD has played the role of a knowledge broker in the Nordic region.

Recent research has painted the same picture. In their comparative analysis of education policies in the Nordic region, Dovemark et al. (2018) emphasized how OECD country reviews and other expert reports—often commissioned by the national governments—are used to “legitimize economic and strict educational policy decisions” (p. 125). In her recent analysis of the OECD’s role in the governing of education in Sweden, Grek (2020) argued that the OECD takes up a position as a boundary organization “constructing a very carefully maintained equilibrium amongst the different powers and interests of the actors in the field” based on “a hybrid of both knowledge and policy closely intertwined.” More specifically, the hybrid consists of “hard numbers, administrative advice, managerial know-how and best practice recommendations in a big, versatile, complex and ever-changing mixture of facts and values” (p. 17). Important nuances to these pictures emerge from the previous chapters in this volume.

In this chapter, we follow this trail of research into OECD–Nordic relations in education policy by applying a specific focus on the relations between the OECD and national knowledge brokers. In other words, the chapter investigates the extent to which the OECD, via its relations with national institutions, has infused policy change in the Nordic region. As such, we do not focus on the OECD as an actor that impacts national school reform; rather, we examine the relation between the OECD and national policy actors that, at critical stages, draw on the authority of the OECD to develop and substantiate their own national reform strategies. The chapter offers an in-depth analysis of Denmark, Finland, and Iceland as empirical cases to understand the nexus or assemblage of the OECD with national institutions serving as knowledge brokers in the Nordic region.

The relevance of this perspective is supported by the fact that the OECD does not have the mandate or the ability to dictate policies in member countries. As many researchers have noted, the OECD operates with a distinct soft power mode of governance (Bieber & Martens, 2011; Mundy et al., 2016; Steiner-Khamsi, 2019). One example of this soft governance is the peer pressure associated with multilateral surveillance among member countries in the OECD. In this respect, Morgan and Volante (2016) pointed out that “the OECD has pursued a strategy of ‘soft’ persuasion that naturalizes the idea that performance in a series of measurement exercises represents educational quality” (p. 778). This strategy is underpinned by what John Krejsler (2019) has called a “fear of falling behind” among PISA-participating nations.

In terms of education policy reforms, the previous chapters in this book have amply demonstrated that a number of national institutions serve as key providers of knowledge. They serve as arbiters, brokers, producers, and mediators of knowledge and policy flows between transnational, national, and local spaces. From an OECD perspective, they might even be described as bridgeheads or intermediaries for the dissemination and impact of OECD policy recommendations and policy instruments.

In a theoretical sense, this observation might be expressed using the concept of “instrument constituencies” (Béland & Howlett, 2016). According to Simons and Voß (2018),

Policy instruments […] are not only “active” or “alive” because they contain scripts for reordering society […] but also because they gather a constituency comprised of practices and actors oriented towards developing, maintaining and expanding a specific instrumental model of governing. (p. 31)

In their analysis of the OECD, Verger et al. (2019) drew on the same concept using Kingdon’s (2003) terms, pointing out how the potential for OECD governance mechanisms to advance agendas is rooted in a “[…] capacity to open a policy window through which the problem, policy, and politics streams are affected in a relatively coordinated and coherent way” (p. 236). In this sense, the chapter contributes to our understanding of the OECD as a policy actor, even though that is not our specific focus, because we unpack the role of and relations between the OECD and national institutions in the three case countries.

Arguing the Relevance and Context of the Three Case Countries

The following sections briefly outline the interactions between the OECD and the three case countries. This outline serves as the rationale for selecting them as cases.

The OECD has a history of influencing and making recommendations for the Danish field of education (Ydesen, 2021). For instance, in April 1963, the Danish Ministry of Education established an economic and statistical section in response to an OECD request in the program for Educational Investment and Planning (Ydesen & Grek, 2019). In 2004, an OECD report, produced at the request of the Danish government, found that Danish education research was too unfocused and called for the establishment of a clearinghouse for educational research in Denmark (Krejsler, 2017; OECD/CERI, 2004). The Danish Clearinghouse for Educational Research was established in 2006. The report also emphasized the importance of establishing an evaluation culture, which led to the implementation of national testing in compulsory education in subsequent years (Shewbridge et al., 2011). Another key initiative following from the 2004 OECD report was the formation of the School Agency, which had an explicit focus on evaluation culture and improving quality in the public school system. In 2012, the OECD identified Denmark as one of only three countries where the PISA results have had an “extremely” big impact on educational policies and practices.

The OECD has been a frequent collaborator in Finnish education policy as well, and Finland has received additional international attention due to its high-scoring performance in PISA. As described in Chap. 5, researchers investigating Finnish education disagree on how direct the OECD’s influence on national education policy is. It is clear that the main higher education reforms in Finland have been preceded by OECD reviews (Kallo, 2009) and that there is an element of using the OECD as a clearinghouse for higher education reforms (Kauko, 2011); however, the effect of the OECD on primary and secondary education is more debatable. Research on this effect has focused largely on PISA. Sahlberg (2011) argued that PISA success has resulted in a lack of innovation in primary and secondary education. However, Seppänen et al. (2019) found that Finnish governments have been noticeably active in comprehensive school policies during the new millennium. Kauko et al. (2021) argued that PISA has been compartmentalized from national reviews and has thus had a limited effect. Rautalin (2013) pointed out how PISA results were used to strengthen the views of interest groups and government officials, with little criticism in the media.

The OECD has a long history of influencing education in Iceland, starting explicitly in the 1960s, when the minister of education introduced a human capital approach through an extensive examination of the education system (Guttormsson, 2008, pp. 88–89). This examination was one of the building blocks of the comprehensive schooling act of 1974. The OECD began undertaking examinations of compulsory schooling in 1986 (Guttormsson, 2008, p. 264f). The OECD’s most apparent influence on Icelandic education has come through the PISA measurements, which have substantially shaped Icelandic educational discourse for the last 18 years. After the financial crash in 2008, the economic and governmental system came under heavy criticism (Oddsdóttir, 2014). In recent years, Iceland has strived to rebuild its education system, and the OECD has played a role in shaping the discourse in recent educational policy papers aiming for a more professional and transparent system.

Research Questions

Building upon these exemplary connections and policy flows, we hypothesize that there are very strong interactions in education between the OECD and Denmark, Finland, and Iceland as cases. Drawing on the findings of the national chapters in this volume, the objective of the chapter is to analyze the policy flows between the OECD and the three case countries, as well as the political capital created by the OECD and its use in the national contexts. More specifically, we aim to investigate the gearing, entry points, and interactions in the links between the OECD and national institutions in infusing policy change.

In terms of policy reforms, we focus on the same reforms as those analyzed in the respective national chapters in this volume. In this sense, the chapter complements the other chapters of this volume through analyses of the transnational policy flows and the knowledge brokers behind the education reforms in each case country. In pursuing this aim, we are guided by three research questions:

  1. Which policy instruments connect the OECD with each national context?

  2. Which national institutions are the central providers of evidence for national education reforms?

  3. How are these institutions located in the national fields of education, and to what extent do they serve as knowledge brokers between the OECD and the national contexts?

Methodology and Chapter Structure

Our methodological approach takes its starting point in the Foucauldian idea of bringing knowledge and power into one analytical field, assuming that the two are connected and mold each other in their interrelation (Popkewitz & Brennan, 1998). We treat institutions and experts as agents positioned in a privileged way that allows them to be the providers of seemingly objective knowledge underpinning education reforms while, at the same time, exerting and institutionalizing power relations and power discourses in the political field of education reforms. Understanding the workings of this mechanism is vital for understanding the nexus between the OECD and the Nordic region.

The methodological recipe employed in the chapter consists of three analytical steps aligned with the three research questions. The first step pinpoints the central policy instruments connecting the OECD with each national context and identifies the central institutions associated with these policy instruments. According to Lascoumes and Le Galès (2007), a policy instrument may be defined as:

a device that is both technical and social, that organizes specific social relations between the state and those it is addressed to, according to the representations and meanings it carries. It is a particular type of institution, a technical device with the generic purpose of carrying a concrete concept of the politics/society relationship and sustained by a concept of regulation. (p. 5)

In the case of the OECD, the main policy instruments are policy reviews (e.g., country or thematic reviews), global progress reports (e.g., Education at a Glance), and international large-scale assessments (e.g., PISA and TALIS). Following the definition above, such policy instruments resonate in national institutions (e.g., universities, sector research institutions, and consortia), allowing knowledge brokers to operate between the OECD and the Danish, Finnish, and Icelandic contexts.

The second step relies on descriptive bibliometric statistics and content analysis of national and Nordic policy documents. This step serves the purpose of identifying the significance and centrality of the national institutions associated with the OECD. We use documents from three different reforms in Denmark, Finland, and Iceland; the bibliometric data come from white and green papers in each context. We draw on the policy instruments identified in the first step and analyze how the national institutions are engaged in the translation of these instruments into the national contexts.
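
To make this step concrete, a minimal sketch of how cited publishers could be counted across the bibliographies of white and green papers is given below. The reference entries, document names, and publishers are invented for illustration and do not reproduce the project’s actual reference database.

```python
from collections import Counter

# Hypothetical reference entries extracted from white and green papers;
# each tuple is (citing policy document, cited document, publisher).
references = [
    ("white_paper_2013", "PISA 2009 results", "OECD"),
    ("white_paper_2013", "National well-being survey", "Ministry of Education"),
    ("green_paper_2012", "Evaluation of teaching methods", "SFI"),
    ("green_paper_2012", "School quality report", "Ministry of Education"),
]

# Count how often each publisher appears across all policy documents.
publisher_counts = Counter(publisher for _, _, publisher in references)

total_citations = sum(publisher_counts.values())
for publisher, count in publisher_counts.most_common():
    share = 100 * count / total_citations
    print(f"{publisher}: {count} citations ({share:.0f}% of all references)")
```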

The third step employs a contextual analysis of the institutions and agents to locate them in the respective national fields of education. We use open sources to conduct this analysis of key institutions and agents (Menashy & Verger, 2019). Finally, we look at the significance and position of these national institutions and the OECD in the national educational fields.

In the concluding discussion, we look comparatively across the three cases and offer insights into the research questions. In this sense, the chapter illuminates the configuration and workings of the OECD-centered epistemic community forming the modes of knowledge and governance woven into the Nordic education fabric.

Denmark: A Contested Field of Education Evidence and Research

Looking at the most central OECD policy instruments playing into the Danish reform process of 2013, it is clear that global progress reports and international large-scale assessments (ILSAs) are most prevalent in terms of citations. The OECD was the most frequently cited international publisher, with the most important document being the 2009 PISA results (OECD, 2010a). Thus, OECD policy instruments served as important points of orientation among key agents in the Danish education policy field.

Despite the large number of studies in Table 11.1, the OECD was only the fifth most cited publisher in the Danish policy documents. Clearly, a number of other knowledge providers were at play in the 2013 Danish education reform. Three of the knowledge providers that were cited more often than the OECD were government organizations, namely the Ministry of Education itself and government-funded sector research institutions (see Chap. 4 in this volume). They were responsible for no less than two-thirds of all citations in the policy documents (Fig. 11.1).

Table 11.1 Cited documents published by the OECD in the Danish database
Fig. 11.1 Most cited publishers in the Danish policy documents (number of citations): Ministry of Education, 57; EVA, 16; SFI, 16; SAGE Publications, 11; AKF, 7; Oxford Press, 7

The role and significance of the OECD as a knowledge provider cannot be determined by the number of citations alone. The OECD also influenced the mindset and the constituency surrounding the reform process. A key point of orientation here is the knowledge brokers operating between the OECD and the Danish education field, who can be found in the shifting consortia tasked with conducting the PISA surveys. At the time of the reform, the Danish PISA consortium consisted of the Danish School of Education (Aarhus University), Statistics Denmark, and the Danish Institute of Government Research (AKF). The consortium thus included a university, a sector research institution, and the national statistical service, all based in the greater Copenhagen area.

The chairman of the PISA consortium since 2000 was Professor Niels Egelund from the Danish School of Education. Egelund was the leading figure behind the report Danske unge i en international sammenligning [Danish Youth in International Comparison], which reported on the results of PISA 2009, and he was a member of the School Agency chairmanship. Another leading figure was Professor Lars Qvortrup, who at the time was dean of the Danish School of Education and who worked closely with Egelund. It is striking that Egelund, Qvortrup, and their colleagues, Professor Jens Rasmussen and Andreas Rasch-Christensen, head of research at VIA University College, served on a number of ministerial committees and institutions surrounding the reform. Generally, these four prominent agents, who command considerable capital in the Danish field of education, have been very vocal and visible in the whole reform process, beginning with the preparatory work and continuing to the evaluation of the reform.

However, looking at the configuration of the field of education research in Denmark paints a picture of a rather acrimonious research environment. The 2013 school reform has been a particular bone of contention. The researchers mentioned above, who were associated with OECD policy instruments and the development of education policy, constitute one camp in the field, whereas a host of critical researchers make up another. Most notably, in his PhD dissertation, Keld Skovmand (2017) claimed that the 2013 education reform was not grounded in evidence or knowledge. These debates are still ongoing some seven years after the reform was implemented. Thus, it is fair to say that Danish education research often finds itself in a very toxic environment with significant antagonism between at least two main clusters: one pursuing evidence-based, what-works research, and the other adhering to pedagogical ideals of Bildung and emancipation and to a notion of pedagogy as a unique field with its own values and contributions (Rømer, 2017).

In this environment, the Ministry of Education has pursued its own agendas and priorities without explicitly engaging or siding with either camp. Preoccupied with these agendas and priorities, the Ministry has stopped short of making explicit connections between OECD policy instruments and Danish education reform. This somewhat withdrawn role has provided ample space for professional and academic debates to unfold, and perhaps for the trenches to be dug deeper.

Combining these insights with the findings in Chap. 4 of this volume indicates that the OECD lent authority to knowledge by providing political capital in the form of evidence. This capital was picked up by, or extended to, national agents who were able to shape the Danish education agenda in accordance with the ambitions and instruments featured in the 2013 reform.

Finland: State-Centered Production of Data

Of the OECD policy instruments under scrutiny in this chapter, the most relevant for Finland’s education system are policy reviews and international large-scale assessments. Research has documented more national policy changes in relation to the former, while the latter has been seen as serving more of a legitimation purpose (e.g., Rautalin, 2013; Rinne et al., 2004; Sahlberg, 2011; Seppänen et al., 2019). In the OECD electronic archive, which starts from 2005, there are three policy reviews that discuss primary and secondary education in Finland: a thematic review on equity in education (OECD, 2005), a country case study on digital learning resources (OECD, 2008a) conducted as part of a Nordic report (OECD, 2009), and a school leadership report in which Finland was one of the case countries (Pont et al., 2008). There is also one review from 2006 addressing tertiary education (OECD, 2006) that has proved influential (Kallo, 2009; Kauko & Diogo, 2011). Finland features in the global progress reports entitled Education at a Glance, and the country has participated in all OECD ILSAs apart from TALIS in 2008 (Sivesind, 2019). Finland was first in PISA in reading (2001), mathematics (2003), and science (2006) before dipping slightly in reading to second (2009) and in science to fifth (2015), and more dramatically in mathematics to 12th (2012; OECD, 2002, 2004, 2006, 2010b, 2014, 2016). While such results might have caused debates resulting in dramatic changes in other countries, the provision of education and the basic principles of the comprehensive school have remained the same in Finland. One of the biggest reforms has been the Core Curriculum Reform of 2014, as described in Chap. 5. In this subsection, we analyze the curriculum reform, drawing on the bibliometric database, before scrutinizing the OECD reports and analyzing the networks working with ILSAs.

The findings from our bibliometric analysis reveal that the most important sources of knowledge for the 2014 curriculum reform can be divided into four main groups. Table 11.2 displays all publishers with more than ten references, accounting for 51% of all Finnish references in the database. The most important group contains government organizations, specifically the Finnish National Agency for Education and the Ministry of Education and Culture; these organizations published around one-third of all references in the Finnish documents. The next group includes two universities, the University of Jyväskylä and the University of Helsinki, with around 12% of all references. The third group comprises national and international publishers issuing both popular and scholarly works (PS Publishing, Werner Söderström Limited Company WSOY) or scholarly works only (Taylor & Francis); this group accounts for 6% of the referenced material (35 references). The fourth group consists of OECD Publishing, with 14 references (2%). In summary, over 60% of all references in the Finnish database were published either by state or university actors, and the OECD, the focus of this chapter, played only a minor role in terms of direct references.
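
As a rough sketch of how such grouped shares can be derived from publisher-level citation counts, the following example aggregates hypothetical counts into the kinds of groups described above. The figures and group labels are invented and do not reproduce the actual data behind Table 11.2.

```python
# Hypothetical publisher-level citation counts and group labels;
# the numbers are invented for illustration only.
citation_counts = {
    "Finnish National Agency for Education": 110,
    "Ministry of Education and Culture": 75,
    "University of Jyväskylä": 40,
    "University of Helsinki": 28,
    "PS Publishing": 20,
    "Taylor & Francis": 15,
    "OECD Publishing": 14,
}
publisher_groups = {
    "Finnish National Agency for Education": "government organizations",
    "Ministry of Education and Culture": "government organizations",
    "University of Jyväskylä": "universities",
    "University of Helsinki": "universities",
    "PS Publishing": "other publishers",
    "Taylor & Francis": "other publishers",
    "OECD Publishing": "OECD",
}

total_references = sum(citation_counts.values())
group_totals: dict[str, int] = {}
for publisher, count in citation_counts.items():
    group = publisher_groups[publisher]
    group_totals[group] = group_totals.get(group, 0) + count

# Report each group's share of all references, largest first.
for group, count in sorted(group_totals.items(), key=lambda item: -item[1]):
    print(f"{group}: {count} references ({100 * count / total_references:.0f}%)")
```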

Table 11.2 Most cited publishers in the 2014 Finnish curriculum reform green and white papers

Table 11.3 lists the 14 OECD-published documents that are referenced in the 2014 Finnish curriculum reform. Half are directly linked to numeric education indicators (i.e., global progress reports and ILSAs); specifically, five are PISA-related, and two are linked to Education at a Glance. Apart from one source (Understanding the Brain), all policy reviews were closely linked to the content of the curriculum reform: skills, competences, future projections, and career guidance.

Table 11.3 Cited documents published by the OECD in the Finnish database

Outside the reference database, the backgrounds of the reports on Finland published by the OECD and the social networks behind them reveal a more nuanced picture. The background information of the OECD policy reviews makes it clear that the main informants for the OECD teams came from ministries, universities, and interest groups. In the visiting programs of the two review teams (OECD, 2005, 2008a) and the working group writing the background memo (Ministry of Education, 2007) for the leadership report (Pont et al., 2008), 86 names of informants are mentioned. The main groups are 24 officials from ministries or the National Agency for Education, 16 university researchers (mainly from the University of Helsinki or the University of Jyväskylä), 12 labor market organization representatives, 10 education interest group representatives, 8 schoolteachers or principals, and 6 representatives from cities or municipalities. More women (n = 49) than men (n = 37) were interviewed by the review teams. The number of people interviewed does not necessarily reflect the impact of individual institutions on the reports, as many of the interviews took place in groups.

When considering OECD ILSAs, two institutions and communities of experts are important. The Ministry of Education and Culture has contracted the implementation of PISA to one or both of two organizations: the Finnish Institute of Educational Research at the University of Jyväskylä and the Centre for Educational Assessment at the University of Helsinki. The former has also been responsible for TALIS. During and after the curriculum reform, the contractor has been a consortium of the two (Opetus- ja kulttuuriministeriö, 2013, 2015, 2018). The universities of Jyväskylä and Helsinki and their research centers seem to be the main hubs for OECD data expertise in Finland: they were the most relevant actors in the OECD review visits, and they are also responsible for implementing ILSAs in Finland. The connection appears to be institutional, as the names of the experts interviewed for the reviews and those conducting ILSA research do not overlap, except for Professor Jouni Välijärvi, who was the head of the Finnish Institute of Educational Research until 2017. Nevertheless, these two universities are also the main publishers featured in the evidence production for the curriculum reform analyzed in Chap. 5.

When these arrangements are considered together, the picture of the main institutions providing knowledge for decision-making in education starts to unfold. The production of knowledge in the case of the curriculum reform was largely aligned with the picture from the country review visits. As noted in Chap. 5, and as we have seen in the Danish case, there is a distinct state-centeredness in data production through the involvement of the Finnish National Agency for Education and the Ministry of Education and Culture. In addition, the University of Helsinki and the University of Jyväskylä stand out on the academic side.

Iceland: OECD as a Leading External Source

Iceland features a well-established and systematic state-centered production of OECD data for all school levels. The Ministry of Education, Science, and Culture is a key institution that has issued many study reports, which, along with the OECD ILSA documents, played an essential role in the policy papers from 2013 to 2017. The former National Centre for Educational Evaluation was the main knowledge broker between the OECD and Iceland for ILSA documentation, covering PISA from 2000 and TALIS from 2008. In 2015, the institutional structure in the field of education was reformed when the National Centre for Educational Evaluation and the National Centre for Educational Materials merged into one institution, the Directorate of Education (act of law nr. 91, 2015). The production of OECD ILSA documents now takes place at the Directorate.

In Iceland, PISA results have always received much attention in the media. The reporting of PISA results in Iceland has shown considerable stability, as the same person, Almar M. Halldórsson, was until recently the project manager of PISA and the main mediator of what was highlighted in the Icelandic PISA results. He authored or co-authored all the Icelandic state reports from the beginning of PISA in 2000 until 2013 (Björnsson et al., 2004; Halldórsson, 2006; Halldórsson et al., 2007, 2013). He has also written academic journal articles on the Icelandic gender gap in PISA (Halldórsson & Ólafsson, 2009; Ólafsson et al., 2006). When the Directorate of Education took over the PISA project, the institution began crediting the OECD and the Directorate itself as the authors (Menntamálastofnun, 2017; Menntamálastofnun & OECD, 2019), whereas the TALIS documents continued to carry the personal authorship of Ragnar F. Ólafsson. These differing authorship practices within the same institution tell a story about Icelandic governance and its inconsistency. They also mirror the emphasis on PISA results at the Directorate, led by Dr. Arnór Guðmundsson, former head of the Education Department at the Ministry of Education, who was appointed by Illugi Gunnarsson, Minister of Education 2013–2017, as the first director of the new institution after having led the editorial work on the White Paper 2014 (WP2014) (Ministry of Education, Science and Culture, 2014a).

Most of the time, there have been weak ties between the higher education field in Iceland and the production of OECD data for the government education body, as revealed by the low percentage of academic references in the bibliographies of the policy papers (Magnúsdóttir & Jónasson, Chap. 6 in this volume). This situation changed substantially with a new policy adopted by the Directorate of Education. For PISA 2015 and 2018, professors from the School of Education at the University of Iceland cooperated systematically with specialists from the Directorate in analyzing the Icelandic results and presenting them in a public forum. In 2016, a peer-reviewed special issue on PISA literacy was published by the School of Education, University of Iceland. Thus, the academization of PISA in Iceland has become markedly more pronounced in recent years.

Having introduced the institutional arrangement of OECD data production, we now, first, analyze the types and numbers of OECD references in the bibliometric database. Second, we scrutinize the authorship and the use of references in two OECD country case reports (shaded in Table 11.4) focused on compulsory education in Iceland that were referred to in the White Paper 2014 (WP2014). The analysis of these two case study reports gives further insight into how national institutions work with the OECD when producing national evaluation and policy data.

Table 11.4 OECD documents in the Icelandic database

In general, Iceland has few publications or documents that can count as green or white papers, and few of the documents that do exist have citations and reference lists. There is no tradition of including reference lists when framing education acts or curriculum guides. However, the protocol is changing, and the most recent documents published by the state do have citations and reference lists. Only three documents published in the 2013–2018 reform period fulfilled all the requirements for this study: the only document published specifically as a white paper (WP2014; Ministry of Education, Science and Culture, 2014a) and two green papers (European Agency for Inclusive Education, 2017; Ministry of Education Science and Culture, 2014b) with reference lists suitable for bibliometric analysis. One of the green papers (European Agency for Inclusive Education, 2017) was upgraded to count as a white paper (WP2017) soon after it was published (Magnúsdóttir & Jónasson, Chap. 6 in this volume). These three documents contain a total of 203 references, more than half of which appear in the remaining green paper (GP2014), which draws heavily on domestic publishers (Statistics Iceland and the state) (Fig. 11.2).

Fig. 11.2 Most cited publishers (number of citations): Ministry of Education, Science, and Culture, 51; Statistics Iceland, 45; OECD, 18; Ministry of Finance and Economic Affairs, 6; University of Iceland, 6

Combining the bibliographic information for these three documents reveals that the OECD is the third most cited publisher and the only international one among the most cited. The Icelandic references are overwhelmingly governmental and statistical. As discussed in the national chapter, neither local nor global academia plays a big part in providing knowledge in these documents, according to the bibliographies. Of the external knowledge providers, the OECD is the most cited, and the main reason for the frequency of OECD references is the large number of citations in WP2014. Table 11.4 lists all the OECD publications included as references in these three documents.

The green paper from 2014 was a country background report written as an input to the OECD Review of Policies to Improve the Effectiveness of Resource Use in Schools; the document was prepared in response to guidelines the OECD provided to all countries. The white paper from 2017 was written by the European Agency for Inclusive Education, so WP2014 is the only document that was written originally in Icelandic by officials at the Ministry of Education without any kind of “external help or guidance” (Magnusdottir & Jonasson, Chap. 6 in this volume). Of the OECD publications in the Icelandic database, 78% are cited in WP2014. Analyzing WP2014 qualitatively with the Atlas.ti software reveals that the OECD is much more prevalent than its reference list suggests. Including in-text citations, the document contains 36 citations, 12 of which originate from the OECD. Through word counting, we determined that the OECD is mentioned 66 times in the English version of the document, on 21 of its 45 pages of text (excluding the reference list). The OECD citations in WP2014 are a mix of values, conceptual frameworks, numerical data (mainly PISA), and advice; still, numerical data are the dominant form of knowledge underpinning the advice given for Icelandic policy in WP2014.
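
A minimal sketch of this kind of frequency and page-coverage count is shown below, assuming the document text has already been split into a list of page strings. The sample pages are invented; the analysis reported here was carried out with Atlas.ti and word counting on the English version of WP2014.

```python
import re

# Hypothetical page texts from a policy document; the actual analysis
# used Atlas.ti on the English version of WP2014.
pages = [
    "The OECD average provides a benchmark for reading literacy ...",
    "National targets are set by the Ministry of Education ...",
    "Compared with the OECD average, performance in mathematics ...",
]

pattern = re.compile(r"\bOECD\b")

total_mentions = sum(len(pattern.findall(page)) for page in pages)
pages_with_mentions = sum(1 for page in pages if pattern.search(page))

print(f"'OECD' appears {total_mentions} times on "
      f"{pages_with_mentions} of {len(pages)} pages of text")
```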

Exploring the two country case reports (those shaded in Table 11.4) provides a better understanding of the knowledge production procedure. The working procedure is for the Ministry to hire Icelandic scholars or specialists to write a background report, which typically accounts for the majority of the final country case report. These background reports are not published, which partly explains the scarcity of published green papers in Icelandic governance. The OECD Study on Digital Learning Resources in Iceland (2008b) was based on six case studies in the Icelandic education system. The appendix provides a list of 47 people who participated, 27 of whom were women; none is credited as an author, and they are only mentioned as participants in the knowledge process. A group of local experts participated in informative meetings. Skúlína Kjartansdóttir, a former school principal and currently an adjunct at the University of Iceland, wrote the background report for the other case study under review, the OECD Reviews of Vocational Education and Training (2013). She was hired by the Ministry to work on this report and attended meetings with the OECD authors. Her background report is not mentioned in the reference lists, though her name appears in the acknowledgments along with some officials in the Ministry. There is no list of participants available in the report. The report has 35 references, 17 of which were published by the OECD, making it very similar to WP2014 in its heavy use of OECD references. Conversely, the OECD Study on Digital Learning Resources in Iceland (2008b) has 13 references, 5 of which are academic journal articles and only 1 of which is an OECD publication; this is more in line with GP2014, which references mainly national knowledge providers rather than OECD documents.

Concluding Discussion

All three case country analyses reveal multiple layers in their OECD-related references. On the one hand, all policy documents, both green and white papers, tend to follow the demands and credos of evidence-based policy. On the other hand, and perhaps more importantly, the documents associated with OECD policy instruments carry more weight than the mere list of references suggests. First, this finding reinforces the point discussed previously in this volume that a bibliometric reference is more than a reference: it signals commitment in addition to relaying information. Second, following the theoretical framework of this chapter, this finding leads us to conclude that the references are shaped by power relations and can be understood in terms of political capital.

Drawing on all three case analyses, we find support for the hypothesis that transnationally formed power networks are manifest in the use of references in the documents analyzed. The same Finnish network of knowledge brokers functioned in PISA data collection, national education data collection, and ministry-commissioned national data gathering. In Iceland, the weak ties between national and international organizations formed the basis for knowledge selection and use; this could be seen as a case where social capital is transferred into knowledge. In the Danish case, we also see the clear contours of a powerful national network of protagonists, associated with evidence-based policy advice in general and with the results from OECD reports and ILSA data in particular, who have been able to exert considerable influence. The basis of this influence is found both in vocal media appearances and in participation in the relevant government bodies and consortia.

A mere quantitative analysis of references would seem to suggest that the OECD plays only a minor role in national education policies, since its references are in the minority despite the OECD being an important international reference point in the analyzed country cases. However, it is our understanding that the importance of the OECD is not best captured by these references. When analyzing each of the cases in depth, we identified an appreciation of knowledge and an increased importance of national institutions gained through links to the OECD. This observation raises serious questions for bibliometric analysis and complex questions for further analysis and conclusions.

Without the pre-existing knowledge from other studies on the OECD, the quantitative analysis would probably have erred toward assigning the OECD a less important role than it has. This finding stands as a warning for further analyses. From another perspective, one could argue that this analysis now gives the OECD a more important role than it deserves, as we have dug out various networks and connections with the OECD. Some perspective on whether this argument holds could be gained by considering what the alternative results would have been. For instance, is it coincidental that the Finnish universities with more connections to the OECD are referenced more often than those without? At the very least, the power of knowledge seems to be channeled through the same hubs. The OECD is indeed powerful in forming an epistemic community and a constituency underpinning its policy instruments, more so than other international organizations in the Nordic countries. These findings correspond well with Grek’s aforementioned argument about the OECD taking up a position as a boundary organization in the Nordic region. Further studies should now move to unravel the configurations and workings of the nexus or assemblage that creates the basis for this power.