Towards Smart Specialisation 2.0. Main Challenges When Updating Strategies


This article presents an overview of the framework for the design of updated smart specialisation strategies (S3) in the EU for the programming period 2021–2027. It reviews and analyses the main challenges that managing authorities face when updating or reshaping their S3 and related policies, discussing potential mechanisms to address them and contrasting these with perspectives from the literature. While a theoretical perspective is presented, the article also analyses the challenges from an applied point of view, structuring the different approaches so that they can serve as a roadmap for building an updated S3.

Introduction: From Smart Specialisation 1.0

Originally, the expression ‘smart specialisation strategies (S3)’ came fundamentally from a theoretical approach developed from the logic presented by Landabaso (1997) and by an expert group, ‘Knowledge for Growth (K4G)’, set up in the first decade of this century by the European Commission, following a practical approach rooted in the long-studied domain of innovation policy in Europe over the previous decades, as studied by Cooke and Morgan (1998). The main goal of this group was to design and establish research and innovation (R&I) strategies that could serve as the basis for long-run, sustainable and sustained economic growth. The group devised S3 as a sound approach, understanding the concept as the identification and fostering of competitive advantages in a territory (based on its relative position vis-à-vis other regions/countries), while identifying ways of collaboration and of knowledge and technology exchange within that territory, as well as with other ones, in a synergic way, enlarging the interaction among innovation stakeholders (Foray et al., 2009; European Commission, 2012; McCann & Ortega-Argilés, 2013; Foray, 2015). The concept soon caught the attention of numerous agents in the sphere of innovation policy in the EU, who started to see it as an interesting new paradigm for the European regional innovation systems (RIS).

The concept of RIS, with a vast literature behind it (to cite some good examples, see for instance Asheim & Coenen, 2005; Autio, 1998; Braczyk et al., 1998; Cooke et al., 2000; Cooke, 2003; Cooke et al., 2004; Isaksen et al., 2018), has elaborated the framework around the numerous stakeholders found in innovation ecosystems and the way they interact, within and beyond the scope of the policies targeting their activities, as well as the way these stakeholders evolve over time through these interactions and cooperation schemes (Edquist, 2005). RIS are meant to foster the spillovers coming from the synergies created through the collaboration of stakeholders and the agglomeration effects under certain industrial specialisation domains (Rip, 2002; Laranja et al., 2008). The concept of the entrepreneurial discovery process (EDP)—a key aspect of S3—is based on this. Following Foray (2014) and Asheim (2019), the EDP can be defined as the set of processes leading to discoveries undertaken by the stakeholders of a given innovation system, understood from a broad angle (Asheim & Grillitsch, 2015) and coming from either formal or informal institutions or networks (Rodríguez-Pose & Wilkie, 2015). In other words, the EDP must lead to the identification of the competitive advantages of a territory, as well as its challenges and priorities, all to be tackled through the implementation of the S3. From this EDP logic, another S3 principle arises: S3 should avoid a ‘one-size-fits-all’ approach in order to maximise the effectiveness of the strategy (Martin & Sunley, 2003), for a homogeneous innovation policy would not work efficiently (Tödtling & Trippl, 2005).

The introduction of multiannual S3-based agendas, built on the concept of the EDP, was established by Regulation (EU) No 1301/2013 of the European Parliament and of the Council of 17 December 2013 as an ex-ante conditionality for approving European funding under the Thematic Objective for Research and Innovation of the European Regional Development Fund (ERDF) for the programming period 2014–2020. Thus, most regions of the Union (and, in some cases, member states) designed their S3 agendas, defining a series of territorial specialisation domains and related priorities, for which they designed a policy mix with a series of instruments generating projects aimed at tackling these priorities.

In the academic literature, much has been analysed regarding the conceptual logic of S3 and the opportunities and challenges it faces. In a detailed bibliometric analysis, Fellnhofer (2018) identified the main research questions around this topic in previous works; they mostly relate to the theoretical concept, its drivers and benchmarking, with very limited work on actual implementation. On the policy-making side, in 2020, facing the final months of the first period of the S3 agendas (2014–2020), managing authorities have a double challenge in front of them: establishing clear conclusions about the outcomes of the strategy to date, analysing its successes and failures through a good monitoring and evaluation model; and using all that information to design an updated version of it, an S3 2.0 (as defined by the European Commission), in which they are to maintain successful points, modify what can be improved and introduce those elements that were missing and could improve and expand the results. Additionally, the European Commission is expected to introduce two additional axes for the design of the S3 linked to the ERDF for the next programming period: more features of international collaboration on the one hand, and others linked to industrial transition on the other (Vocaskova, 2020).

In this context, managing authorities face opportunities but also challenges when updating their S3 towards the programming period 2021–2027. We have divided these into three main categories, which can be summarised in three questions and three main hypotheses linked to them, as presented in Table 1.

Table 1 Main variables in the design of S3 2.0

These challenges build on the constraint that updating the strategies means envisaging the future without a clear picture of the past, as will be further discussed in “Learnings from S3 1.0 into S3 2.0”. This leads to a principal question: can S3 2.0 be constructed on major changes to the strategy and its potential impact, or is it rather a minor update of the management and the policy mix, which may amount to old wine in new bottles? To extract some conclusions, our work discusses the main challenges linked to these questions. This discussion comes from a review of the literature and the clustering of different points into the main aspects that we analyse in “Learnings from S3 1.0 into S3 2.0”, “Updating the Strategic Framework”, and “Including (or Further Developing) Key Dimensions”. For each of these items, we review some of the most relevant works in recent literature, discussing the logic of each particular objective–challenge; this shall serve to assess the actual possibilities for a major/minor update of the strategy, as discussed in “Final Remarks”.

Learnings from S3 1.0 Into S3 2.0

Challenge 1: Closing S3 1.0 Without Enough Evidence

The first challenge when updating S3, as in any other policy domain, relates to the question ‘How to reshape something for the better when it is not yet totally clear what actually worked and what did not?’. Works like Morgan (2015) have criticised the lack of an evidence base in the design and implementation of S3 throughout the first period, and the fact that decisions were made politically (more or less based on the EDP) but with little empirical proof. Measuring innovation results takes time, especially if we aim at analysing long-run outcomes. Without a complete overview of the reality of the period 2014–2020, it is hard to think that the strategy can be greatly improved towards the next period.

In their article, Dosso et al. (2018), basing their analysis on works like those of the Manchester Institute of Innovation Research (2013), Love and Roper (2015) and Martin (2016a), present the implications of transferring innovation studies and analyses into policy-making, stating that there are, of course, some challenges to be considered. However, evidence-based policy should not only include empirical results—otherwise an algorithm, through artificial intelligence, could design optimal strategies—but must also incorporate the human side of each policy, being understood as the use of actual information obtained through research integrated with more subjective elements like experience or judgement (Sackett et al., 1996). To reach such an evidence-based system, monitoring and evaluation systems that are flexible, that can be constantly updated and that are ready from the beginning of the development process need to be designed, then implemented and then used to analyse the results.

However, without such a system for S3, or any other reliable substitute, the ERDF ex-ante conditionality still requires a 7-year strategy to be in place for the period 2021–2027; S3 managing authorities therefore need to gather all the available information to improve the strategy as much as possible. To do so, we must consider the evaluation mechanisms for the closing period. Literature on the evaluation of innovation policy has been developing over the last two decades. While it is not our purpose to go into detail on the logic of this domain, works like Arnold (2004), Edler et al. (2008) or Magro and Wilson (2013) offer a good overview of the concept. Linked to the main trend presented by these and other authors, and as stated by Gianelle et al. (2019a, b), S3 evaluation systems require addressing different information needs, given the multi-level governance logic and the diverse territorial variables, avoiding overly general approaches and trying to provide clear answers. In other words, designing an efficient S3 evaluation system would allow reliable conclusions to be reached for policy improvement. However, as we will further discuss in “Challenge 10: Building a Sound Monitoring System”, this also presents some design challenges.

In a recent study of innovation policy evaluation in the EU, Borrás and Laatsit (2019) found that just 6 out of the then 28 EU member states had introduced what they name ‘system-oriented innovation policy evaluation practices’. Additionally, following Papaconstantinou and Polt (1998), there are three commitments that should be present in innovation evaluation systems, which include aspects that cannot be added ex post—and so cannot be retrofitted into S3 1.0: to be designed jointly with the policy/strategy, not after it; to include formal obligations linked to the results; and to be user-oriented, addressing the information needs of the different stakeholders.

Therefore, the first challenge (updating the strategy) becomes even larger, for there is scarce information on the actual results and impact of the strategy and there are very limited evaluation mechanisms from which to extract conclusions. This leads us to anticipate the hypothesis (which we will be able to test in the near future, when the updated versions of S3 appear) that the changes introduced in the various S3 across the EU will be limited and far from bold. However, there are five elements about which conclusions can be reached by observing the implementation, and for which it is possible to introduce relevant changes: (1) the governance system and the competences and interactions of its stakeholders; (2) the first results of the EDP; (3) the practical design and implementation management of the strategy’s instruments; (4) the smart specialisation process, in terms of the analysis of the projects in each priority domain; and (5) the usefulness of the current monitoring system (if any). We will discuss these points in further detail in the next sections. Nevertheless, we must note that these elements support only an implementation-oriented analysis, which will not tell us much about the main question: whether the strategy’s main objectives are actually being met in the long run. To reach such conclusions, more time and much better analysis and evaluation mechanisms are needed.

Challenge 2: What to Keep, What to Reshape, What to Remove

Linked to the first challenge, managing authorities face the second: without complete information, it is not evident which elements from the first period should be kept, which should be eliminated, and which should be reshaped. Strategies for choosing among these possibilities are diverse; some authors (see, for instance, Ghazinoory et al., 2019) have even attempted to design mathematical tools for choosing among different alternatives when designing an innovation policy mix. In most cases, however, choices are made based on a descriptive reflection on the previous period. We can distinguish between choices relating to the strategic scope and those linked to practical implementation.

On the former, as we shall later analyse, the role of the EDP should be at the centre, assuming that the participatory process has been—to some extent—continuously active throughout the period 2014–2020; otherwise, it should be reactivated. On the latter, decisions can be made in a more informed way, identifying what worked, what failed and what should be implemented in a more efficient manner, according to past evidence.

In parallel, another interrelated—and key—variable should be considered: the ERDF allocation for the next period, for the funds will largely support the implementation of the strategy (under the thematic objective on research and innovation). First of all, we must bear in mind that not all territories are the same; for instance, less developed regions often present what has been named the European regional innovation paradox, which refers to the fact that they require further funding while having limited absorptive capacities (Uyarra et al., 2018; Muscio et al., 2015; Papamichail et al., 2019). This difference, among other aspects, must be considered when reshaping the proposed ERDF operational programmes, which must be adapted to the territorial context and needs. Managing authorities must decide here how to allocate the budget across priority objectives and their instruments, and they can only do so from previous experience, mostly based on practical implementation rather than on any long-run information on the actual impact from any cost–benefit analysis.

Updating the Strategic Framework

Challenge 3: Enlarging and Integrating the Governance System

The role of governance in public policies has been widely studied (see, for instance, Peters & Borrás, 2009; Risse, 2012; Ansell & Torfing, 2016); when it comes to innovation policy, there is some consensus that the quality of institutional governance differs greatly across territories (Davoudi et al., 2008; Rodríguez-Pose, 2013; Rodríguez-Pose et al., 2015). While some regions with a longer innovation tradition and well-established institutions might be better prepared to implement S3, they might also face challenges when dealing with a relatively larger number of stakeholders, with many ideas and different needs (Farole et al., 2011; Trippl et al., 2019). In other cases, these differences will stem from varying governance quality according to experience and other factors (Charron et al., 2014). Therefore, a one-size-fits-all approach will not lead to a sustainable, efficient system. When assessing the changes needed towards S3 2.0, individual analyses should provide answers to questions such as ‘Did the governance model work?’ or ‘Could it be reshaped in a more efficient way?’ Two challenge variables must be analysed in this regard.

The first one refers to the concept of participatory governance and the extent of collaboration between managing authorities and external stakeholders (some sound works discussing these collaborations are those of Voorberg et al., 2015, Torfing, 2018, or Wegrich, 2019; the literature has also paid attention to the role of civil society in defining public budgets, as presented in works like Geissel, 2009, Hong, 2015, or Ewens and Van der Voet, 2019). If the EDP is the key factor in the design of S3, stakeholders—representing the quadruple helix—must also be (according to the EDP conceptualisation) part of the governance of the strategy, and that includes collaborating in the definition of the policy mix and the continuous monitoring system. However, this requires establishing realistic coordination schemes to gather information and decisions from stakeholders in an efficient and pragmatic way. Authors like Jakobsen and Thrane (2016) have even discussed whether the complexity within public bodies supports or constrains innovation; if no pragmatic collaboration mechanisms are put in place, this could lead to negative externalities.

The second variable refers to policy mix governance. Policy instruments and support mechanisms might be diverse, including different funding schemes, different participating stakeholders, different expected outputs, etc. Considering all these variables, instruments must be coordinated to foster synergies and avoid duplication, not just within the strategy but also outside it, analysing the coordination with other funding schemes and establishing connections with governance structures not directly linked to S3 (for instance, institutions managing other potentially related funding). Additionally, the pool of instruments is not always managed in its totality by the same agency, department or body; as stated by Flanagan et al. (2011), multi-level governance systems for innovation policy come with large levels of complexity (Hooghe & Marks, 2001; Bache et al., 2016; Aranguren et al., 2018). In our context, this multi-level governance can be understood from a territorial point of view (for instance, regions and states) or from a horizontal point of view, where different stakeholders—and even managing bodies of, let us say, different policy instruments—must be coordinated. Finding strategies to enlarge and improve this multiple-axes coordination could amplify the outcomes of the strategy.

Challenge 4: Improving the Entrepreneurial Discovery Process

As stated by Capello and Kroll (2016), there was never a definition of the EDP concept that was equally understood by all territories and stakeholders, with an ongoing lack of consensus (Todeva & Ketikidis, 2017). When it was used to design the S3 for their first period, it was mostly understood as a participatory bottom-up approach to identify priorities, competitive advantages and specialisation domains, following what was defended by Rodrik (2004). However, more recent works, like those of Kroll (2015) or Kleibrink et al. (2017), suggest that the EDP should also consider evidence-based aspects and an additional top-down perspective, in order to provide a more complete and objective overview for sound decisions. Reopening the understanding of the EDP means rethinking its intrinsic meaning, putting it at the centre of a permanent renovation of the system (Foray, 2014).

The EDP, like the whole S3, is to be adapted to the territorial idiosyncrasy, from a broad perspective (Asheim & Grillitsch, 2015). As one could expect, regions with well-established institutional frameworks will achieve larger and more optimal EDP outcomes (Rodríguez-Pose & Wilkie, 2015), but that does not mean that those whose institutions are still in earlier development phases cannot have a sound EDP; on the contrary, the EDP must serve as a catalyst to accelerate the process, fostering the participatory approach. Some authors, like Isaksen et al. (2018a, b), go even further, arguing that the EDP should be ‘institutionalised’ in a system of entrepreneurs if it is to achieve relevant outcomes. However, as stated by Gil and Pinto (2016), Sotarauta (2018), or Magro and Wilson (2019), the EDP does not only imply identifying domains or challenges, but also a process of interaction between social and political stakeholders with different interests and positions. Taking all these different interests and priorities into consideration is relevant to maximise the potential benefits of the EDP, though it is an arduous task.

Following the experience gathered throughout the first S3 period and the learnings obtained, including from the abovementioned works, four main criteria are to be considered when working on the EDP for the period 2021–2027. The first relates to the stakeholder composition, that is, determining which agents are to participate and how, considering the nature of the different agents and their vested, conflicting interests. The second regards the topics on which the EDP should reach conclusions, for ensuring an operative management of the process has been a major challenge throughout the first S3 (Gheorghiu et al., 2016; Fellnhofer, 2017). The third is based on the mechanisms to guarantee active and wide participation in the EDP, ensuring that objective, weighted conclusions are obtained. While framework models could be proposed to undertake the EDP (Del Castillo Hermosa et al., 2015) considering these three points, the large disparities among territories discussed above suggest that ad hoc systems are to be considered in each case. Finally, the fourth, far less explored, relates to the possibility of extending the EDP beyond the design of the strategy, enlarging it throughout the whole implementation phase, linking it to the need to design innovation policies that are flexible and ever-adapting, and in which stakeholders have a role in continuous improvement. Merging these four components into a single EDP system requires a large effort and represents a main challenge, but the whole S3 concept is built on its logic; therefore, ensuring that an EDP 2.0 becomes (even) more accessible and integrating, with a permanently ongoing role, could largely improve its results for S3, while enlarging the positive externalities that come from such a participatory process (outcomes from collaboration and networking leading to potential activities even outside the strategy).

Challenge 5: Updating the Strategic Specialisation Domains

While identifying objective specialisation patterns in the EU regions can be done in a fairly straightforward way (Esparza-Masana, 2015), selecting priority domains is more complex, for they are built on the EDP and the policy-related objectives, and establishing priority domains linked to specialisation may lead to misunderstanding the logic of the concept. As pointed out by Hassink and Gong (2019) and Gianelle et al. (2019a, b), specialisation may be a confusing term, for what it really aims at is the diversification of innovation activities to tackle specific challenges; some authors have even referred to it as smart diversification (see, for instance, Boschma and Gianelle, 2013, and Piirainen et al., 2017).

Selecting priorities—specialising in this broad sense—comes from the argument of fostering efficiency. As presented by Miörner et al. (2018), clustering a critical mass of stakeholders and R&I activities will reduce duplication and facilitate the coordination of policies. However, putting this logic into action has not been easy, and results are limited (Van der Broek et al., 2018). Under the S3 framework, the selection of priority domains and the interconnectivity of stakeholders were to create flows of information and synergies that would improve the R&I system. Nevertheless, in many cases, priority domains have been defined too broadly, narrowing the possibilities of generating the industrial diversification that could lead to large improvements (Radosevic, 2017; Sörvik & Kleibrink, 2015). In response to this criticism, Foray (2018b) insists that choices are needed and discusses this need, aligning it with the EDP and with policy experimentation and flexibility.

With this framework in mind, updating the specialisation domains for S3 2.0 provides the opportunity to present them in an inclusive way, fostering the interconnectivity of stakeholders to tackle regional challenges through mission-oriented R&I (Mazzucato, 2018; Foray, 2018a) and integrating the logic of social innovation (Moulaert, 2013). In this direction, two main aspects are to be considered. The first relates to a reassessment of the meaning of the ‘sector’ concept, understanding sectors as transformative, integrating domains with broad space to include emerging activities. The second builds on transnational collaboration, for one of the key aspects of specialisation is the goal of establishing cross-border connections to work together and exchange R&I results.

Challenge 6: Reconsidering and Reshaping the Policy Mix

The concept of ‘policy mix’ was first defined by Robert Mundell in 1962, in the context of monetary and fiscal policies (Mundell, 1962), and it can be defined as the set of policies/policy instruments—and their interaction—deployed to reach strategic goals. The concept evolved in the context of other policies until it reached innovation policy. A good and detailed approach to the contents covered by the logic of the policy mix in innovation can be found, for instance, in ‘The Innovation Policy Mix’ (OECD, 2010). In our context, when we refer to the policy mix, we shall use the definition of Boekholt (2010, p. 353), who states it as “… the combination of policy instruments, which interact to influence the quantity and quality of research and development investments in public and private sectors”. Regarding these policy instruments, works like those of Flanagan et al. (2011) or Cunningham et al. (2016) present a sound overview, as does that of Borrás and Edquist (2013), who state that ‘policy instruments must be designed, redesigned, and adapted through time to the specific problems in the innovation systems’.

Like the strategy itself, the policy mix can be updated to a 2.0 version. In their study, Guzzo et al. (2018) report that only 21% of the 67 surveyed representatives of S3 management bodies in the EU consider the design of the policy mix to be easy, against 40% who say it is difficult or very difficult (the rest being neutral). Accordingly, policy mix design is indeed a challenge to be tackled when updating it, and different dimensions must be considered when doing so (Martin, 2016b). We divide these considerations into four dimensions, each with its respective decisions and related challenges.

(1) Updating old instruments: even if the objectives of the strategy stay the same, policy mixes should be reshaped over time to ensure that they still serve their purpose (Kay, 2007; Van der Heijden, 2011; Taeihagh et al., 2013). Therefore, the instruments from the closing period that are retained can also be updated to make them more efficient, to include new strategic considerations or to modify some management aspects.

(2) New instruments: after the learnings from the period 2014–2020 and the results of the EDP, some new policy instruments may be introduced to tackle new challenges, respond to certain demands or improve or enlarge the R&I ecosystem, from both the supply and demand sides. For instance, on the demand side, introducing public procurement of innovation (PPI) as a new (even if, in some cases, merely updated) instrument can be seen as a good example, for PPI is growing in importance in innovation strategies across many different countries, is a key instrument at EU level (European Commission, 2007; European Commission, 2011; OECD, 2011) and is promoted for S3 (European Commission, 2012). PPI can be understood and classified in different ways. For instance, Edler and Uyarra (2013) distinguish between public organisations buying goods and services not yet available in the market and buying products and services new to the organisation to foster a catalytic effect.

(3) Considering updated priorities: following the EDP, priorities—not just those related to specialisation domains, but also other strategic aspects with a direct impact on the policy mix—may have changed. Therefore, this mix update includes the assessment, consideration and potential inclusion of these changed or reshaped priorities.

(4) Instrument management: this point relates to both updated and new instruments and refers to the need to improve, when necessary, the logistics of the instruments (simplifying the bureaucracy linked to them, making rules clearer, enlarging the participation of stakeholders, etc.).

A relevant aspect directly linked to the policy mix is the funding schemes. Generally, instruments linked to the S3 have been financed in the 2014–2020 period through the ERDF, jointly with co-funding from national and regional schemes and/or directly by the beneficiaries of the approved projects. Although this situation is expected to continue in the next programming period, two variables should be considered: (1) S3 are not necessarily attached only to the ERDF but may include further objectives, instruments and projects funded from other sources; and (2) S3 could make a greater effort to seek synergies with initiatives financed outside the S3 framework, including those from other European funds (such as Horizon 2020/Europe) and national and/or sub-national ones, as presented in Pérez et al. (2014) or Ferry et al. (2016). Considering all these possibilities can be a key factor in deciding to what extent S3 is to be understood as a strategy designed to foster long-run real impact based on the development of the innovation system, or whether it remains a mere condition linked to the European funds.

Including (or Further Developing) Key Dimensions

Challenge 7: Fostering the Industrial Transition

The European Commission is expected to include an item on industrial transition as one of the key points for assessing the S3 agendas of the EU territories under the logic of the ex-ante conditionality of the ERDF R&I thematic objective (Przeor, 2019). The rationale behind it is clear: the EU needs to be more competitive and better prepared for the changes coming from decarbonisation, new digital technologies and, in general terms, globalisation, where new technologies, industry 4.0 and the green economy perspective are rapidly becoming the central point of industrial policy, with innovation as the key element of this process. Therefore, industrial transition should not be understood as the modernisation of traditional sectors like coal or textiles, but rather as a general advance towards the new industrial systems, in broader terms (Vocaskova, 2020).

The transformation and modernisation of traditional economic activities have long been studied; major works like Dertouzos et al. (1989) or Sabel et al. (1987) already discussed their logic and relevance for industrial policy. In recent years, advances in ICT and robotics have led to the integration of interdisciplinary knowledge and technologies (Wang et al., 2016), proving that there is room to upgrade traditional industrial activities into modern and more competitive ones. A social aspect is directly interlinked with this, for it also involves a redefinition of the labour market, in which different skills are needed and tasks become more complex, while leading to greater economic efficiency (Kagermann et al., 2013). Since the role of industry 4.0 will grow in the following years (Schneider, 2018), innovation policies cannot remain external to this process, and they must assess the possibilities it offers to enlarge the outcomes of strategies such as S3.

In this S3 context, industrial transition can be tackled from two (complementary) perspectives: fostering the ecosystem and advancing in specific areas. The former relates to reducing the barriers to this transition, while promoting incentives and programmes aimed at facilitating initiatives in this regard. The latter consists in designing explicit actions to tackle industrial transition through pre-designed objective-based projects, which can also be directed at specific domains, like digitalisation, sustainability, manufacturing 4.0, etc. These domains are clearly receiving increasing attention, from both public and private perspectives, so establishing ad hoc programmes within the S3 policy mix could enlarge the outcomes of industrial transition initiatives, aiming at maximising their impact.

Challenge 8: Enlarging Transnational Collaboration

Transnational collaboration is another ‘novelty’ aspect of the expected requirements of the European Commission when validating the ex-ante conditionality linked to the ERDF innovation, SMEs and ICT thematic objectives (Vocaskova, 2020). Traditionally, these cross-territorial collaborations have been concentrated in just a few specific geographical areas (Hoekman et al., 2009; Maggioni & Uberti, 2009); extending them to all regions and countries is now the goal. However, S3 tend to be based on the region they cover, and they lack a sound orientation towards transregional collaboration (Radosevic & Stancova, 2018). While interregional collaboration on common specialisation domains has not been much analysed in the S3 context (Iacobucci & Guzzini, 2016), Radosevic and Stancova (2018) conclude that there is not much understanding, interest and/or capacity to explore the possibilities coming from transregional cooperation. In the same direction, Uyarra et al. (2018) analyse data from a survey of EU regions and conclude that, although the importance of this transregional collaboration is recognised, there are limits to making it actually happen on a large scale, namely lack of political commitment or difficulties in developing joint instruments.

Introducing cross-border collaborations is based on enlarging the synergies and positive externalities that arise from them, learning from the stakeholders and models in other territories; that is, as stated by Uyarra et al. (2018), avoiding local myopia when performing the EDP without analysing the R&I systems and patterns of other territories. Building transnational innovation ecosystems, understood as the integrative cross-border collaboration of different national or sub-national systems (Chaminade & Nielsen, 2011; Frenkel & Maital, 2014), can lead to developing these synergies. However, we can identify three main challenges linked to these transregional initiatives:

(1) There is (very) limited experience, especially at the regional level, when it comes to transnational innovation initiatives under strategic frameworks such as S3. Most international collaboration programmes are managed by bodies external to these regions (a major example is Horizon 2020 for the period 2014–2020, managed directly by the European Commission). National and, especially, sub-national authorities may have narrow capacities to do so. The support of programmes such as European Territorial Cooperation (also known as Interreg) can be a base to foster these collaborations from a policy-making perspective, enlarging these capacities. Other programmes and initiatives at the macro-regional or multi-regional level could also serve as key examples to learn from (European Commission, 2018a). For instance, the Vanguard Initiative could set a good example; under this collaboration programme, regions from 11 EU member states (plus Scotland) aim at developing long-term projects with stakeholders in their regions linked to their S3 domains of interest or specialisation (Interreg Europe, 2017).

(2) While the 2014–2020 ERDF offered the option of devoting some funding to transnational projects, this possibility has not been much explored. Since funding is limited, territories have more incentives to invest the funds locally. The European Commission plans to allocate ERDF resources to transnational collaboration projects under common S3 priority domains (Vocaskova, 2020). Measures to design and implement these initiatives are therefore to be considered in the updated S3, for which new development models are required.

(3) As stated by Edler and Flanagan (2011), there is also a need for better indicators providing clearer and more objective information on existing and future international collaborations, to ensure efficient policies that improve their number and results. Therefore, and linked to the monitoring procedures discussed in “Challenge 10: Building a Sound Monitoring System”, new mechanisms to assess the actual impact of transnational collaboration instruments and projects should also be in place.

Challenge 9: Ensuring More and Better Communication and Dissemination

Although there is widespread awareness of the need to expand the role of communication and dissemination in relation to innovation policies, strategies for this purpose are still scarce and often insufficient, as civil society in general still has little knowledge of R&I results and of their utility and impact. While researchers often put great effort into explaining and disseminating their results (Priest, 2014), training on how to communicate them to the general public is still weak (Silva & Bultitude, 2009), from graduate students in scientific and technological domains (Leshner, 2007) to most researchers in general. Promoting more and better communication from the R&I provider side is relevant, but finding the right approach to reach the general public can be even more challenging, for we are referring to an audience that does not always understand the concepts or have the right interest (Kahan et al., 2012; Johnson & Hamernik, 2015).

The criteria established by the European Commission in relation to the design of smart specialisation strategies did not explicitly include the introduction of communication and dissemination elements, nor are they included in the available proposals for S3 2.0 (Vocaskova, 2020). Additionally, most now-closing S3 have not introduced these elements as specific strategic elements. However, different authors have recently emphasised the need to foster R&I communication to facilitate the understanding and perceived relevance of these activities among the general public (Fischhoff & Scheufele, 2013; Kahan, 2015).

Should strategy-makers follow this logic and consider communication and dissemination activities as an additional (new or reinforced) element for S3 2.0, they can do so in two parallel ways. The first is to include strategic communication objectives and measures, which could include instruments (hard and/or soft) to foster it. The second would be based on fostering communication and dissemination activities within the funded projects, through formal actions or work packages.

Challenge 10: Building a Sound Monitoring System

In the study mentioned in “Challenge 6: Reconsidering and Reshaping the Policy Mix”, Guzzo et al. (2018) also include information on S3 practitioners’ views on S3 monitoring. Only 20% of the 67 respondents said that they find it easy or very easy, against an overwhelming 54% saying that they found it difficult or very difficult (the rest were neutral). Compared with all other domains in designing and implementing S3, monitoring gathered the largest number of respondents stating that they found it (very) difficult, more than the EDP, governance or any other strategic aspect.

Authors like Arnold (2004), Edler et al. (2008) or Magro and Wilson (2013) offer sound perspectives on monitoring and evaluation schemes for innovation policies, stressing how monitoring innovation must necessarily differ from monitoring public policies in general. When it comes to S3, the chance to implement a new or updated approach for monitoring these policies should be seized. Following Esparza-Masana and Fernández (2019), a sound S3 monitoring system must be based on three key axes: direct outputs, the smart specialisation process and long run impact.

The first axis is based on accountability: it measures the direct results of the instruments and their projects, with clear quantifiable indicators that aim at assessing whether the direct objectives were met (increase in R&I expenditure, new innovative start-ups, high-skilled employment generated, among many others). Extensive academic discussion has been devoted to the selection of the right objective quantitative indicators for S3 (see, for instance, recent examples in Balland et al., 2019; Varga et al., 2020). These indicators, or at least a number of them, are already required to monitor the ERDF, and they are already considered in most strategies, serving accountability purposes (Hanberger, 2011; Magro & Wilson, 2015).

However, as stated by Grillitsch and Asheim (2018), qualitative systems are also needed to understand the whole impact of innovation policy. The first qualitative aspect (which also includes some quantitative variables) refers to the second S3 axis, which monitors the evolution of the specialisation sectors or domains, that is, gathering information that shows how each of these sectors is changing, or what trends it is following. Besides assessing their relevance and the patterns these sectors follow, monitoring the smart specialisation process can include indicators on emerging activities for each sector (or cross-sector), or on the development of the collaborative economy (European Commission, 2018a, b), among others. This axis unavoidably requires the participation of experts in the different sectors, developing technology assessments (Weinberger et al., 2013).

Finally, the third axis refers to the long run impact: the evolution of the innovation ecosystem and the stakeholders’ learning. S3 aims at developing the whole R&I system, not just at implementing some policy instruments; this is why it is crucial to understand the long run results, that is, the positive externalities of the short run policy mix. Largely cited works like those of Jaffe et al. (1993) or Audretsch and Feldman (1996) already aimed at studying innovation spillovers and their role in system transformation (Floc’hlay & Plottu, 1998). The indicators in this axis are not always objective, are not easy to compare over time or across territories, and they require the involvement of stakeholders (Kleibrink et al., 2016). That is, they are based on a dynamic model, with flexible indicators that allow it to be shaped to answer the different questions that will be posed in order to make sound decisions regarding the strategy and its policy mix. When we talk about providing answers, we refer not only to the managing authorities, but also to the agents that are part of the EDP, who must also be able to participate in this system and draw conclusions from the information contained therein; therefore, monitoring must cover a broad spectrum of points covering different variables.

Including these three axes poses relevant challenges, for they imply large efforts, including funding and human capital. However, linking back to the first challenges presented at the beginning of this article, without a sound monitoring system it will be difficult to extract the relevant conclusions needed for a complete evaluation (Leeuw & Furubo, 2008).

Final Remarks

Throughout the previous sections, we have reviewed the major common challenges when updating S3 to a 2.0 version to be implemented in the period 2021–2027. A major point when envisaging the potential for a sound S3 update, following our discussion, is the lack of sustained evidence on the results and impact of the strategy in the closing period. Innovation activities usually require longer periods of development before actual impact analyses can be performed. Nevertheless, managing authorities in charge of designing the new S3 version have room for improvement (when needed) when it comes to pragmatic criteria: enlarging the role of the innovation stakeholders, improving the governance system, updating or changing the policy instruments (including a revision of their functioning), etc. Therefore, linked to our initial two hypotheses, we can speak of a mixed update: it is possible to improve or change some practical aspects linked to the management of the strategy, for there is already evidence on the results, while more strategy-oriented decisions are harder to make, for a longer period of time is needed to assess the actual impact of the 2014–2020 strategy in terms of the change in the innovation ecosystem.

The third domain (introducing, or largely upgrading, aspects including transnational collaborations, industrial transition, new monitoring systems or communication and dissemination tools) offers large potential for the next programming period, a statement we can highlight after the discussion in “Including (or Further Developing) Key Dimensions”. Among these axes, cross-border collaborations are to play a major role, offering large opportunities to achieve the synergies sought by the development of S3 at a European, not just territorial, level. However, the impact coming from these initiatives will depend both on the efforts to plan the right instruments to motivate the stakeholders and on the ability to introduce flexibility mechanisms to adapt to the changes inherent to any innovation project. In this direction, it is important to underline the need for sound monitoring and evaluation mechanisms that can provide information and conclusions on these new or upgraded aspects, while reinforcing the analysis of both the objective and subjective outputs and outcomes of the strategy.

We have discussed these different variables, all directly linked to the strategy’s design and implementation. However, we cannot forget the external environment that can affect its logic in the coming years. Changes in dynamic aspects like research on climate change, social changes or unexpected major challenges (like the crisis generated by the COVID-19 pandemic) may require flexibility to address them and the potential bottlenecks that can appear when undertaking R&I initiatives. When considering all the analysed criteria, we cannot ignore these external ecosystem threats (but also opportunities); when tackling each challenge, they must be considered and addressed.

At the present time, at the end of the 2014–2020 period, and given the limitations coming from the lack of long run evidence on the strategy’s results, as discussed in “Learnings from S3 1.0 into S3 2.0”, S3 updating presents some limitations, but there is room for relevant upgrading of strategic and managerial aspects, as well as for the introduction of new axes. The coming months and years will allow studies to be carried out based on concrete information on the impact of S3, for which data is still insufficient today. This will make it possible to carry out in-depth empirical academic analyses while, at the same time, offering policy makers conclusions based on evidence. For now, managing authorities have before them a wide range of possibilities when updating S3 at this transition point; the level of ambition (and possibilities) in each case may differ. The scope and the consequent impact will depend both on the actual design of the updated strategy and on the economic and social context of the coming years.


  1. Ansell, C., & Torfing, J. (Eds.). (2016). Handbook on theories of governance. Northampton (MA): Edward Elgar Publishing.

  2. Aranguren, M. J., Magro, E., Navarro, M., & Wilson, J. R. (2018). Governance on the territorial entrepreneurial discovery process: Looking under the bonnet of RIS3. Regional Studies, 53(4), 451–461.

  3. Arnold, E. (2004). Evaluating research and innovation policy: A systems world needs systems evaluations. Research Evaluation, 13(1), 3–17.

  4. Asheim, B. T., & Coenen, L. (2005). Knowledge bases and regional innovation systems: Comparing Nordic clusters. Research Policy, 34(8), 1173–1190.

  5. Asheim, B. T., & Grillitsch, M. (2015). Smart specialisation: Sources for new path development in a peripheral manufacturing region. CIRCLE Electronic Working Papers, no. 2015/11. Lund (SE): CIRCLE, Lund University.

  6. Asheim, B. T. (2019). Smart specialisation, innovation policy and regional innovation systems: What about new path development in less innovative regions? Innovation: The European Journal of Social Science Research, 32, 8–25.

  7. Audretsch, D., & Feldman, M. (1996). R&D spillovers and the geography of innovation and production. American Economic Review, 86(3), 630–640.

  8. Autio, E. (1998). Evaluation of RTD in regional systems of innovation. European Planning Studies, 6, 131–140.

  9. Bache, I., Bartle, I., & Flinders, M. (2016). Multi-level governance. In Ansell, C. and Torfing, J. (eds.) Handbook on theories of governance. Northampton (MA): Edward Elgar Publishing.

  10. Balland, P. A., Boscha, R., Crespo, J., & Rigby, D. L. (2019). Smart specialization policy in the European Union: Relatedness, knowledge complexity and regional diversification. Regional Studies, 53(9), 1252–1268.

  11. Boekholt, P. (2010). The evolution of innovation paradigms and their influence on research, technological development and innovation policy instruments. In R. Smits, S. Kuhlmann, P. Shapira (Eds.), The Theory and Practice of Innovation Policy—An International Research Handbook, Edward Elgar

  12. Borrás, S. & Edquist, C. (2013). The choice of innovation policy instruments. CIRCLE Electronic Working Papers, no. 2013/04. Lund (SE): CIRCLE, Lund University.

  13. Borrás, S., & Laatsit, M. (2019). Towards system oriented innovation policy evaluation? Evidence from EU28 member states. Research Policy, 48(1), 312–321.

  14. Boschma, R., & Gianelle, C. (2013). Regional branching and smart specialisation policy. JRC technical reports. Seville: Joint Research Centre, European Commission.

  15. Braczyk, H. J., Cooke, P., & Heidenreich, M. (Eds.). (1998). Regional innovation systems. London: UCL Press.

  16. Capello, R., & Kroll, H. (2016). From theory to practice in smart specialization strategy: Emerging limits and possible future trajectories. European Planning Studies, 24, 1393–1406.

  17. Chaminade, C., & Nielsen, H. (2011). Transnational innovation ecosystems. Santiago de Chile: ECLAC.

  18. Charron, N., Dijkstra, L., & Lapuente, V. (2014). Regional governance matters: Quality of government within European Union member states. Regional Studies, 48(1), 68–90.

  19. Cooke, P. (2003). Strategies for Regional Innovation Systems: Learning transfer and applications. Policy Papers Series. Vienna: United Nations Industrial Development Organisation.

  20. Cooke, P., Boekholt, P., & Tödtling, F. (2000). The governance of innovation in Europe. London: Pinter.

  21. Cooke, P., Heidenreich, M., & Braczyk, H. J. (2004). Regional innovation systems: The role of governance in a globalised world. London: Routledge.

  22. Cooke, P., & Morgan, K. (1998). The associational economy: Firms, regions, and innovation. Oxford: Oxford University Press.

  23. Cunningham, P., Edler, J., Flanagan, K., & Larédo, P. (2016). The innovation policy mix. In Edler, J., Cunningham, P., Gök, A., and Shapira, P. (eds.) Handbook of innovation policy impact. Cheltenham: Edward Elgar.

  24. Davoudi, S., Evans, N., Governa, F., & Santangelo, M. (2008). Territorial governance in the making: Approaches, methodologies, practices. Boletín de la Asociación de Geógrafos Españoles, 46, 33–52.

  25. Del Castillo Hermosa, J., Paton Elorduy, J., & Barroeta Eguía, B. (2015). Smart specialization and entrepreneurial discovery: Theory and reality. Revista Portuguesa de Estudos Regionais, 39, 5–22.

  26. Dertouzos, M., Lester, R., & Solow, R. (1989). Made in America: Regaining the productive edge. Cambridge: MIT Press.

  27. Dosso, M., Martin, B. R., & Moncada-Paternò-Castello, P. (2018). Towards evidence-based industrial research and innovation policy. Science and Public Policy, 45(2), 143–150.

  28. Edler, J., Ebersberger, B., & Lo, V. (2008). Improving policy understanding by means of secondary analyses of policy evaluation. Research Evaluation, 17(3), 175–186.

  29. Edler, J., & Flanagan, K. (2011). Indicator needs for the internationalisation of science policies. Research Evaluation, 20(1), 7–17.

  30. Edler, J. & Uyarra, E. (2013). Public procurement and innovation. In Osborne, S.P. & Brown, L. (eds.) The handbook of innovation in public services. Cheltenham: Edward Elgar.

  31. Edquist, C. (2005). Systems of Innovation – Perspectives and challenges. In: Fagerberg, J., Mowery, D., & Nelson, R. (Eds.) The Oxford handbook of innovation. Oxford: Oxford University Press.

  32. Esparza-Masana, R. (2015). Analysis of the specialization patterns for scientific and industrial activities in the EU regions in the framework of the smart specialization strategy. Riga: Scholar’s Press.

  33. Esparza-Masana, R., & Fernández, T. (2019). Monitoring S3: Key dimensions and implications. Evaluation and Program Planning, 77.

  34. European Commission (2007). A lead market initiative for Europe. Brussels: European Commission.

  35. European Commission (2011). Green paper on the modernisation of EU public procurement policy: Towards a more efficient European procurement market. Brussels: European Commission.

  36. European Commission (2012). Guide to research and innovation strategies for smart specialisation. Luxembourg: Publications Office of the European Union.

  37. European Commission (2018a). Study on macro-regional strategies and their links with Cohesion Policy. Luxembourg: Publications office of the European Union.

  38. European Commission (2018b). Study to monitor the economic development of the collaborative economy at sector level in the 28 EU member states. Luxembourg: Publications office of the European Union.

  39. Ewens, H., & Van der Voet, J. (2019). Organizational complexity and participatory innovation: participatory budgeting in local government. Public Management Review, 21(12), 1848–1866.

  40. Farole, T., Rodríguez-Pose, A., & Storper, M. (2011). Human geography and institutions that underline economic growth. Progress in Human Geography, 35(1), 58–80.

  41. Fellnhofer, K. (2017). Facilitating entrepreneurial discovery in smart specialization via stakeholder participation with online mechanisms for knowledge-based policy advice. Cogent Business and Management, 4, 1296802.

  42. Fellnhofer, K. (2018). Visualised bibliometric mapping on smart specialisation: A co-citation analysis. International Journal of Knowledge-Based Development, 9(1), 76–99.

  43. Ferry, M., Kah, S., & Bachtler, J. (2016). Maximisation of synergies between European Structural and Investment Funds and other EU instruments to attain Europe 2020. Study of the Policy Department Structural and Cohesion Policies. Brussels: European Parliament.

  44. Fischhoff, B., & Scheufele, D. A. (2013). The science of science communication. Proceedings of the National Academy of Sciences, 110(Supplement 3), 14031–14032.

  45. Flanagan, K., Uyarra, E., & Laranja, M. (2011). Reconceptualising the ‘policy mix’ for innovation. Research Policy, 40(5), 702–713.

  46. Floc’hlay, B., & Plottu, E. (1998). Democratic evaluation: From empowerment evaluation to public decision-making. Evaluation, 4(3), 261–277.

  47. Foray, D., David, P.A., & Hall, B. (2009). Smart specialisation – The concept. Knowledge Economists Policy Briefs, n. 9.

  48. Foray, D. (2014). From smart specialisation to smart specialisation policy. European Journal of Innovation Management, 17(4), 492–507.

  49. Foray, D. (2015). Smart specialisation: Opportunities and challenges for regional innovation policies. Abingdon: Routledge – Regional Studies Association.

  50. Foray, D. (2018a). Smart specialisation strategies and industrial modernisation in European regions – theory and practice. Cambridge Journal of Economics, 42(6), 1505–1520.

  51. Foray, D. (2018b). Smart specialisation strategies as a case of mission-oriented policy – a case study on the emergence of new policy practices. Industrial and Corporate Change, 27, 817–832.

  52. Frenkel, A., & Maital, S. (2014). Mapping national innovation ecosystems: Foundations for policy consensus. Cheltenham: Edward Elgar Publishing.

  53. Geissel, B. (2009). How to improve the quality of democracy? Experiences with participatory innovations at the local level in Germany. German Politics and Society, 27(4), 51–71.

  54. Gheorghiu, R., Andreescu, L., & Curaj, A. (2016). A foresight toolkit for smart specialization and entrepreneurial discovery. Futures, 80, 33–44.

  55. Ghazinoory, S., Amiri, M., Ghazinoori, S., & Parisa, A. (2019). Economics of Innovation and New Technology, 28(4), 365–385.

  56. Gianelle, C., Guzzo, F. & Marinelli E. (2019). Smart specialisation evaluation: Setting the scene. JRC Policy Insights, JRC116110. Seville: Joint Research Centre – European Commission.

  57. Gianelle, C., Guzzo, F. & Mieskowski, K. (2019). Smart specialisation: what gets lost in translation from concept to practice? Regional Studies, forthcoming.

  58. Gil, N. & Pinto, J. (2016). The complexity of pluralism: Designing collective action in megaprojects. On-line publication.

  59. Grillitsch, M., & Asheim, B. (2018). Place-based innovation policy for industrial diversification in regions. European Planning Studies, 26, 1638–1662.

  60. Guzzo, F., Gianelle, C., & Marinelli, E. (2018). Smart specialisation at work: The policy makers’ view on strategy design and implementation. S3 Working Paper Series, n. 15/2018. Seville: Joint Research Centre (European Commission).

  61. Hanberger, A. (2011). The real functions of evaluation and response systems. Evaluation, 17(4), 327–349.

  62. Hassink, R., & Gong, H. (2019). Six critical questions about smart specialization. European Planning Studies, 27(10), 2049–2065.

  63. Hoekman, J., Frenken, K., & Van Oort, F. (2009). The geography of collaborative knowledge production in Europe. Annals of Regional Science, 43, 721–738.

  64. Hong, S. (2015). Citizen participation in budgeting: A trade-off between knowledge and inclusiveness? Public Administration Review, 75(4), 572–582.

  65. Hooghe, L. & Marks, G. (2001). Types of multi-level governance. European Integration Online Papers, 5(11).

  66. Iacobucci, D., & Guzzini, E. (2016). Relatedness and connectivity in technological domains: Missing links in S3 design and implementation. European Planning Studies, 24, 1511–1526.

  67. Interreg Europe (2017). The ins and outs of the Vanguard Initiative. Policy Briefs from the Policy Learning Platform on research and innovation. Lille: Interreg Europe Joint Secretariat.

  68. Isaksen, A., Kyllingstad, N., Rypestøl, J. O., & Schulze-Krogh, A. C. (2018). Differentiated regional entrepreneurial discovery process: A conceptual and empirical illustration from three emerging clusters. European Planning Studies, 26(11), 2200–2215.

  69. Isaksen, A., Martin, R., & Trippl, M. (2018). New avenues for regional innovation systems: Theoretical advances, empirical cases and policy lessons. Cham: Springer.

  70. Jakobsen, M.L.F. & Thrane, C. (2016). Public innovation and organisational structure: Searching (in vain) for the optimal design. In Torfing, J. and Triantafillou, P. (eds.) Enhancing public innovation by transforming public governance. Cambridge: Cambridge University Press.

  71. Jaffe, A., Trajtenberg, M., & Henderson, R. (1993). Geographic localization of knowledge spillovers as evidenced by patent citations. Quarterly Journal of Economics, 108(3), 577–598.

  72. Johnson, J. A., & Hamernik, D. L. (2015). Science and society: Challenges of communicating science. Animal Frontiers, 5(3), 6–12.

  73. Kagermann, H., Wahlster, W., & Helbig, J. (2013). Recommendations for implementing the strategic initiative Industrie 4.0: Securing the future of German manufacturing industry. Report from the Industry 4.0 Working Group. Munich: National Academy of Science and Engineering

  74. Kahan, D. M., Peters, E., Wittlin, M., Slovic, P., Ouellette, L. L., Braman, D., & Mandel, G. (2012). The polarizing impact of science literacy and numeracy on perceived climate change risks. Nature Climate Change, 2, 732–735.

  75. Kahan, D. M. (2015). What is the “science of science communication”? Journal of Science Communication, 14(3), 1–10.

  76. Kay, A. (2007). Tense layering and synthetic policy paradigms: The politics of health insurance in Australia. Australian Journal of Political Science, 42(4), 391–408.

  77. Kleibrink, A., Gianelle, C., & Doussineau, M. (2016). Monitoring innovation and territorial development in Europe: Emergent strategic management. European Planning Studies, 24(8), 1438–1458.

  78. Kleibrink, A., Larédo, P., & Philipp, S. (2017). Promoting innovation in transition counties: A trajectory for smart specialisation. JRC Science for Policy Report. Seville: European Commission.

  79. Kroll, H. (2015). Efforts to implement smart specialisation in practice: Leading unlike horses to the water. European Planning Studies, 23(10), 2079–2098.

  80. Landabaso, M. (1997). The promotion of innovation in regional policy: Proposal for a regional innovation strategy. Entrepreneurship and Regional Development, 9, 1–24.

  81. Laranja, M., Uyarra, E., & Flanagan, K. (2008). Policies for science, technology and innovation: translating rationales into regional policies in a multi-level setting. Research Policy, 37(5), 823–835.

  82. Leeuw, F., & Furubo, J. E. (2008). Evaluation systems: What are they and why study them? Evaluation, 14(2), 157–169.

  83. Leshner, A. I. (2007). Outreach training needed. Science, 315(5809), 161.

  84. Love, J. H., & Roper, S. (2015). SME innovation, exporting and growth: A review of existing evidence. International Small Business Journal, 33(1), 28–48.

  85. Maggioni, M. A., & Uberti, T. E. (2009). Knowledge networkers across Europe: Which distance matters? Annals of Regional Science, 43, 691–720.

    Article  Google Scholar 

  86. Magro, E., & Wilson, J. R. (2013). Complex innovation policy systems: Towards an evaluation mix. Research Policy, 42(9), 1647–1656.

    Article  Google Scholar 

  87. Magro, E. & Wilson, J.R. (2015). Evaluating territorial strategies. In Valdaliso, J.M., and Wilson J.R. (eds.) Strategies for shaping territorial competitiveness. Oxon: Routledge.

  88. Magro, E. & Wilson, J.R. (2019). Policy-mix evaluation: Governance challenges from new place-based innovation policies. Research Policy, 48(10).

  89. Manchester Institute of Innovation Research. (2013). The compendium of evidence on innovation policy. Manchester: Manchester Institute of Innovation Research.

    Google Scholar 

  90. Martin, R., & Sunley, P. (2003). Deconstructing clusters: Chaotic concept or policy panacea? Journal of Economic Geography, 3, 5–35.

    Article  Google Scholar 

  91. Martin, B. R. (2016). Twenty challenges for innovation studies. Science and Public Policy, 43(3), 432–450.

    Article  Google Scholar 

  92. Martin, R. B. (2016). R&D policy instruments: A critical review of what we do and what we don’t. Industry and innovation, 23(2), 157–176.

    Article  Google Scholar 

  93. Mazzucato, M. (2018). Mission-oriented innovation policies: Challenges and opportunities. Industrial and Corporate Change, 27, 803–815.

  94. McCann, P., & Ortega-Argilés, R. (2013). Transforming European regional policy: A results-driven agenda and smart specialisation. Oxford Review of Economic Policy, 29, 405–431.

  95. Miörner, J., Zukauskaite, E., Trippl, M., & Moodysson, J. (2018). Creating institutional preconditions for knowledge flows in cross-border regions. Environment and Planning C: Politics and Space, 36(2), 201–218.

  96. Morgan, K. (2015). Smart specialisation: Opportunities and challenges for regional innovation policy. Regional Studies, 49(3), 480–482.

  97. Moulaert, F. (2013). The international handbook on social innovation: Collective action, social learning and transdisciplinary research. Cheltenham: Edward Elgar.

  98. Mundell, R. (1962). The appropriate use of monetary and fiscal policy for internal and external stability. Staff Papers – International Monetary Fund, 9(1), 70.

  99. Muscio, A., Reid, A., & Rivera Leon, L. (2015). An empirical test of the regional innovation paradox: Can smart specialisation overcome the paradox in Central and Eastern Europe? Journal of Economic Policy Reform, 18, 153–171.

  100. OECD (2010). The innovation policy mix. In: OECD Science, Technology and Industry Outlook 2010. Paris: OECD.

  101. OECD. (2011). Demand side innovation policy. Paris: OECD.

  102. Papaconstantinou, G., & Polt, W. (1998). Policy evaluation and technology: An overview. In: OECD proceedings: Policy Evaluation in Innovation and Technology: Towards Best Practices. Washington DC: OECD Washington Centre.

  103. Papamichail, G., Rosiello, A., & Wield, D. (2019). Capacity-building barriers to S3 implementation: An empirical framework for catch-up regions. Innovation: The European Journal of Social Science Research, 32, 66–84.

  104. Pérez, S., Conte, A., & Harrap, N. (2014). Synergies between EU R&I funding programmes: Policy suggestions from the launching event of the Stairway to Excellence project. S3 Policy Brief Series, n. 12/2014.

  105. Peters, B. G., & Borrás, S. (2009). Governance and European integration. In: Egan, M., Nugent, N., & Peterson, W. (eds.), Studying the European Union: Current and future agendas. London: MacMillan.

  106. Piirainen, K. A., Tanner, A. N., & Alkærsig, L. (2017). Regional foresight and dynamics of smart specialisation: A typology of regional diversification patterns. Technological Forecasting and Social Change, 115, 289–300.

  107. Priest, S. (2014). Critical sense of science. Bulletin of Science, Technology and Society, 33(5–6), 138–145.

  108. Przeor, M. (2019). Smart specialisation: Monitoring and evaluation. Presentation in Smart specialisation: monitoring and evaluation; state of the play and next steps. Brussels (BE), 24 January 2019.

  109. Radosevic, S. (2017). Assessing EU smart specialisation policy in a comparative perspective. In Radosevic, S., Curaj, A., Gheorghiu, R., Andreescu, L., & Wade, I. (eds.), Advances in the theory and practice of smart specialisation. London: Academic Press.

  110. Radosevic, S., & Stancova, K. C. (2018). Internationalising smart specialisation: Assessment and issues in the case of EU new member states. Journal of Knowledge Economy, 9(1), 263–293.

  111. Rip, A. (2002). Regional innovation systems and the advent of strategic science. The Journal of Technology Transfer, 27(1), 123–131.

  112. Risse, T. (2012). Governance configurations in areas of limited statehood: Actors, models, institutions and resources. SFB-Governance Working Papers, n. 32. Berlin: Free University of Berlin.

  113. Rodríguez-Pose, A. (2013). Do institutions matter for regional development? Regional Studies, 47(7), 1034–1047.

  114. Rodríguez-Pose, A., di Cataldo, M., & Rainoldi, A. (2015). The role of government institutions for smart specialisation and regional development. S3 Policy Briefs, n. 4. Seville: Joint Research Centre – European Commission.

  115. Rodríguez-Pose, A., & Wilkie, C. (2015). Institutions and the entrepreneurial discovery process for smart specialisation. Papers in Evolutionary Economic Geography, n. 15.23. Utrecht: Utrecht University.

  116. Rodrik, D. (2004). Industrial policy for the twenty-first century. Kennedy School of Government Working Papers, n. RWP04-047. Boston: Harvard University.

  117. Sabel, C., Herrigel, G., Kazis, R., & Deeg, R. (1987). How to keep mature industries innovative. Technology Review, April, 27–35.

  118. Sackett, D. L., Rosenberg, W. M., Gray, J. A., Haynes, R. B., & Richardson, W. S. (1996). Evidence-based medicine: What it is and what it isn’t. British Medical Journal, 312, 71–72.

  119. Schneider, P. (2018). Managerial challenges of industry 4.0: An empirically backed research agenda for a nascent field. Review of Managerial Science, 12, 803–848.

  120. Silva, J., & Bultitude, K. (2009). Best practice in communications training for public engagement with science, technology, engineering and mathematics. Journal of Science Communication, 8(2), 1–13.

  121. Sotarauta, M. (2018). Smart specialisation and place leadership: Dreaming about shared visions, falling into policy traps? Regional Studies, Regional Science, 5, 190–203.

  122. Sörvik, J., & Kleibrink, A. (2015). Mapping innovation priorities and specialisation patterns in Europe. Brussels: European Commission.

  123. Taeihagh, A., Givoni, M., & Bañares-Alcántara, R. (2013). Which policy first? A network-centric approach for the analysis and ranking of policy measures. Environment and Planning B: Planning and Design, 40(4), 595–616.

  124. Todeva, E., & Ketikidis, P. (2017). Regional entrepreneurship and innovation management: Actors, helices, and consensus space. Management Dynamics in the Knowledge Economy, 5(1), 57–76.

  125. Tödtling, F., & Trippl, M. (2005). One size fits all? Towards a differentiated regional innovation policy approach. Research Policy, 34, 1203–1219.

  126. Torfing, J. (2018). Collaborative innovation in the public sector: The argument. Public Management Review, 21(1), 1–11.

  127. Trippl, M., Zukauskaite, E., & Healy, A. (2019). Shaping smart specialization: The role of place-specific factors in advanced, intermediate and less-developed European regions. Regional Studies, forthcoming.

  128. Uyarra, E., Marzocchi, C., & Sorvik, J. (2018). How outward looking is smart specialisation? Rationales, drivers and barriers. European Planning Studies, 26(12), 2344–2363.

  129. Van den Broek, J., Benneworth, P., & Rutten, R. (2018). Border blocking effects in collaborative firm innovation. European Planning Studies, 26, 1330–1346.

  130. Van der Heijden, J. (2011). Institutional layering: A review of the use of the concept. Politics, 31(1), 9–18.

  131. Varga, A., Sebestyén, T., Szabó, N., & Szerb, L. (2020). Estimating the economic impacts of knowledge network and entrepreneurship development in smart specialization policy. Regional Studies, 54(1), 48–59.

  132. Vocaskova, J. (2020). Smart specialisation pilot actions and interregional cooperation 2021–2027. Presentation in webinar for Interreg Central Europe SMART_Watch project, 28 April 2020.

  133. Voorberg, W. H., Bekkers, V. J. J. M., & Tummers, L. G. (2015). A systematic review of co-creation and co-production: Embarking on the social innovation journey. Public Management Review, 17(9), 1333–1357.

  134. Wang, S., Wan, J., Li, D., & Zhang, C. (2016). Implementing smart factory of industry 4.0: An outlook. International Journal of Distributed Sensor Networks, 2016, 1–10.

  135. Wegrich, K. (2019). The blind spots of collaborative innovation. Public Management Review, 21(1), 12–20.

  136. Weinberger, N., Decker, M., Fleischer, T., & Schippl, J. (2013). A new monitoring process of future topics of innovation and technological analysis: Informing Germany’s innovation policy. European Journal of Futures Research, 1(1), 1–9.

Author information



Corresponding author

Correspondence to Ricard Esparza-Masana.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Esparza-Masana, R. Towards Smart Specialisation 2.0. Main Challenges When Updating Strategies. J Knowl Econ (2021).

Keywords


  • Smart specialisation strategies
  • Innovation policy
  • European regional funds

JEL codes

  • O38
  • R58