Abstract
This chapter highlights the near absence of research into the nonacademic impact of ICT4D research within the ICT4D literature. It draws on studies in international development to review the literature on the impact of research on development policy and practice and reflects on the implications for ICT4D research. Noting the cultural and professional differences between researchers and practitioners, as well as their differing perspectives on impact, it goes on to describe the dominant themes in the literature. ICT4D research is characterised as lacking in certain respects that tend to inhibit its capacity for policy impact, and even once these shortcomings are overcome, further adjustments to research conduct and culture will be needed for such impact to emerge. Consequential recommendations include revised incentive structures for academic institutions as well as closer engagement between researchers and practitioners.
Keywords
- International Development
- Performance Measurement System
- Policy Community
- Knowledge Broker
- Policy Influence
1 Introduction
The first phase of the SIRCA programme categorised the impact of research into academic impact and socio-economic impact, with the latter comprising impacts on socio-economic benefits, policy and capacity building. It was argued, moreover, that achieving academic impact does not automatically lead to achieving socio-economic impact. To date, it is evident that information society impact research in the global south has focused almost exclusively on the impact of ICTs, largely ignoring the socio-economic impact of the research itself. Where the impact of research has been addressed, the discussion has centred almost entirely on academic impact, with debates about where to publish research findings and how to maximise the citations that such publications receive. However, policy research programmes do not normally use traditional academic citations in peer-reviewed journals as a principal monitoring and evaluation tool. While there is considerable reflection on the socio-economic impact of ICTs, there is a paucity of research on the socio-economic impact of the research itself. What does all this say about the value of information society research in the global south? That its main purpose is to support academic careers?
In SIRCA I, we pointed out that achieving socio-economic impacts from research required different skills, roles and processes from those that are used to achieve academic impact. Given the near absence of research into the nonacademic impact of ICT4D research within the ICT4D literature, we need to look elsewhere for a better understanding of these skills, roles and processes in order to establish some sense of how they might operate effectively for achieving socio-economic impact with information society research.
The related but wider field of international development has shown greater interest in the impact of research, with governments and major agencies calling for a clearer articulation of socio-economic benefits from the research that they fund. There are also calls for social policy formulations to be based more on evidence, and this implies a heightened role for the research that will be capable of delivering such evidence. For example, the UK government, which invests around £3 billion annually in research, requires funding applicants to demonstrate the contribution of their research to society and the economy. The UK’s Department for International Development (DFID) has stated that without a greater focus on getting research into use, the potential for improving lives through research and innovation will not be fully realised.
An examination of the impact of information society research in the global south therefore needs to address all dimensions of impact, and in the light of the foregoing, this should feature an assessment of its impact on policy and practice. In this chapter, we review the literature on the impact of development research on policy and practice and reflect on the implications for ICT4D research, drawing on the SIRCA programme experiences. We begin with some important observed differences between the two worlds of research and policymaking that have emerged as consistent themes in the literature and which give shape to much of the advisory outcomes of the review. This is followed by a summary of the other themes. Finally, we present an assessment of what they might mean for ICT4D research and for the SIRCA programme.
2 Opposing Perspectives
2.1 Two Communities
The two worlds of academic research and policy formulation are characterised in the literature as being very different. Some observers comment that researchers, practitioners and policymakers live in parallel universes (Court and Maxwell 2005; Court and Young 2006; Stone 2009). Stone (2009) argues that researchers and policymakers operate with different values, languages, timeframes, reward systems and professional ties to such an extent that they live in separate worlds. Moreover, for some, researchers cannot understand why there is resistance to policy change despite clear and convincing evidence, while policymakers bemoan the inability of many researchers to make their findings accessible and digestible in time for policy decisions (Court and Young 2006).
According to Greijn (2008), researchers often live in very separate worlds from policymakers, civil society organisations and practitioners. As a result, research-based evidence is often only a minor factor when policies for development are formulated and practices shaped. Too often new public policies are rolled out nationally with little trialling or evaluation. In effect, governments experiment on the whole population at once. Even where there is plenty of evidence, there may be a failure to ensure that the evidence being collected and analysed is made relevant to the needs of decision-makers and is acted upon (Mulgan and Puttick 2013). Additionally, as Datta (2012) suggests, researchers in any one field tend not to speak with one voice, and not all researchers see policy engagement as part of their role. Shanley and López (2009) go further by claiming that strong organisational disincentives dissuade researchers from engaging in outreach beyond the scientific community. Others indicate that researchers working in universities and other publicly funded institutions report structural barriers to engaging in knowledge translation activities, suggesting that a failure to transfer knowledge has been attributed to the “two communities” problem—an explanation that points to cultural differences between researchers and users as barriers to such engagement (Jacobson et al. 2004). As a result, says Carden (2009), policymakers lack confidence in their own researchers.
Despite these misgivings relating to the prospects for development research having an influence over development practice and policymaking, there is room for optimism. As de Vibe et al. (2002) put it, notwithstanding the assumption that there is a clear divide between researchers and policymakers (the two communities model), which underpins the traditional view of the link between research and policy, literature on the research-policy link is now shifting away from these assumptions, towards a more dynamic and complex view that emphasises a two-way process between research and policy, shaped by multiple relations and reservoirs of knowledge. Accordingly, much of what emerges from the literature review presented here, by way of recommendations, argues the case for bringing these two communities closer together and offers prescriptions for overcoming the perceived gaps between them.
2.2 What Is Impact?
An understanding of the impact of research on policy and practice requires agreement on what “impact” means for researchers and for policymakers. For academics, the impact of their research is usually reflected by the impact factor that is assigned to the journal in which the research report is published. The impact factor of an academic journal is a measure of the average number of citations that have been made to its recently published articles. It is frequently used as a proxy for the relative importance of a journal within its field; journals with higher impact factors are considered to be more important (influential) than those with a lower impact factor. Thomson Reuters, which compiles the Journal Citation Reports, computes the impact factor of a journal by dividing the number of current-year citations to items published in that journal during the previous two years by the number of citable items published in those same two years.Footnote 1
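A minimal sketch of this two-year calculation, using purely hypothetical figures; the function and variable names below are illustrative only and are not drawn from any citation database's own tooling:

```python
def impact_factor(citations_this_year: int, items_prev_two_years: int) -> float:
    """Two-year journal impact factor: citations received this year to items
    published in the previous two years, divided by the number of citable
    items published in those two years."""
    return citations_this_year / items_prev_two_years


# Hypothetical journal: 120 citable items published in 2013-2014,
# cited 210 times during 2015, giving a 2015 impact factor of 1.75.
print(impact_factor(210, 120))
```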
In contrast to the academic perspective on research impact, practitioners hold a very different view. For example, Young (2008) claims that for research to have any impact, the results must inform and shape policies and programmes and be adopted into practice. Researchers wishing to maximise the impact of their work have to attract the interest of policymakers and practitioners, convince them that a new policy or different approach is valuable and then foster the behavioural changes that are necessary to put it into practice (Young 2008). For Sumner et al. (2009), impact is multilayered and refers to use (i.e. consideration) or actual outcome(s) of social change. It can be visible or invisible, progressive or regressive, intended or unintended and immediate or long term. Research Councils UK acknowledges academic impact—as the demonstrable contribution that excellent research makes to academic advances—but it also emphasises the need for economic and societal impacts as the demonstrable contribution that excellent research makes to society and the economy by, among other things, increasing the effectiveness of public services and policy.
The contrast between these interpretations of what is meant by impact has serious implications for the discourse surrounding the research-policy nexus. According to Shanley and López (2009), appropriation of the word “impact” to designate a journal’s ranking constitutes a potential misrepresentation of what impact is. The effect of this can be seen from their survey of 268 researchers in 29 countries, which revealed that the largest share (34 %) ranked scientists as the most important audience for their work and that engagement with the media and the production of training materials, educational materials and popular publications as outlets for scientific findings were perceived as inconsequential in measuring scientific performance. They conclude that, directly and inadvertently, academic and nonacademic research institutions discourage impact-oriented research by prioritising the number and frequency of publications in peer-reviewed journals (Shanley and López 2009).
Chief among the barriers between research and policy impact is the reward and incentive system of the academy, i.e. promotion and tenure. This is seen as a system that, in general, continues to value traditional types of within-group activity, e.g. publication in peer-reviewed journals, presentations at disciplinary conferences and receipt of research grants from government agencies, over the more broadly directed outreach and production activities associated with the transfer of knowledge. While the importance of knowledge transfer may be endorsed in rhetoric, the rewards, resources and priorities reflect the enduring value accorded to the more traditional academic activities. In many disciplines, knowledge transfer—the exchange, synthesis and application of knowledge—is noted to pose risks to an academic career. This is because the activities that make up much of the work of knowledge transfer are not widely accepted as legitimate forms of scholarship (Jacobson et al. 2004).
Gendron (2008) goes even further in developing a critique of the excessive spread of performance measurement practices in academia, whereby productivity is measured through performance indicators predicated on hard data such as grants, citations and the number of publications. This has given rise to an identity representation of academics as performers. Journal rankings and performance measurement schemes are becoming increasingly influential within many fields of research, thereby consolidating the prevalence of performativity on the life and research endeavours of many academics. The influence of journal rankings leads to researchers being assessed on the basis of their “hits” instead of on the substance of their work. Thus, the mania surrounding the practice of performance measurement stifles innovation while engendering and/or reinforcing pressures of superficiality and conformity (Gendron 2008).
Perhaps as a consequence of these shortcomings in peer review and the academic reward system, the world of policy research regards the mechanisms of academic peer review and conventional citation counting as too limited. Although rankings and rating systems applying to both journals and individual academics are acknowledged to provide a useful proxy guide to the quality of a research study, the validity of such rankings for such purposes is subject to considerable debate (DFID 2013a, b, c). Moreover, not all well-designed and robustly applied research is to be found in peer-reviewed journals, and not all studies in peer-reviewed journals are of high quality. Journal rankings do not always include publications from southern academic organisations or in online journals. Accordingly, policy research programmes will not usually use conventional academic citations in peer-reviewed journals as a primary monitoring and evaluation tool (Hovland 2007). Potentially, this robs the policy arena of a hugely valuable resource because, as Shanley and López (2009) put it, until communication and impact are seriously integrated into (academic) performance measurement systems, it is likely that only a limited number of independently motivated scientists will engage in the time-consuming processes needed to disseminate research effectively.
Despite the foregoing observations, there is again room for optimism when contemplating the possibility of stronger links between research academics and policymakers. Firstly, the two perspectives of impact are not mutually exclusive; research that is highly regarded in peer-review processes and published in high-ranking journals retains its potential for influencing policy. Indeed, high-quality research is a prerequisite for policy influence, although its publication alone seems insufficient for it to do so. This is despite the observation that some actors assume research communication to be an unnecessary add-on, or a dispensable luxury (Harvey et al. 2012). There is also a less polarised perspective in which research use exists on a continuum, ranging from the more conceptual purposes of raising awareness and increasing understanding and knowledge at one end to the more instrumental uses, such as changes to policy and practice, at the other (Nutley et al. 2007).
Secondly, pressures on higher education funding mean that academics are increasingly being asked to demonstrate the public benefit of their work. For example, the UK government’s 2014 Research Excellence Framework will for the first time explicitly assess the impact of research beyond academia. The framework defines impact as “any effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia”. Submissions for funding will be assessed under this category through impact case studies and details of the strategy for achieving impact.Footnote 2 A turn to nonacademic impact has the potential, therefore, of encouraging academics to engage more closely with the wider processes of social transformation. Attributing value to this type of impact is certainly intended to change research culture (Williams 2012). In the development field, the UK’s DFID has made it clear that DFID-funded research programmes are expected to plan and implement a research uptake strategy and that research uptake strategies should encompass stakeholder engagement, capacity building, communication and monitoring and evaluation (DFID 2013a, b, c).
In addition to funding incentives, theoretical advancements in communication for development now favour a move from a top-down to a more inclusive communication style, yet the former trickle-down and transfer paradigms continue to guide and dominate the behaviours of academics (Shanley and López 2009). However, the use of social knowledge as a resource for policymaking has become a means to mobilise researchers and policymakers in new political alliances, over and above the old ideological and partisan differences that have separated academia from engagement with practice (Fisher and Holland 2003). Nevertheless, within both worlds of academia and policy, there is still a lack of clarity or consensus on the meanings of research impact or influence, and researchers have very different ideas about who they are trying to influence, to what end and using which methods (Harvey et al. 2012).
3 Thematic Overview of the Literature
In this section, we provide a brief analysis of the literature that highlights the major interlocking themes that have evolved in relation to the impact of research on development practice and policy formulation. In order to retain a contemporary perspective, the referenced works are mostly from this century, with the exception of the seminal work of the late Professor Carol Weiss, whose observations underpin much of the rest of the literature. The compilation is dominated by grey literature from the practitioner and policy advisory domains.Footnote 3 We can only speculate as to why this is so, but it could reflect greater concern on the part of practitioners, policymakers and their advisors regarding the role of research within their deliberations than exists within the research community. Such a concern would resonate with some of the observations that have emerged regarding the research-policy nexus.
A further aspect is that policy and practice are largely conflated into the same thing; arguably, as de Vibe et al. (2002) put it, the practical recommendations of NGOs mirror to a large extent the macro policy discourse in areas such as building local institutions, supporting civil society and strengthening social capital. The new orthodoxy within development, they argue, which has participation and empowerment as its mantras, is shared not only among NGO practitioners but also among bilateral and multilateral donors and governments.
3.1 Intent
Among the conditions required for research to have an influence on policy and practice, several observers emphasise the need for researchers to have the intent for it to do so. According to Sen (2005), research has the highest likely impact on development outcomes when there is a clear demand from research users and an effective supply of high-quality policy-relevant research, backed by the intent to influence among researchers. Even if such intent exists, though, in the absence of other conditions, such as leadership and capacity within the user community, the impact of high-quality policy-relevant research will be limited. Accordingly, although there are growing expectations within development that research should inform policy (Wheeler 2007), intent to influence is a necessary but insufficient supply-side factor in determining the development effectiveness of research.
Carden (2009) highlights three principles behind the design of a research programme that may allow for maximum impact: the intent to influence, the creation of networks and effective communications. Intent to influence must be expressly included among the research objectives. Other essential elements of policy influence for development research are proposed by O’Neill (2005); they include intent, understood as the determination among researchers to do their work and report their results so as to inform policy decisions and improve policy outcomes.
3.2 Communication
Communication is by far the most cited factor in the literature on the impact of research on development policy and practice. The topic of communication for development is a subfield within international development studies, and it encompasses research communication, among other forms of communication.Footnote 4 The various forms of communication, who is involved in the communication process and when it occurs, are all themes that pervade the literature and intermingle with the other themes. For example, alongside the intent of a researcher to influence policy and practice, Ryan and Garrett (2003) and Sen (2005) stress the clear intent to communicate research as a supply-side factor for influence.
The need for communication between researchers and others is repeatedly stressed, sometimes implying an additional need for intermediaries to ensure it is done effectively. Newman et al. (2012) point to the recent interest in supporting evidence-informed policymaking in developing countries through building the capacity of researchers and research intermediaries to supply appropriately packaged research information (e.g. in the form of policy briefs) to policymakers. Court and Maxwell (2005) claim that successful evidence-based policymaking occurs when, among other things, the links are well made between researchers and policymakers, for example, through networks or by intermediaries.
The role of networks also emerges as a persistent theme throughout the literature, with Court and Young (2006) suggesting that there is often an underappreciation of the extent and ways that intermediary organisations and networks influence formal policy guidance documents. Research is more likely to contribute to policy, they say, if researchers and policymakers share common networks, trust each other and communicate effectively. Hovland’s (2003) literature review of research communication for poverty reduction emphasises support for research networks, especially electronic and/or regional networks, while Masset et al. (2011) maintain that a networked policy research community is a precondition for increasing the likelihood of policy change. Stone (2009) suggests that the uptake of research is contingent on policy community networking, describing the long-term strategies of the Overseas Development Institute (ODI), a London-based think tank whose policy entrepreneurship extends to longer-term influence through creating human capital, building networks and engaging policy communities.
The nature and quality of communication between and among researchers and policymakers is also important. A crucial capacity for researchers is the ability to communicate in a language that policymakers can understand (Greijn 2008) and, more specifically, the ability to find common ground and to communicate well with various audiences. Modifying or creating policies based on evidence requires “translating” the technical language of research so that it is comprehensible to the relevant agents in the policymaking process. Good communication is also important when seeking partners, building alliances and working in networks (Langou 2008). Other measures for improving research communication include improving skills for achieving the right format and timing of communication, constructing appropriate platforms from which to communicate and promoting participative communication for empowerment (Hovland 2003).
For some, research communication is primarily a public relations or marketing exercise, the “communication” product that comes in the final stages of a linear research process. Increasingly, however, development practitioners and researchers have recognised the importance of iterative and participatory communication processes. This is according to Harvey et al. (2012), who reason that research communication has evolved away from solely linear and top-down models of influencing (e.g. getting research onto the desks of the most senior decision-makers) to more complex and multisited theories of change. They see a proliferation in roles and actors for communicating research in development which push the boundaries of conventional ideas of research and challenge how research agendas are set and how knowledge is generated and shared. For some researchers, this implies new and unfamiliar ways of working, as revealed in the survey of researchers by Shanley and López (2009), in which performance measurement systems showed robust institutional biases against communicating with the public. This is underscored by the finding of Jacobson et al. (2004) that plain language communication with the public is not widely accepted as a legitimate form of scholarship and also by Carden’s (2009) claim that researchers are uncomfortable communicating with officials and politicians in the policy community.
Datta (2012) goes on to argue that researchers no longer have a monopoly over knowledge production and communication and that traditional approaches to communicating research to policymakers are inadequate. Researchers now share the field of knowledge production and communication with many others, and where appropriate, those who view their role in relation to policy should be prepared to engage with stakeholders affected by policy issues and to expose their findings to human interaction, review and scrutiny by others.
Good communication is vital for researchers (Saxena 2005), and policy is only influenced when the evidence is credible and well communicated (Court and Maxwell 2005). At its best, communication starts early in the research; it is designed into the research plan and is carried out as the project unfolds (Carden 2009). The UK’s DFID has stipulated that many of the research programmes which it funds should spend at least 10 % of their budget on communication activities, and this appears to have had a positive impact on the uptake of research by both policy and practice (Shaxson 2010). Additionally important in the current context, aside from academic impact and impact on policy and practice, enhanced capacity is regarded as a legitimate research outcome. Among the enhanced capacities that are claimed to be particularly important for young scholars in developing countries are communication with policymakers (e.g. policy briefs), communication with the general public and communication with the media (OECD 2011).
3.3 Information and Communication Technologies
Another factor closely linked to communication is the rising use of information and communication technologies (ICTs), a development that some see as blurring the once-stark line dividing academia from professional and amateur writers such as op-ed columnists and bloggers (Lewin and Patterson 2012). Hovland (2003), in calling for improved communication between researchers and policymakers and other researchers and end users (i.e. the poor and organisations working with them), suggests incorporating communication activities into project design and using new ways of communicating through ICTs. Others see a more transformative role for ICTs among those practitioners and researchers who increasingly recognise the importance of iterative and participatory communication processes within development and who use ICTs for the rapid, multisited, multimedia and participant-driven production and communication of research as it unfolds (Harvey et al. 2012).
DFID’s report on social media engagement focuses on policy actors—people whose work is wholly or partially involved in developing or seeking to influence national and regional development policies—who use a range of ICTs to get information, including social media. The “echo-chamber” effect of social media, referring to the overlap between individuals and organisations working in allied or similar fields, works to amplify its content, giving rise to enormous reach. For instance, the 50 biggest followers of the Twitter account @DFID_Research have a combined reach of 2.4 million; @IDS_UK’s 50 biggest followers have a combined reach of 3.6 million, and @odi_development’s 50 biggest followers have a combined reach of 4.3 million.Footnote 5
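The report does not spell out how these “combined reach” figures are derived; a plausible reading, sketched below with made-up follower counts, is that the figure is simply the sum of the audiences of an account's 50 largest followers:

```python
def combined_reach(follower_counts, top_n=50):
    """Sum the audiences of the top_n followers that have the largest
    followings of their own -- one plausible way of arriving at the
    'combined reach' figures quoted in the DFID report."""
    return sum(sorted(follower_counts, reverse=True)[:top_n])


# Hypothetical follower counts for the followers of a research account.
followers = [1_200_000, 350_000, 90_000, 45_000, 12_000, 8_000, 500, 120]
print(combined_reach(followers))  # 1705620 with these made-up numbers
```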
Despite such evidence of reach, it appears that many UK academics are reluctant to adopt Web 2.0 tools for their work. A major disincentive for the academic community to adopt them for research activities is the lack of institutional incentives for using them or for publishing online. One study found that UK researchers are discouraged from publishing online by the policy of having international peer-reviewed journal citations, rather than online citations, count towards academic promotion (Brown 2012).
DFID argues that online media accessed through digital devices—PCs, tablets and mobile phones—play a central role in all areas of knowledge and research. It is therefore crucially important to understand the online behaviour of the target audiences for development research as well as the wide range of available platforms and tools which can be exploited by project teams. However, conventional wisdom holds that this kind of open sharing and joint activity is at odds with the nature of the research process, where the tradition is for researchers to prepare their findings privately before putting them out to review and where, especially in an academic and commercial context, advancement and success are seen to depend on secrecy.
3.4 Intermediaries
Against a background of individual cultural differences, systemic inadequacies in professional and institutional incentive structures and the apparent weaknesses in academic communication skills and processes, it is not surprising to discover other people and organisations taking up the role of delivering research-based knowledge to practitioners and policymakers with the aim of strengthening their activities. The work of Court and Young (2006) emphasises the importance of links—of communities, networks and intermediaries (e.g. the media and campaigning groups)—in affecting policy change. Existing theory, they say, stresses the role of translators and communicators, a view echoed by Harvey et al. (2012), who suggest that evolving notions of what constitutes expert or valid knowledge have affected development research institutes in the global north, bringing an increased focus on the roles of intermediaries and networks. In this context, researchers are being joined by other actors, such as research communication specialists, not necessarily involved in undertaking research but who seek to strengthen the use of research within change processes (Harvey et al. 2012; Court and Maxwell 2005). As researchers typically have little or no influence over the capacity of their audience to use their research findings, others maintain that further investment should be made in supporting the pull-through and absorption of research through, for example, the use of intermediaries or knowledge brokers to mediate relationships or transmit knowledge between academics and research users (Stevens et al. 2013).
Shaxson (2010) examines the role of knowledge intermediaries, recognising the contribution that knowledge intermediary organisations make not only in synthesising, interpreting and communicating research results to individuals and organisations in policy and practice but also in understanding the demand for knowledge from them. The role of knowledge intermediaries in international development is discussed at length, encompassing: enabling access to information and making it digestible, creating demand for information, enabling marginalised voices, creating alternative framings, connecting spheres of action, enabling accountability, informing, linking, matchmaking, facilitating collaboration and building sustainable institutions. For each of these activities there are measures of impact, and these do include citation analyses among many others. However, it is noticeable that measures of impact are shifting away from content-analysis measures such as hit rates, downloads and citations and towards measures of inclusivity and stakeholder involvement in project and programme plans and institutional strategies (Shaxson 2010).
Of relevance in the current context, one observer points out that developing countries often lack the intermediary institutions that carry research to policy (Carden 2009). Rich countries have abundant research institutes, think tanks, university departments and independent media that perform as knowledge brokers—the transactors who connect research findings to policy issues—but these are often absent in developing countries. As a result, the mechanisms of policy influence are missing. As a means of overcoming this limitation, Carden (2009) suggests that in IDRC’s experience, there is often a South-South learning effect, with lessons from one developing country or region applied to another with IDRC’s intermediary help. However, as Jones et al. (2013) point out, it is not necessary to be labelled as a knowledge intermediary in order to act as one; what matters is developing a clear understanding of the different intermediary functions that could be used and the resource implications of each.
3.5 Policy Entrepreneurs
In a refinement of the role of intermediaries, the concept of policy entrepreneurs has emerged as a role for researchers wishing to influence policy. A policy entrepreneur is an individual who invests time and resources to advance a position or policy. One of their most important functions is to change people’s beliefs and attitudes about a particular issue (Stone 2009). Four critical skills have been identified: being able to understand politics and identify key players, being able to synthesise research into simple, compelling stories, being a good networker and being able to build programmes that bring all these factors together (Masset et al. 2011). As the product of the researcher is not usually in a format that can be used by policymakers, an intermediary—a research broker or policy entrepreneur—with a flair for interpreting and communicating technical or theoretical work is needed. This is usually an individual, but sometimes an organisation plays such a role (Stone 2009). Additionally, as research-based evidence often plays a very minor role in policy processes, researchers who want to be good policy entrepreneurs also need to synthesise simple, compelling stories from the results of their research (Young 2008).
3.6 Networks
The theme of networks has already emerged in our discussions on communication and intermediaries, but there are additional aspects throughout the literature that heighten their relevance to the present discussion. Following Weiss (1977), it has been widely recognised that although research may not have direct influence on specific policies, the production of research may still exert a powerful indirect influence through introducing new terms and shaping the policy discourse. Weiss describes this as a process of percolation, in which research findings and concepts circulate and are gradually filtered through various policy networks. Some of the literature on the research-policy link therefore focuses explicitly on various types of networks, such as policy streams, policy communities, epistemic communities, think tank networks and advocacy coalitions.Footnote 6 Networks and inter-organisational linkages sit solidly among the determining influences as to why some ideas are picked up and acted on, while others are ignored and disappear (de Vibe et al. 2002).
Lewin and Patterson (2012) indicate that the diffusion of the Internet has transformed global news media and communication systems into interactive horizontal networks that connect local and global individuals and issues, and Stone (2009) argues that such networks facilitate the role of policy entrepreneur that is played by intermediary organisations such as the ODI. Given the importance of collaboration between researchers and policymakers within research programmes that are intended to influence policy and practice, it is no surprise to find an emphasis on the establishment and operation of networks that make such collaboration possible and more effective. As Carden (2009) comments, collaborations have demonstrated the diverse and sometimes surprising rewards of organising research in networks of shared purpose. National, regional and global networks are playing an increasing role in development policy, and two institutional models seem to be particularly effective: think tanks, and national and regional networks, both of which are frequently cited as being influential (Young 2005).
3.7 Incentives
Among the interlocking factors that influence the impact of research on policy and practice in the international development literature, incentives stand out as a decisive determinant. Senior management and academics at research institutions need to provide strong leadership in supporting cultural changes around the impact agenda (Stevens et al. 2013). They should consider how best to accommodate impact within internal structures, job descriptions, annual appraisal and promotional criteria, pay awards and professional development opportunities. Other commentators have called on research institutions to provide researchers with the right incentives to engage effectively with users of research (Datta 2012), and a shift in incentive structures is called for that rewards actual impact rather than only “high-impact” journals, to ensure science is shared with those who need it. Incentives for researchers to produce outputs that reach a broader swath of society are so low that, where such outreach happens at all, it occurs as an afterthought once results are published (Shanley and López 2009).
The incentives for officials also come under scrutiny insofar as research-policy links are dramatically shaped by the political context. The policy process and the production of research are in themselves political processes which are influenced by a range of factors including the attitudes and incentives among officials, their room for manoeuvre, local history and power relations. Understanding the degree of political contestation as well as the attitudes and incentives of officials is important in explaining some public policies (Young 2005).
3.8 Political Context
The influence of the political context on the research-policy and research-practice nexus receives significant coverage in the literature, being repeatedly identified as a determining factor in whether research-based and other forms of evidence are likely to be adopted by policymakers and practitioners. Research is more likely to contribute to policy if the evidence fits within the political and institutional limits and pressures of policymakers and if it resonates with their assumptions (Court and Young 2006). Accordingly, researchers must know and understand key stakeholders in the policymaking process and understand the way in which the door can be opened to politicians and public interest (Taylor 2005). They need to grasp and adapt to the dynamics of the political debate and bring relevant evidence to the fore at the right time (Greijn 2008). It therefore becomes necessary to create an enabling environment for improved communication of research, as failure to use research is not always due to lack of communication but can instead be due to the lack of a favourable political environment. In fact, the success (or failure) of communication at an individual, local or project level is largely determined by wider systems, including the political environment. It is noted that academics and think tanks have a far greater chance of being heard when there are like-minded influential politicians in the dominant advocacy coalition (Hovland 2003).
Understanding possible pathways of policy change and the role of formal and informal institutional checks and balances on power can help develop a clear road map for policy advocacy. This also means that knowledge producers need to be more self-aware of the political nature of their engagement in policy processes. Any act of producing knowledge is, by definition, a political one, and those producing knowledge need to engage with the policy process with their eyes wide open (Jones et al. 2013).
3.9 Demand
According to Mulgan and Puttick (2013), one of the most striking factors impeding the effective use of evidence is the absence of organisations tasked with linking the supply and demand of evidence. International development researchers are therefore encouraged to understand the demand for research among policymakers and practitioners, by, for example, mapping the existing information-demand and information-use environment. It has been said that if global public goods research is to be made applicable as well as accessible to national environments from the international system, it must be responsive to demand. This is one approach to engaging with users of research, by taking user realities and preferences into account in development research and communication and by gauging the extent of demand for new ideas by policymakers and society more generally. Some argue that to be effective, research must be located more securely within the context of wider knowledge or innovation systems, implying that the effectiveness and impact of research will be driven by continuous interactions between supply drivers and demand drivers (Hovland 2003).
Research on knowledge transfer, particularly in the field of policy development, has led to several models of the process. The science-push or knowledge-driven model conceptualises it as a unidirectional and logical flow of information from researchers to policymakers resulting in specific policy decisions, whereas the demand-pull or problem-solving model views the process as occurring through the commissioning of information from researchers by policymakers with the intent of addressing a well-defined policy problem. The interactive model construes knowledge transfer as a reciprocal and mutual activity, one that involves researchers and users in the development, conduct, interpretation and application of research and research-based knowledge (Jacobson et al. 2004). DFID acknowledges a preference to move from a linear, supply driven, transfer-of-technology model to a more interactive, demand-driven or collaborative model (Adolph et al. 2010).
Apart from understanding the demand for research, researchers are advised to participate in activities that would stimulate demand for their outputs, such as raising awareness and building capacity within policy circles. In this regard, Shaxson (2010) observes that we know more about how to improve the supply of evidence than we do about how to improve the demand for it, particularly in the policy sphere. Strategies that focus on improving awareness and absorption of research inside government and on expanding research management expertise and developing a culture of policy learning can ameliorate problems on the demand side (Stone 2009). Newman et al. (2012) address the capacity to demand research evidence at three levels: individual, organisational and environmental. Capacity-strengthening interventions that stimulate research demand include diagnostic processes, training, mentoring, linking schemes, organisational policies and societal interventions. However, a better understanding is required of which types of mechanism are most suitable for strengthening user demand for research and for encouraging the development of new user participation models in research design and implementation (Adolph et al. 2010).
3.10 Engagement
Several of the themes in the literature—such as effective communication, the role of intermediaries, participation in networks and stimulating demand—converge around the next emergent theme, that of engagement. Some observers use the term to denote the need for closer relationships between researchers and research users, especially policymakers. O’Neill (2005) suggests direct engagement by researchers with the policy community as one of three essential elements of policy influence for development research, saying that the research community must become participants in democratic governance, active at every level. Likewise, Hovland (2003) points towards platforms of broad engagement from which to communicate, such as a public campaign, for research to be more likely to be heard, a suggestion echoed by Datta (2012), who argues that public engagement processes that draw on a range of methods and approaches to elicit a diversity of views are likely to work better. A report by the DFID project on research to action regards engagement as individuals moving from simply accessing or consuming the content and services offered by an online platform to becoming more involved in the platform, recommending or promoting it and actively co-creating the content.
Despite these assertions, Datta (2012) notes that not all researchers see policy engagement as part of their role, suggesting that engagement processes may be more suited to those who see themselves as issue advocates, who aim to influence policy in a particular direction, or honest brokers, who clarify and potentially expand the policy options available to decision-makers. Moreover, despite the new expectations that urge engagement in knowledge transfer, many researchers still accord it a low priority (Jacobson et al. 2004). As we have seen in Shanley and López’s (2009) survey, fewer than 5 % of academics regarded engagement with the media as an outlet for scientific findings that carries any weight in measuring scientific performance at their institutions. Moreover, performance measurement systems intentionally discourage scientists from producing materials for civil society.
4 Summary
At the risk of oversimplifying a complex issue, we can summarise the major lessons to learn from the literature on the impact of research on development policy and practice as follows. It seems that development policy and practice can benefit from the knowledge that research generates, but several interlocking preconditions exist for it to do so:
- Researchers need to have the intent of influencing policy and practice.
- They need to produce high-quality research.
- Academic incentive and reward systems need to move away from a focus on publishing and citation counting and more towards the promotion of research that achieves social and economic impact.
- Research results need to be better communicated to wider audiences, including the public, civil society and policymakers.
- ICTs need to be used more effectively to improve research communication and to allow researchers to engage with other stakeholders in processes of knowledge sharing.
- Intermediaries between researchers and practitioners (individuals and/or organisations) effectively promote research findings to wider audiences where researchers themselves do not (for whatever reason).
- The role of policy entrepreneurs is fostered among suitable researchers and research institutions.
- Formal or informal networks of researchers, practitioners and policymakers exist to facilitate interchanges among stakeholders and promote the take-up of research results.
- Researchers engage with the political context of their work.
- Researchers engage with the users of their research in order to understand the demand-side dynamics of the use of their research in practice and policy circles.
- Policymakers, politicians and their advisers need to cultivate closer relationships with academic researchers in order to make full use of their capacity for producing evidence in support of policy decisions.
- There is effective engagement between researchers, practitioners and policymakers that serves to overcome the various barriers between them.
5 Implications for ICT4D Research
This chapter is premised on the claim that information society impact research in the global south has focused almost exclusively on the impact of ICTs, to the exclusion of the impact of the research outside academia. Even here, despite more than a decade of research, identifying the particular contribution of ICTs to specific development goals has proven to be extremely difficult (Kleine 2010). Furthermore, while the contribution in terms of technology diffusion and use—especially of mobile phones—is easy to detect, the focus has only recently shifted towards the question of development impact (Heeks 2010).
Heeks (2010) implies that the absence of ICT4D research impact on practice and policymaking is due at least in part to substandard research in the ICT4D field. He argues that the poor quality of ICT impact assessment to date derives from its lack of conceptual foundations. Furthermore, it seems that there are few researchers in ICT4D who are drawn from the development studies discipline, resulting in the use of an impoverished understanding of development within ICT4D research. Any subsequent discussion of ICTs’ contribution to development in the absence of development studies’ ideas to define and understand development may make little sense and could result in techno-centric project design as well as making it much harder to connect to development policymakers and practitioners (Heeks 2010). Such a condition contravenes one of the fundamental findings from the literature that ICT4D researchers need to produce high-quality research if they wish to influence policy and practice. But, as we have seen, this is only the starting point.
Beyond these findings, little evidence has been found of any impact of ICT4D research on development policy or practice. DFID and IDRC have been jointly engaged in the ICT4D Research and Capacity Development Programme (2007–2011) which claims a desired output of sustained policy dialogue, defined as “ongoing, evidence-based dialogue among regulators, policy makers, researchers, civil society and the private sector; leading to well informed decision-making on policy issues relevant to ICT4D”.Footnote 7 According to the project documentation, there are numerous examples of national policies highlighting ICT in their delivery as a result of programmes funded through this ICT4D programme. However, it is not clear that these specific outputs were intended prior to the commencement of the programme. Neither did the SIRCA programme specify that any of the research projects it funded should declare a pre-existing intent to influence practice or policy or both, although a few actually did so. However, the SIRCA focus on building capacity for carrying out high-quality research clearly addresses the fundamental weakness of ICT4D research that Kleine (2010) and Heeks (2010) refer to. Nonetheless, with a better understanding of the conditions considered to be necessary for development research to influence policy and practice, it becomes an easier task to put forward some suggestions as to how ICT4D research could do the same.
In this regard, most of the lessons in the literature are as relevant to ICT4D as they are for international development research. They include (after Shanley and López 2009):
For research and academic institutions
- Restructure institutional incentives to take into account actual impact.
- Create incentives to invest in dissemination and an expanded range of research products.
- Raise awareness and encourage social change agents, knowledge brokers and linkage mechanisms.
- In hiring, balance consideration of publication record with capabilities such as originality, creativity, commitment, depth of field experience and impact orientation.
For researchers
- Interact with stakeholders at various levels to ensure relevance of research questions and outputs.
- Identify uptake pathways as part of project design.
- Design projects to meet end users’ needs and aspirations.
- Share and publish experiences of how research results have been “translated” or used for a non-scientific audience.
For journal editors and publishing organisations
- Challenge researchers to propose ways to evaluate the real impact of their work.
- Provide incentives to researchers to publish practitioner-oriented results of relevance to civil society.
- Break the language barrier by publishing “mirror” papers, translations of the complete paper into the language of the place where the research was undertaken.
For donors
- Recognise that sustainable change is a long-term process. Support longer-term project time frames (4–10 years) in which sufficient dialogue occurs at the initiation of projects.
- Expand proposal requirements to include the sharing of relevant research results in an accessible format to appropriate audiences.
- Verify that proposals designate sufficient funds for translation, printing, mailing costs and communication.
- Remember that originality often occurs at the fringes. Identify and support small but innovative, locally driven initiatives.
It seems overly optimistic to imagine any infusion of intent to generate nonacademic impacts into ICT4D research without sufficient incentives for researchers to take it up, associated with appropriate capacity building that would enable them to do so. The UK government’s 2014 Research Excellence Framework offers a model of how such an incentive scheme might work, although its effectiveness is yet to be proven, and difficulties can be foreseen in identifying and measuring the kind of impact that the scheme is seeking to induce. However, even with financial incentives in place and with researchers formulating strategies for closer engagement with practice and policy, against the backdrop of the two communities and parallel universes described in the literature, it remains far from certain that the typical academic researcher will be either comfortable taking up the role of policy entrepreneur or even capable of implementing an effective communication strategy for presenting her research findings to a wider—nonacademic—audience. On top of this, there is the question of institutional incentives and the need to neutralise the obsession with academic performativity, citation counts and the tyrannical journal “impact factor”, which of course, from the perspective of practice and policy, is nothing of the kind. Given the entrenched nature of such phenomena, there seems little hope of any early moves away from them, but a start can be made by raising the issue and by further airing the debate that has already surfaced in the literature reviewed here.
For any ICT4D academic researchers wishing to extend their influence into practice and policy, there seems to be merit in providing them with the guidance and capacity-building structures and processes that would make it possible and easier for them to do so. There are three examples from elsewhere that suggest a means of doing this. Research to Action is an initiative that caters for the strategic and practical needs of people trying to improve the uptake of development research, in particular those funded by DFID.Footnote 8 It is for development researchers in general who would like to be more strategic and effective in their communications. Two activities of relevance are a workshop on Improving the Impact of Development Research Through Better Research Communication and Uptake (Shaxson 2010) and The Policy Influence Monitoring project, which monitors and evaluates grantees’ policy influence across Africa, South Asia, South East Asia and Latin America. It focuses on the factors and variables that inform how and when research influences policy.
Another instructive example of practical guidance for researchers intending to influence policy is the Science into Policy publication of the UK’s Natural Environment Research Council (see note 9), which helps scientists to recognise the relevance of their science to policymakers, identify the opportunities, routes and best practice available for influencing policymaking, and communicate science in an appropriate and accessible way to the right policymakers, showing how it fits their policy needs. It explains key aspects of the UK policymaking process and provides case studies of the impact of environmental research to illustrate good practice in bringing science into policy. The final example is Canada’s knowledge mobilisation network, ResearchImpact, which connects university research with research users across Canada to ensure that research helps to inform decision-making. Its Knowledge Mobilization Units work to match researchers with key policymakers in government, health and social service agencies so that academic research is employed by policymakers and community groups to develop more effective, efficient and responsive public policies and social programmes (see note 10).
These examples illustrate how relatively simple, low-cost but high-value knowledge-based mechanisms might stimulate and assist researchers towards practice and policy influence, especially those in the SIRCA programme as they become mature and experienced researchers. A particular advantage is that these researchers already operate within a supportive and vibrant network consisting of other early career researchers from 18 developing countries across three continents, the seasoned collaborators and mentors who have been working with each of them, and the combined technical, research and administrative expertise of the Singapore Internet Research Centre at the Wee Kim Wee School of Communication and Information, Nanyang Technological University, and of IDRC. An opportunity now exists to leverage the strength of the SIRCA network so that the research it has conducted realises its full potential for building the capacity, both individual and institutional, to influence practice and policy. In this regard, generating institutional capacity for practice and policy influence might be better organised within specialised research units than within mainstream university faculties, where traditional processes are more entrenched. ODI, for example, has established itself as an organisational policy entrepreneur by developing advisory ties to governments and international organisations and by building policy communities through networking and partnerships.
6 Conclusions
This chapter has reviewed recent literature on the impact of research on practice and policymaking in international development. The findings have considerable significance for ICT4D research, which has been assessed as lacking on two counts: first, the general level of quality is questionable, and second, there is little if any evidence of impact on practice and policymaking in ICT4D. The first two phases of the SIRCA programme have successfully targeted the first problem. The second problem remains and requires major cultural and institutional shifts if a satisfactory solution is to emerge. However, some of the necessary changes, those relating to intent, communication, engagement and networking, can be initiated relatively easily by promoting the transitions that ICT4D researchers will have to make in order to increase the relevance of their work to wider audiences.
Notes
- 1.
http://thomsonreuters.com/products_services/science/free/essays/impact_factor/. Accessed 6 March 2013.
- 2.
http://www.ref.ac.uk/media/ref/content/researchusers/REF%20guide.pdf. Accessed 6 March 2013.
- 3.
Grey literature is defined as “that which is produced on all levels of government, academics, business and industry in print and electronic formats, but which is not controlled by commercial publishers”. It includes reports, theses, conference proceedings, bibliographies, technical and commercial documentation, official documents and government reports (Alberani et al. 1990).
- 4.
See, for example, the C4D Network, “a non-profit organisation dedicated to supporting the communication for development sector”, with more than 1,200 members. http://c4dnetwork.ning.com/
- 5.
@DFID_Research. R4D is the open-access portal to DFID-funded research. It houses over 30,000 research documents on international development. http://www.dfid.gov.uk/r4d
@IDS_UK. The Institute of Development Studies is a leading global charity for research, teaching and communications on international development. Brighton, UK. http://www.ids.ac.uk/
@odi_development. The UK’s leading independent think tank on international development and humanitarian issues. London, UK. http://www.odi.org.uk. All accessed 28 March 2013.
- 6.
An epistemic community consists of colleagues who share a similar approach or a similar position on an issue (Haas 1991). Advocacy coalitions consist of various different actors, including different government agencies, associations, civil society organisations, think tanks, academics, media institutions and prominent individuals (Sabatier and Jenkins-Smith 1999).
- 7.
http://www.dfid.gov.uk/r4d/Project/60422/Default.aspx. Accessed 11 March 2013.
- 8.
- 9.
- 10.
http://www.researchimpact.ca/home/. Accessed 12 March 2013.
References
Adolph, B., Jones, S.H., & Proctor, F. (2010). Learning lessons on research communication and uptake. London: Triple Line Consulting Ltd for DFID.
Alberani, V., Pietrangeli, P. D. C., & Mazza, A. M. R. (1990). The use of grey literature in health sciences: a preliminary survey. Bulletin of the Medical Library Association, 78(4), 358–363.
Brown, C. (2012, August). Are southern academics virtually connected? GDNet. http://depot.gdnet.org/cms/files//GDNet_study_of_adoption_of_web_2_tools_v2.pdf. Accessed 28 Mar 2013.
Carden, F. (2009). Knowledge to policy: Making the most of development research. Los Angeles/Ottawa: SAGE Publications Inc/IDRC.
Court, J., & Maxwell, S. (2005). Policy entrepreneurship for poverty reduction: Bridging research and policy in international development. Journal of International Development, 17, 713–725.
Court, J., & Young, J. (2006). Bridging research and policy in international development: An analytical and practical framework. Development in Practice, 16(1), 85–90.
Datta, A. (2012). Deliberation, dialogue and debate: Why researchers need to engage with others to address complex issues. IDS Bulletin, 43(5), 9–16.
de Vibe, M., Hovland I., & Young, J. (2002, September). Bridging research and policy: An annotated bibliography (ODI Working Paper 174). London: ODI.
DFID. (2013a). Assessing the strength of evidence (DFID Practice Paper). https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/158000/HtN_-_Strength_of_Evidence.pdf
DFID. (2013b). Social media engagement. A report of the activities on the R4D project. http://r4d.dfid.gov.uk/pdf/outputs/Communication/R4D%20Social%20Media%20Engagement%20Report_HR.pdf. Accessed 28 Nov 2013.
DFID. (2013c). Research uptake: A guide for DFID-funded research programmes. https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/200088/Research_uptake_guidance.pdf. Accessed 28 Mar 2013.
Fisher, E., & Holland, J. D. (2003). Social development as knowledge building: Research as a sphere of policy influence. Journal of International Development, 15, 911–924.
Gendron, Y. (2008). Constituting the academic performer: The spectre of superficiality and stagnation in academia. European Accounting Review, 17(2), 97–127.
Greijn, H. (2008). Linking research-based evidence to policy and practice. Research, Policy and Practice, Capacity, (35), 3.
Haas, E. B. (1991). When knowledge is power: Three models of change in international organisations. US: University of California Press.
Harvey, B., Lewin, T., & Fisher, C. (2012, September). Is development research communication coming of age? IDS Bulletin, 43(5), 1–8.
Heeks, R. (2010). Do information and communication technologies (ICTs) contribute to development? Journal of International Development, 22, 625–640.
Hovland, I. (2003). Communication of research for poverty reduction: A literature review (Working Paper 227). London: Overseas Development Institute.
Hovland, I. (2007). Making a difference: M&E of policy research. London: Overseas Development Institute.
Jacobson, N., Butterill, D., & Goering, P. (2004). Organizational factors that influence university-based researchers’ engagement in knowledge transfer activities. Science Communication, 25(3), 246–259.
Jones, H., Jones, N., Shaxson, L., & Walker, D. (2013, January). Knowledge, policy and power in international development: A practical framework for improving policy. London: ODI Background Note.
Kleine, D. (2010). ICT4WHAT? Using the choice framework to operationalise the capability approach to development. Journal of International Development, 22(5), 674–692.
Langou, G. D. (2008). Developing capacities for policy influence. Research, Policy and Practice, Capacity, (35), 14.
Lewin, T., & Patterson, Z. (2012). Approaches to development research communication. IDS Bulletin, 43(5), 38–44.
Masset, E., Mulmi, R., & Sumner, A. (2011). Does research reduce poverty? Assessing the welfare impacts of policy-oriented research in agriculture. Brighton: Institute of Development Studies, the University of Sussex.
Mulgan, G., & Puttick, R. (2013). Making evidence useful. The case for new institutions. The Economic and Social Research Council and NESTA Foundation. http://www.nesta.org.uk/library/documents/MakingEvidenceUseful.pdf.
Newman, K., Fisher, C., & Shaxson, L. (2012). Stimulating demand for research evidence: What role for capacity-building? IDS Bulletin, 43(5), 17–24.
Nutley, S., Walter, I., & Davies, H. (2007). Using evidence: How research can inform public services. Bristol: Policy Press.
O’Neil, M. (2005). What determines the influence that research has on policy-making? Journal of International Development, 17, 761–764.
OECD. (2011, April). Opportunities, challenges and good practices in international research cooperation between developed and developing countries. Paris: OECD.
Ryan, J. G., & Garrett, J. L. (2003). The impact of economic policy research: Lessons on attribution and evaluation for IFPRI. Washington, DC: International Food Policy Research Institute.
Sabatier, P., & Jenkins-Smith, H. C. (1999). The advocacy coalition framework: An assessment. In P. Sabatier (Ed.), Theories of the policy process. Boulder: Westview Press.
Saxena, N. C. (2005). Bridging research and policy in India. Journal of International Development, 17, 737–746.
Sen, K. (2005). Rates of return to research: A literature review and critique. Manchester: DFID.
Shanley, P., & López, C. (2009). Out of the loop: Why research rarely reaches policy makers and the public and what can be done. Biotropica, 41(5), 535–544.
Shaxson, L. (2010, November 29–30). Improving the impact of development research through better research communications and uptake. Report of the AusAID, DFID and UKCDS funded workshop, London.
Stevens, H., Dean, A., & Wykes, M. (2013). DESCRIBE project final project report. Exeter: University of Exeter. http://www.exeter.ac.uk/media/universityofexeter/research/inspiringresearch/describeproject/pdfs/2013_06_04_DESCRIBE_Final_Report_FINAL.pdf
Stone, D. (2009). RAPID knowledge: Bridging research and policy at the overseas development institute. Public Administration and Development, 29, 303–315.
Sumner, A., Ishmael-Perkins, N., & Lindstrom, J. (2009). Making science of influencing: Assessing the impact of development research (IDS Working Paper 335). Brighton: IDS.
Taylor, M. (2005). Bridging research and policy: A UK perspective. Journal of International Development, 17, 747–757.
Weiss, C. (1977). Research for policy’s sake: The enlightenment function of social research. Policy Analysis, 3(4), 531.
Wheeler, J. (2007). Creating spaces for engagement: Understanding research and social change. Brighton: Development Research Centre on Citizenship, Participation and Accountability.
Williams, G. (2012). The disciplining effects of impact evaluation practices: Negotiating the pressures of impact within an ESRC–DFID project. Transactions of the Institute of British Geographers, 37, 489–495.
Young, J. (2005). Research, policy and practice: Why developing countries are different. Journal of International Development, 17, 727–734.
Young, J. (2008). Impact of research on policy and practice. Research, Policy and Practice, Capacity, (35), 4–7.
Rights and permissions
Open Access. This chapter is distributed under the terms of the Creative Commons Attribution Noncommercial License, which permits any noncommercial use, distribution, and reproduction in any medium, provided the original author(s) and source are credited.
Copyright information
© 2015 The Author(s)
Cite this chapter
Harris, R. (2015). The Impact of Research on Development Policy and Practice: This Much We Know. In: Chib, A., May, J., Barrantes, R. (eds) Impact of Information Society Research in the Global South. Springer, Singapore. https://doi.org/10.1007/978-981-287-381-1_2
DOI: https://doi.org/10.1007/978-981-287-381-1_2
Publisher Name: Springer, Singapore
Print ISBN: 978-981-287-380-4
Online ISBN: 978-981-287-381-1