Volume 113, Issue 3, pp 1681–1695

Types of evidence cited in Australian Government publications

  • Samantha Vilkins
  • Will J. Grant


Demand on researchers to justify the impact of their work outside academia is increasing. Increasing the use of research in policy, and measuring its current use, are both multi-faceted problems, though solving them offers many potential benefits to researchers and policymakers alike. This bibliometric study aimed to gain insight into the research and reference practices of Australian policymakers, and to investigate how this approach compares with previous interview and survey studies. We analysed 4649 references drawn from 80 government publications across eight departments, published between 2010 and 2017, including references to 1836 peer-reviewed journal articles, noting each item's author, title, year, parent publication, source type and access level. The number and type of evidence sources varied per publication, with the most common being peer-reviewed journal articles, federal government reports, and Australian business information. This differs from previous large-scale qualitative studies, which found policymakers are most likely to speak directly to colleagues for information and far less inclined to seek out academic research. The study also found that academic research may have an increased chance of being cited if it is open access. Despite criticisms of citation analysis, in the field of research utilisation at least we cannot rely solely on interview or survey data, as cited evidence use differs from reported evidence use. Both the characteristics of evidence sources in policy and the effect of open access publishing on research use in policy are worth investigating further, particularly longitudinally, which would require increased accessibility of government publications.
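The counting step the abstract describes can be pictured as a frequency tally over annotated reference records. The sketch below is purely illustrative: the record fields and values are assumptions, not the authors' actual coding scheme or data.

```python
from collections import Counter

# Hypothetical records in the shape the study describes: each cited reference
# annotated with (among other fields) a source type and an access level.
# Field names and values here are illustrative, not the authors' schema.
references = [
    {"year": 2014, "source_type": "journal article", "open_access": True},
    {"year": 2012, "source_type": "federal government report", "open_access": True},
    {"year": 2016, "source_type": "journal article", "open_access": False},
    {"year": 2015, "source_type": "business information", "open_access": True},
]

# Tally how often each source type is cited across the sampled publications.
by_type = Counter(ref["source_type"] for ref in references)

# Share of open-access items among cited journal articles: one simple way to
# probe whether open access is associated with a higher chance of citation.
articles = [r for r in references if r["source_type"] == "journal article"]
oa_share = sum(r["open_access"] for r in articles) / len(articles)

print(by_type.most_common())
print(f"Open-access share of cited journal articles: {oa_share:.0%}")
```

At the scale of the study (4649 references), the same tally would simply run over the full coded dataset rather than a hand-written list.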


Keywords: Bibliometrics · Government · Policy · Impact assessment


Copyright information

© Akadémiai Kiadó, Budapest, Hungary 2017

Authors and Affiliations

  1. Australian National Centre for the Public Awareness of Science, Australian National University, Canberra, Australia