Reviewing Literature for and as Research

  • Nigel D’Souza
  • Geoff Wong
Part of the Innovation and Change in Professional Education book series (ICPE, volume 17)


Abstract

The literature review has become an important tool for summarising and synthesising knowledge from the growing volume of research in medical education. Literature review methodologies have proliferated to an extent that can appear bewildering, particularly within qualitative and mixed methods approaches, some of which originate in disciplines outside medicine. The success of a review depends on matching the review technique to the research question(s). This chapter describes the breadth of quantitative, qualitative and mixed methods review techniques that may be used in educational research and examines their strengths and weaknesses. Case scenarios illustrate how specific review techniques can be used to address different research questions, and the essential steps common to conducting any literature review, regardless of technique, are described to provide practical guidance.


Keywords: Knowledge synthesis · Systematic review · Meta-ethnography · Realist review · Medical education · Health knowledge, attitudes, practice · Evidence-based medicine · Research design/standards



Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  • Nigel D’Souza (1)
  • Geoff Wong (2)
  1. Wessex Deanery, Winchester, United Kingdom
  2. Nuffield Department of Primary Care Health Sciences, Oxford, UK
