Abstract
Retrieving and collecting accurate, detailed data is a crucial step in any umbrella review, overview of reviews, or meta-epidemiologic study. Yet limited evidence guides this step, and many reviewers underestimate its importance and the careful planning and execution it requires. Nonetheless, the available evidence and expert opinion coherently support a set of best practices to ensure the validity, thoroughness, and usability of the retrieved data. In particular, data abstraction should be performed by two or more independent reviewers, using formally developed and piloted report forms. The utmost transparency should be sought, for instance by storing reviewing details in online data repositories for scrutiny or subsequent use. Finally, the risk of duplication when conducting an umbrella review or overview of reviews (e.g., counting the same trial results twice because they are reported in two separate systematic reviews) should be minimized, unless quantifying such overlap is itself one of the meta-epidemiologic goals of the research synthesis effort.
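The duplication risk described above can be made concrete. A minimal sketch, using entirely hypothetical review and trial names, of how one might flag primary studies included in more than one systematic review and summarize overlap with the corrected covered area (CCA = (N − r) / (rc − r), where N is the total number of inclusions, r the number of unique primary studies, and c the number of reviews):

```python
# Illustrative sketch with hypothetical data: flag primary trials that
# appear in more than one systematic review, and compute the corrected
# covered area (CCA) as a summary measure of overlap.
from collections import Counter

# Hypothetical map: systematic review -> primary trials it includes
reviews = {
    "review_A": {"trial1", "trial2", "trial3"},
    "review_B": {"trial2", "trial3", "trial4"},
    "review_C": {"trial1", "trial5"},
}

# Count how many reviews include each trial
counts = Counter(t for included in reviews.values() for t in included)

# Trials at risk of being counted twice in an overview of reviews
duplicated = sorted(t for t, n in counts.items() if n > 1)

# Corrected covered area: CCA = (N - r) / (r * c - r)
N = sum(counts.values())   # total inclusions across all reviews
r = len(counts)            # unique primary trials
c = len(reviews)           # number of systematic reviews
cca = (N - r) / (r * c - r)

print(duplicated)  # e.g. ['trial1', 'trial2', 'trial3']
print(cca)         # e.g. 0.3 (30% overlap -> "very high" per common cut-offs)
```

Trials flagged this way would then be counted only once in the synthesis, unless measuring redundancy across reviews is itself a study aim.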
Nothing has such power to broaden the mind as the ability to investigate systematically and truly [1].
Marcus Aurelius, 121–180 AD, Rome
References
https://www.brainyquote.com/quotes/quotes/m/marcusaure118558.html. Accessed 28 June 2018.
Guyatt G, Rennie D, Meade MO, Cook DJ. Users’ guide to the medical literature. A manual for evidence-based clinical practice. 2nd ed. New York: McGraw-Hill Professional; 2008.
Biondi-Zoccai G, editor. Network meta-analysis: evidence synthesis with mixed treatment comparison. Hauppauge: Nova; 2014.
Higgins JPT, Green S, editors. Cochrane handbook for systematic reviews of interventions. Version 5.1.0 [updated March 2011]. The Cochrane Collaboration. 2011. Available from: www.cochrane-handbook.org. Accessed 28 June 2018.
The Joanna Briggs Institute. The Joanna Briggs Institute Reviewers’ manual. Methodology for JBI umbrella reviews. Adelaide: The University of Adelaide; 2014.
Li L, Tian J, Tian H, Sun R, Liu Y, Yang K. Quality and transparency of overviews of systematic reviews. J Evid Based Med. 2012;5:166–73.
Systematic Review Data Repository (SRDR). Available at: srdr.ahrq.gov. Accessed 28 June 2018.
Li T, Vedula SS, Hadar N, Parkin C, Lau J, Dickersin K. Innovations in data collection, management, and archiving for systematic reviews. Ann Intern Med. 2015;162:287–94.
Oxman AD, Guyatt GH. Validation of an index of the quality of review articles. J Clin Epidemiol. 1991;44:1271–8.
Shea BJ, Grimshaw JM, Wells GA, Boers M, Andersson N, Hamel C, Porter AC, Tugwell P, Moher D, Bouter LM. Development of AMSTAR: a measurement tool to assess the methodological quality of systematic reviews. BMC Med Res Methodol. 2007;7:10.
Diekemper RL, Ireland BK, Merz LR. Development of the documentation and appraisal review tool for systematic reviews. World J Meta Anal. 2015;3:142–50.
Pieper D, Buechter RB, Li L, Prediger B, Eikermann M. Systematic review found AMSTAR, but not r(evised)-AMSTAR, to have good measurement properties. J Clin Epidemiol. 2015;68:574–83.
La Torre G, Backhaus I, Mannocci A. Rating for narrative reviews: concept and development of the International Narrative Systematic Assessment tool. Senses Sci. 2015;2:31–5.
Stang A. Critical evaluation of the Newcastle-Ottawa scale for the assessment of the quality of nonrandomized studies in meta-analyses. Eur J Epidemiol. 2010;25:603–5.
Jørgensen AW, Maric KL, Tendal B, Faurschou A, Gøtzsche PC. Industry-supported meta-analyses compared with meta-analyses with non-profit or no support: differences in methodological quality and conclusions. BMC Med Res Methodol. 2008;8:60.
Centre for Reviews and Dissemination. Systematic reviews: CRD’s guidance for undertaking reviews in healthcare. York: University of York; 2009.
Eden J, Levit L, Berg A, Morton S. Finding what works in health care. Standards for systematic reviews. Washington, DC: The National Academies Press; 2011.
Agency for Healthcare Research and Quality. Methods guide for effectiveness and comparative effectiveness reviews. Rockville: Agency for Healthcare Research and Quality (US); 2008.
Buscemi N, Hartling L, Vandermeer B, Tjosvold L, Klassen TP. Single data extraction generated more errors than double data extraction in systematic reviews. J Clin Epidemiol. 2006;59:697–703.
Horton J, Vandermeer B, Hartling L, Tjosvold L, Klassen TP, Buscemi N. Systematic review data extraction: cross-sectional study showed that experience did not increase accuracy. J Clin Epidemiol. 2010;63:289–98.
Bachmann LM, Coray R, Estermann P, Ter Riet G. Identifying diagnostic studies in MEDLINE: reducing the number needed to read. J Am Med Inform Assoc. 2002;9:653–8.
Elamin MB, Flynn DN, Bassler D, Briel M, Alonso-Coello P, Karanicolas PJ, Guyatt GH, Malaga G, Furukawa TA, Kunz R, Schünemann H, Murad MH, Barbui C, Cipriani A, Montori VM. Choice of data extraction tools for systematic reviews depends on resources and review complexity. J Clin Epidemiol. 2009;62:506–10.
Ip S, Hadar N, Keefe S, Parkin C, Iovin R, Balk EM, Lau J. A web-based archive of systematic review data. Syst Rev. 2012;1:15.
Doctor Evidence. Available at: drevidence.com. Accessed 28 June 2018.
Berlin JA. Does blinding of readers affect the results of meta-analyses? University of Pennsylvania Meta-analysis Blinding Study Group. Lancet. 1997;350:185–6.
Jadad AR, Moore RA, Carroll D, Jenkinson C, Reynolds DJ, Gavaghan DJ, McQuay HJ. Assessing the quality of reports of randomized clinical trials: is blinding necessary? Control Clin Trials. 1996;17:1–12.
Gøtzsche PC, Hróbjartsson A, Maric K, Tendal B. Data extraction errors in meta-analyses that use standardized mean differences. JAMA. 2007;298:430–7.
Jonnalagadda SR, Goyal P, Huffman MD. Automating data extraction in systematic reviews: a systematic review. Syst Rev. 2015;4:78.
Marshall IJ, Kuiper J, Wallace BC. Automating risk of bias assessment for clinical trials. IEEE J Biomed Health Inform. 2015;19:1406–12.
Tramèr MR, Reynolds DJ, Moore RA, McQuay HJ. Impact of covert duplicate publication on meta-analysis: a case study. BMJ. 1997;315:635–40.
Caldwell DM, Welton NJ, Ades AE. Mixed treatment comparison analysis provides internally coherent treatment effect estimates based on overviews of reviews and can reveal inconsistency. J Clin Epidemiol. 2010;63:875–82.
Biondi-Zoccai GG, Lotrionte M, Abbate A, Testa L, Remigi E, Burzotta F, Valgimigli M, Romagnoli E, Crea F, Agostoni P. Compliance with QUOROM and quality of reporting of overlapping meta-analyses on the role of acetylcysteine in the prevention of contrast associated nephropathy: case study. BMJ. 2006;332:202–9.
Nowbar AN, Mielewczik M, Karavassilis M, Dehbi HM, Shun-Shin MJ, Jones S, Howard JP, Cole GD, Francis DP, DAMASCENE Writing Group. Discrepancies in autologous bone marrow stem cell trials and enhancement of ejection fraction (DAMASCENE): weighted regression and meta-analysis. BMJ. 2014;348:g2688.
Peruzzi M, De Falco E, Abbate A, Biondi-Zoccai G, Chimenti I, Lotrionte M, Benedetto U, Delewi R, Marullo AG, Frati G. State of the art on the evidence base in cardiac regenerative therapy: overview of 41 systematic reviews. Biomed Res Int. 2015;2015:613782.
Cite this chapter
Testa, L., Bollati, M. (2018). Abstracting Evidence. In: Biondi-Zoccai, G. (ed.) Diagnostic Meta-Analysis. Springer, Cham. https://doi.org/10.1007/978-3-319-78966-8_8
© 2018 Springer International Publishing AG, part of Springer Nature
Print ISBN: 978-3-319-78965-1
Online ISBN: 978-3-319-78966-8