
Abstract

Meta-analysis is the statistical method for synthesizing studies on the same topic and is often used in clinical psychology to quantify the efficacy of treatments. A major threat to the validity of meta-analysis is publication bias, which implies that some studies are less likely to be published and are therefore less often included in a meta-analysis. A consequence of publication bias is overestimation of the meta-analytic effect size, which may give a false impression of the efficacy of a treatment and might result in (avoidable) suffering of patients and waste of resources. Guidelines recommend routinely assessing publication bias in meta-analyses, but this is currently not common practice. This chapter describes popular and state-of-the-art methods to assess publication bias in a meta-analysis and summarizes recommendations for applying these methods. We also illustrate how these methods can be applied to two meta-analyses that are typical of clinical psychology, such that psychologists can readily apply the methods in their own meta-analyses.


Notes

  1.

    A vector is R terminology for a data structure that, in our case, contains seven numeric values: one vector holds the studies’ standardized mean differences (yi) and another the corresponding sampling variances (vi).
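For illustration, such vectors can be created with R’s `c()` function and passed to the `rma()` function of the metafor package; the effect sizes and variances below are hypothetical, not the data from the chapter’s examples:

```r
# Seven hypothetical standardized mean differences (yi) and
# their sampling variances (vi), stored as numeric vectors
yi <- c(0.32, 0.15, 0.48, 0.21, 0.60, 0.05, 0.38)
vi <- c(0.04, 0.09, 0.02, 0.06, 0.03, 0.08, 0.05)

# Fit a random-effects meta-analysis with the metafor package
# install.packages("metafor")  # if not yet installed
library(metafor)
res <- rma(yi = yi, vi = vi)
summary(res)
```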

  2.

    The funnel plot based on the data of the meta-analysis by Archer et al. (2012) is available in the annotated R codes (https://osf.io/qjk9b/).
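A funnel plot can be produced with metafor’s `funnel()` function applied to a fitted model object; the data below are hypothetical stand-ins, not the Archer et al. (2012) data:

```r
library(metafor)

# Hypothetical effect sizes and sampling variances
yi <- c(0.32, 0.15, 0.48, 0.21, 0.60, 0.05, 0.38)
vi <- c(0.04, 0.09, 0.02, 0.06, 0.03, 0.08, 0.05)

res <- rma(yi = yi, vi = vi)

# Plot effect size against standard error; asymmetry in this plot
# is commonly taken as a possible indication of publication bias
funnel(res)
```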

  3.

    The study’s mean, sample size, and standard deviation of both groups are available on page 73 of Cowlishaw et al. (2012).

  4.

    Statistical power of the studies is computed using the estimate of the fixed-effect model as a proxy for the true effect size and a two-tailed hypothesis test with significance level 0.05 (Stanley et al., 2017).
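Under the usual normal approximation, this power computation can be sketched as follows; the fixed-effect estimate and standard error below are illustrative values, not the chapter’s results:

```r
# Sketch: power of a single study for a two-tailed test at alpha = .05,
# using the fixed-effect estimate as proxy for the true effect size
theta_hat <- 0.3              # assumed fixed-effect estimate (illustrative)
sei       <- 0.15             # standard error of the study (illustrative)
zcrit     <- qnorm(0.975)     # critical value of a two-tailed z-test

# Probability that the study's z-statistic exceeds the critical value
# in either direction, given a true effect equal to theta_hat
power <- pnorm(theta_hat / sei - zcrit) + pnorm(-theta_hat / sei - zcrit)
power
```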

  5.

    Moderate heterogeneity is defined in terms of the I2-statistic that is commonly used in meta-analysis to quantify heterogeneity. The I2-statistic (Higgins & Thompson, 2002) indicates the proportion of total variance that can be attributed to heterogeneity in true effect size. Moderate heterogeneity is I2 = 0.5 according to the rules of thumb proposed in Higgins et al. (2003).
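One common way to compute I2 is from Cochran’s Q-statistic, as I2 = max(0, (Q − df)/Q) (Higgins & Thompson, 2002); the Q-value and number of studies below are illustrative:

```r
# Sketch: I^2 from Cochran's Q-statistic (illustrative values)
Q  <- 12.4   # hypothetical Q-statistic of a meta-analysis
df <- 6      # degrees of freedom: number of studies minus 1

# Proportion of total variance attributable to between-study
# heterogeneity; truncated at zero when Q < df
I2 <- max(0, (Q - df) / Q)
I2   # close to 0.5, i.e., moderate heterogeneity per Higgins et al. (2003)
```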

  6.

    Research is currently ongoing to study whether this assumption can be relaxed by not only weighing statistically significant and nonsignificant studies differently in p-uniform*, but also allowing more complex weighting schemes. For example, marginally significant studies (i.e., studies with p-values just above the significance threshold) may have a different probability of being published than other nonsignificant studies. Weighing these studies differently may improve estimation and inference.


Author Note

We would like to thank Claudia Kapp, Manuel Heinrich, and Johannes Heekerens for commenting on a previous version of this chapter.

The authors made the following contributions. Robbie C.M. van Aert: Conceptualization, Formal analysis, Writing—Original Draft Preparation, Writing—Review & Editing; Helen Niemeyer: Conceptualization, Writing—Review & Editing.

Corresponding author

Correspondence to Robbie C. M. van Aert.


Copyright information

© 2022 Springer Nature Switzerland AG

About this chapter


Cite this chapter

van Aert, R.C.M., Niemeyer, H. (2022). Publication Bias. In: O'Donohue, W., Masuda, A., Lilienfeld, S. (eds) Avoiding Questionable Research Practices in Applied Psychology. Springer, Cham. https://doi.org/10.1007/978-3-031-04968-2_10
