Abstract
Publication bias is an issue of great concern across a range of scientific fields. Although less documented in the behavior sciences, there is a need to explore viable methods for evaluating it, particularly for studies based on single-case experimental design logic. Publication bias is often detected by comparing meta-analytic effect sizes for published and grey studies, but the difficulty of identifying the full extent of grey studies within a particular research corpus presents several challenges. In this article we describe several meta-analytic techniques for examining publication bias when both published and grey literature are available, as well as alternative techniques for when grey literature is inaccessible. Although most of these methods have primarily been applied to meta-analyses of group-design studies, our aim is to provide preliminary guidance for behavior scientists who might use or adapt them to evaluate publication bias. We provide sample data sets and R scripts to follow along with the statistical analyses, in the hope that an increased understanding of publication bias and the respective techniques will help researchers gauge the extent to which it is a problem in behavior science research.
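The techniques described above are implemented in the metafor package for R (Viechtbauer, 2010). As a minimal sketch of the workflow — using hypothetical effect sizes and variances, not the article's sample data sets — one can fit a random-effects model and then apply Egger's regression test, trim-and-fill, and the fail-safe N:

```r
# Minimal sketch using the metafor package (Viechtbauer, 2010).
# The effect sizes (yi) and sampling variances (vi) below are
# hypothetical values for illustration only.
library(metafor)

dat <- data.frame(
  yi = c(0.62, 0.48, 0.91, 0.35, 0.77, 0.54, 1.02, 0.29),
  vi = c(0.08, 0.05, 0.12, 0.04, 0.10, 0.06, 0.15, 0.03)
)

res <- rma(yi, vi, data = dat)  # random-effects meta-analytic model

regtest(res)        # Egger's regression test for funnel-plot asymmetry
trimfill(res)       # Duval & Tweedie's trim-and-fill adjustment
fsn(yi, vi, data = dat)  # Rosenthal's fail-safe N
funnel(res)         # funnel plot of effect sizes against standard errors
```

Each call prints its own summary; a significant `regtest()` result or a trim-and-fill estimate that differs notably from the unadjusted model suggests small-study effects consistent with publication bias.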
Notes
Arguments are inputs associated with functions in R code. A function can have several arguments or none.
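For instance (an illustrative example, not drawn from the article's scripts), `mean()` below is called with two arguments, whereas `Sys.time()` takes none:

```r
# mean() called with two arguments: the data vector `x` and `trim`,
# the fraction of observations dropped from each end before averaging.
mean(x = c(1, 5, 100), trim = 0)

# A function called with no arguments.
Sys.time()
```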
References
Aydin, O., & Yassikaya, M. Y. (2021). Validity and reliability analysis of the PlotDigitizer software program for data extraction from single-case graphs. Perspectives on Behavior Science. Advance online publication. https://doi.org/10.1007/s40614-021-00284-0
Babb, S., Raulston, T. J., McNaughton, D., Lee, J., & Weintraub, R. (2020). The effects of social skill interventions for adolescents with autism: A meta-analysis. Remedial & Special Education. Advance online publication. https://doi.org/10.1177/0741932520956362
Barnard-Brak, L., Watkins, L., & Richman, D. M. (2021). Autocorrelation and estimates of treatment effect size for single-case experimental design data. Behavioral Interventions. Advance online publication. https://doi.org/10.1002/bin.1783
Becraft, J. L., Borrero, J. C., Sun, S., & McKenzie, A. A. (2020). A primer for using multilevel models to meta-analyze single case design data with AB phases. Journal of Applied Behavior Analysis, 53(3), 1799–1821. https://doi.org/10.1002/jaba.698.
Begg, C. B., & Mazumdar, M. (1994). Operating characteristics of a rank correlation test for publication bias. Biometrics, 50(4), 1088–1101.
Borenstein, M., Hedges, L. V., Higgins, J. P., & Rothstein, H. R. (2011). Introduction to meta-analysis. John Wiley & Sons.
Bowman-Perrott, L., Burke, M. D., Zhang, N., & Zaini, S. (2014). Direct and collateral effects of peer tutoring on social and behavioral outcomes: A meta-analysis of single-case research. School Psychology Review, 43(3), 260–285. https://doi.org/10.1080/02796015.2014.12087427.
Branch, M. N. (2019). The “reproducibility crisis”: Might the methods used frequently in behavior-analysis research help? Perspectives on Behavior Science, 42(1), 77–89. https://doi.org/10.1007/s40614-018-0158-5.
Carpenter, C. J. (2012). A trim and fill examination of the extent of publication bias in communication research. Communication Methods & Measures, 6(1), 41–55. https://doi.org/10.1080/19312458.2011.651347.
Craig, A. R., & Fisher, W. W. (2019). Randomization tests as alternative analysis methods for behavior-analytic data. Journal of the Experimental Analysis of Behavior, 111(2), 309–328. https://doi.org/10.1002/jeab.500.
Dickersin, K. (2005). Publication bias: Recognizing the problem, understanding its origins and scope, and preventing harm. In H. R. Rothstein, A. J. Sutton, & M. Borenstein (Eds.), Publication bias in meta-analysis: Prevention, assessment and adjustments (pp. 11–33). John Wiley & Sons. https://doi.org/10.1002/0470870168
Dowdy, A., Hantula, D. A., Travers, J. C., & Tincani, M. (2021). Meta-analytic based methods to detect publication bias in behavior science research: Supplementary files. https://osf.io/6r95p/?view_only=fcaae84f33d144f1a3a892171cce7937
Dowdy, A., Tincani, M., & Schneider, W. J. (2020). Evaluation of publication bias in response interruption and redirection: A meta-analysis. Journal of Applied Behavior Analysis, 53(4), 2151–2171. https://doi.org/10.1002/jaba.724.
Duval, S. (2005). The trim and fill method. In H. R. Rothstein, A. J. Sutton, & M. Borenstein (Eds.), Publication bias in meta-analysis: Prevention, assessment and adjustments (pp. 127–144). John Wiley & Sons. https://doi.org/10.1002/0470870168
Duval, S., & Tweedie, R. (2000a). A nonparametric “trim and fill” method of accounting for publication bias in meta-analysis. Journal of the American Statistical Association, 95(449), 89–98. https://doi.org/10.2307/2669529.
Duval, S., & Tweedie, R. (2000b). Trim and fill: A simple funnel-plot–based method of testing and adjusting for publication bias in meta-analysis. Biometrics, 56(2), 455–463. https://doi.org/10.1111/j.0006-341x.2000.00455.x.
Egger, M., Smith, G. D., Schneider, M., & Minder, C. (1997). Bias in meta-analysis detected by a simple, graphical test. BMJ, 315(7109), 629–634. https://doi.org/10.1136/bmj.315.7109.629.
Fernández-Castilla, B., Declercq, L., Jamshidi, L., Beretvas, N., Onghena, P., & Van den Noortgate, W. (2020). Visual representations of meta-analyses of multiple outcomes: Extensions to forest plots, funnel plots, and caterpillar plots. Methodology, 16(4), 299–315. https://doi.org/10.5964/meth.4013.
Franco, A., Malhotra, N., & Simonovits, G. (2014). Publication bias in the social sciences: Unlocking the file drawer. Science, 345(6203), 1502–1505. https://doi.org/10.1126/science.1255484.
Garwood, J. D., McKenna, J. W., Roberts, G. J., Ciullo, S., & Shin, M. (2021). Social studies content knowledge interventions for students with emotional and behavioral disorders: A meta-analysis. Behavior Modification, 45(1), 147–176. https://doi.org/10.1177/0145445519834622
Gilroy, S. P., & Kaplan, B. A. (2019). Furthering open science in behavior analysis: An introduction and tutorial for using GitHub in research. Perspectives on Behavior Science, 42(3), 565–581. https://doi.org/10.1007/s40614-019-00202-5.
Hales, A. H., Wesselmann, E. D., & Hilgard, J. (2019). Improving psychological science through transparency and openness: An overview. Perspectives on Behavior Science, 42(1), 13–31. https://doi.org/10.1007/s40614-018-00186-8.
Harbord, R. M., Egger, M., & Sterne, J. A. (2006). A modified test for small-study effects in meta-analyses of controlled trials with binary endpoints. Statistics in Medicine, 25(20), 3443–3457. https://doi.org/10.1002/sim.2380.
Higgins, J. P., & Thompson, S. G. (2002). Quantifying heterogeneity in a meta-analysis. Statistics in Medicine, 21(11), 1539–1558. https://doi.org/10.1002/sim.1186.
Jacobs, K. W. (2019). Replicability and randomization test logic in behavior analysis. Journal of the Experimental Analysis of Behavior, 111(2), 329–341. https://doi.org/10.1002/jeab.501.
Johnson, A. H., & Cook, B. G. (2019). Preregistration in single-case design research. Exceptional Children, 86(1), 95–112. https://doi.org/10.1177/0014402919868529.
Leavitt, K. (2013). Publication bias might make us untrustworthy, but the solutions may be worse. Industrial & Organizational Psychology, 6(3), 290–295. https://doi.org/10.1111/iops.12052.
Ledford, J. R., & Pustejovsky, J. E. (2021). Systematic review and meta-analysis of stay-play-talk interventions for improving social behaviors of young children. Journal of Positive Behavior Interventions. Advance online publication. https://doi.org/10.1177/1098300720983521
Light, R. J., Singer, J. D., & Willett, J. B. (1994). The visual presentation and interpretation of meta-analyses. In H. M. Cooper & L. V. Hedges (Eds.), The handbook of research synthesis (pp. 439–454). Russell Sage Foundation.
Lin, L., & Chu, H. (2018). Quantifying publication bias in meta-analysis. Biometrics, 74(3), 785–794. https://doi.org/10.1111/biom.12817.
Lishner, D. A. (2021). Sorting the file drawer: A typology for describing unpublished studies. Perspectives on Psychological Science. Advance online publication. https://doi.org/10.1177/1745691620979831
Macaskill, P., Walter, S. D., & Irwig, L. (2001). A comparison of methods to detect publication bias in meta-analysis. Statistics in Medicine, 20(4), 641–654. https://doi.org/10.1002/sim.698.
MacGillivray, H. L. (1986). Skewness and asymmetry: Measures and orderings. Annals of Statistics, 14(3), 994–1011.
Marks-Anglin, A., & Chen, Y. (2020). A historical review of publication bias. Research Synthesis Methods, 11(6), 725–742. https://doi.org/10.1002/jrsm.1452.
McCormack, J. C., Elliffe, D., & Virués-Ortega, J. (2019). Quantifying the effects of the differential outcomes procedure in humans: A systematic review and a meta-analysis. Journal of Applied Behavior Analysis, 52(3), 870–892. https://doi.org/10.1002/jaba.578.
Moeyaert, M., Manolov, R., & Rodabaugh, E. (2020). Meta-analysis of single-case research via multilevel models: Fundamental concepts and methodological considerations. Behavior Modification, 44(2), 265–295. https://doi.org/10.1177/0145445518806867.
Newland, M. C. (2019). An information theoretic approach to model selection: A tutorial with Monte Carlo confirmation. Perspectives on Behavior Science, 42(3), 583–616. https://doi.org/10.1007/s40614-019-00206-1.
Orwin, R. G. (1983). A fail-safe N for effect size in meta-analysis. Journal of Educational Statistics, 8(2), 157–159. https://doi.org/10.2307/1164923.
Peters, J. L., Sutton, A. J., Jones, D. R., Abrams, K. R., & Rushton, L. (2006). Comparison of two methods to detect publication bias in meta-analysis. JAMA, 295(6), 676–680. https://doi.org/10.1001/jama.295.6.676.
Pustejovsky, J. E. (2018). Using response ratios for meta-analyzing single-case designs with behavioral outcomes. Journal of School Psychology, 68, 99–112. https://doi.org/10.1016/j.jsp.2018.02.00.
Pustejovsky, J. E., & Tipton, E. (2018). Small-sample methods for cluster-robust variance estimation and hypothesis testing in fixed effects models. Journal of Business & Economic Statistics, 36(4), 672–683. https://doi.org/10.1080/07350015.2016.1247004.
R Core Team (2021). R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. https://www.R-project.org/
Rodgers, M. A., & Pustejovsky, J. E. (2020). Evaluating meta-analytic methods to detect selective reporting in the presence of dependent effect sizes. Psychological Methods, 26(2), 141–160. https://doi.org/10.1037/met0000300
Rosenberg, M. S. (2005). The file-drawer problem revisited: A general weighted method for calculating fail-safe numbers in meta-analysis. Evolution, 59(2), 464–468.
Rosenthal, R. (1979). The file drawer problem and tolerance for null results. Psychological Bulletin, 86(3), 638–641. https://doi.org/10.1037/0033-2909.86.3.638.
Rothstein, H. R., Sutton, A. J., & Borenstein, M. (2005). Publication bias in meta-analysis. In H. R. Rothstein, A. J. Sutton, & M. Borenstein (Eds.), Publication bias in meta-analysis: Prevention, assessment and adjustments (pp. 1–7). John Wiley & Sons. https://doi.org/10.1002/0470870168
Sham, E., & Smith, T. (2014). Publication bias in studies of an applied behavior-analytic intervention: An initial analysis. Journal of Applied Behavior Analysis, 47(3), 663–678. https://doi.org/10.1002/jaba.146.
Shin, M., Bryant, D. P., Powell, S. R., Jung, P., Ok, M. W., & Hou, F. (2020). A meta-analysis of single-case research on word-problem instruction for students with learning disabilities. Remedial & Special Education. Advance online publication. https://doi.org/10.1177/0741932520964918
Simonsohn, U. (2015). Small telescopes: Detectability and the evaluation of replication results. Psychological Science, 26(5), 559–569. https://doi.org/10.1177/0956797614567341.
Simonsohn, U., Nelson, L. D., & Simmons, J. P. (2014). P-curve: A key to the file-drawer. Journal of Experimental Psychology: General, 143(2), 534. https://doi.org/10.1037/a0033242.
Sterling, T. D. (1959). Publication decisions and their possible effects on inferences drawn from tests of significance—or vice versa. Journal of the American Statistical Association, 54(285), 30–34. https://doi.org/10.2307/2282137.
Sterne, J. A., & Egger, M. (2001). Funnel plots for detecting bias in meta-analysis: Guidelines on choice of axis. Journal of Clinical Epidemiology, 54(10), 1046–1055. https://doi.org/10.1016/s0895-4356(01)00377-8.
Sterne, J. A., Gavaghan, D., & Egger, M. (2000). Publication and related bias in meta-analysis: Power of statistical tests and prevalence in the literature. Journal of Clinical Epidemiology, 53(11), 1119–1129. https://doi.org/10.1016/s0895-4356(00)00242-0.
Tincani, M., & Travers, J. (2018). Publishing single-case research design studies that do not demonstrate experimental control. Remedial & Special Education, 39(2), 118–128. https://doi.org/10.1177/0741932517697447.
Tincani, M., & Travers, J. (2019). Replication research, publication bias, and applied behavior analysis. Perspectives on Behavior Science, 42(1), 59–75. https://doi.org/10.1007/s40614-019-00191-5.
Tipton, E., & Pustejovsky, J. E. (2015). Small-sample adjustments for tests of moderators and model fit using robust variance estimation in meta-regression. Journal of Educational & Behavioral Statistics, 40(6), 604–634. https://doi.org/10.3102/1076998615606099.
Viechtbauer, W. (2010). Conducting meta-analyses in R with the metafor package. Journal of Statistical Software, 36(3), 1–48. https://doi.org/10.18637/jss.v036.i03.
Weaver, E. S., & Lloyd, B. P. (2019). Randomization tests for single case designs with rapidly alternating conditions: An analysis of p-values from published experiments. Perspectives on Behavior Science, 42(3), 617–645. https://doi.org/10.1007/s40614-018-0165-6.
Ethics declarations
Research reported in this publication was supported by grant number 2026513 from the National Science Foundation and by the National Institute of Mental Health of the National Institutes of Health under Award Number R43MH121230. The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Conflict of Interest
We have no known conflict of interest to disclose.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Cite this article
Dowdy, A., Hantula, D.A., Travers, J.C. et al. Meta-Analytic Methods to Detect Publication Bias in Behavior Science Research. Perspect Behav Sci 45, 37–52 (2022). https://doi.org/10.1007/s40614-021-00303-0