Journal of Experimental Criminology, Volume 7, Issue 2, pp 183–198

Research design influence on study outcomes in crime and justice: a partial replication with public area surveillance

  • Brandon C. Welsh
  • Meghan E. Peel
  • David P. Farrington
  • Henk Elffers
  • Anthony A. Braga

Abstract

Does the quality of research design have an influence on study outcomes in crime and justice? This was the subject of an important study by Weisburd et al. (2001). They found a moderate and significant inverse relationship between research design and study outcomes: weaker designs, as indicated by internal validity, produced stronger effect sizes. Using a database of evaluations (n = 136) from systematic reviews that investigated the effects of public area surveillance on crime, this paper carried out a partial replication of Weisburd et al.’s study. We view it as a partial replication because it included only area- or place-based studies (i.e., there were no individual-level studies) and these studies used designs at the lower end of the evaluation hierarchy (i.e., not one of the studies used a randomized experimental design). In the present study, we report findings that are highly concordant with the earlier study. The overall correlation between research design and study outcomes is moderate but negative and significant (Tau-b = –.175, p = .029). This suggests that stronger research designs are less likely to report desirable effects or, conversely, weaker research designs may be biased upward. We explore possible explanations for this finding. Implications for policy and research are discussed.
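The headline statistic in the abstract is a Kendall's Tau-b rank correlation between an ordinal measure of design quality and an ordinal coding of study outcome. The sketch below shows how such a correlation can be computed with scipy.stats.kendalltau; the scores are purely illustrative placeholders (not the authors' data), assuming design quality is coded on something like the Maryland Scientific Methods Scale (1 = weakest to 5 = strongest) and outcomes on a simple ordinal scale of desirability.

```python
# Minimal sketch (illustrative data only, not the study's database):
# Kendall's Tau-b between design quality and coded study outcome.
from scipy.stats import kendalltau

# Hypothetical design-quality scores (e.g., Maryland Scientific Methods
# Scale, 1 = weakest design to 5 = strongest design) for a few studies.
design_quality = [2, 2, 3, 3, 3, 4, 4, 5]

# Hypothetical outcome codes for the same studies
# (1 = undesirable/null effect, 2 = mixed, 3 = desirable effect).
study_outcome = [3, 3, 3, 2, 3, 2, 1, 1]

# kendalltau computes Tau-b by default, which adjusts for ties in
# both variables; it returns the correlation and a two-sided p-value.
tau_b, p_value = kendalltau(design_quality, study_outcome)
print(f"Tau-b = {tau_b:.3f}, p = {p_value:.3f}")
```

A negative Tau-b, as in the reported result (Tau-b = –.175, p = .029), indicates that higher design-quality scores tend to co-occur with less desirable reported outcomes.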

Keywords

Evaluation design, Evidence-based crime policy, Public area surveillance, Systematic review

Notes

Acknowledgments

We are grateful to Chet Britt, Dan Mears, Chris Sullivan, the journal editor, and the anonymous reviewers for helpful comments.

References

  1. Braga, A. A. (2005). Hot spots policing and crime prevention: a systematic review of randomized controlled trials. Journal of Experimental Criminology, 1, 317–342.
  2. Braga, A. A. (2010). Setting a higher standard for the evaluation of problem-oriented policing initiatives. Criminology and Public Policy, 9, 173–182.
  3. Braga, A. A., & Hinkle, M. (2010). The participation of academics in the criminal justice working group process. In J. M. Klofas, N. K. Hipple, & E. F. McGarrell (Eds.), The new criminal justice: American communities and the changing world of crime control (pp. 114–120). New York: Routledge.
  4. Cook, T. D., & Campbell, D. T. (1979). Quasi-experimentation: Design and analysis issues for field settings. Chicago: Rand McNally.
  5. Cook, T. D., Shadish, W. R., & Wong, V. C. (2008). Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons. Journal of Policy Analysis and Management, 27, 724–750.
  6. Cornish, D. B., & Clarke, R. V. (2003). Opportunities, precipitators and criminal decisions: A reply to Wortley’s critique of situational crime prevention. In M. J. Smith & D. B. Cornish (Eds.), Theory for practice in situational crime prevention. Crime prevention studies, vol. 16 (pp. 41–96). Monsey: Criminal Justice Press.
  7. Eck, J. E. (1995). A general model of the geography of illicit retail marketplaces. In J. E. Eck & D. Weisburd (Eds.), Crime and place. Crime prevention studies, vol. 4 (pp. 67–94). Monsey: Criminal Justice Press.
  8. Eck, J. E. (2006). Preventing crime at places. In L. W. Sherman, D. P. Farrington, B. C. Welsh, & D. L. MacKenzie (Eds.), Evidence-based crime prevention, rev. ed. (pp. 241–294). New York: Routledge.
  9. Ekblom, P., & Pease, K. (1995). Evaluating crime prevention. In M. Tonry & D. P. Farrington (Eds.), Building a safer society: Strategic approaches to crime prevention. Crime and justice: A review of research, vol. 19 (pp. 585–662). Chicago: University of Chicago Press.
  10. Farrington, D. P. (2003). Methodological quality standards for evaluation research. The Annals of the American Academy of Political and Social Science, 587, 49–68.
  11. Farrington, D. P., & Welsh, B. C. (Eds.) (2001). What works in preventing crime? Systematic reviews of experimental and quasi-experimental research. Annals of the American Academy of Political and Social Science, 578 [full issue].
  12. Farrington, D. P., & Welsh, B. C. (2005). Randomized experiments in criminology: what have we learned in the last two decades? Journal of Experimental Criminology, 1, 9–38.
  13. Farrington, D. P., & Welsh, B. C. (2006). A half-century of randomized experiments on crime and justice. In M. Tonry (Ed.), Crime and justice: A review of research, vol. 34 (pp. 55–132). Chicago: University of Chicago Press.
  14. Farrington, D. P., & Welsh, B. C. (2007). Improved street lighting and crime prevention: A systematic review. Stockholm: National Council for Crime Prevention.
  15. Farrington, D. P., Gottfredson, D. C., Sherman, L. W., & Welsh, B. C. (2006). The Maryland scientific methods scale. In L. W. Sherman, D. P. Farrington, B. C. Welsh, & D. L. MacKenzie (Eds.), Evidence-based crime prevention, rev. ed. (pp. 13–21). New York: Routledge.
  16. Henry, G. T. (2009). Estimating and extrapolating causal effects for crime prevention policy and program evaluation. In J. Knutsson & N. Tilley (Eds.), Evaluating crime reduction. Crime prevention studies, vol. 24 (pp. 147–173). Monsey: Criminal Justice Press.
  17. Klofas, J. M., Hipple, N. K., & McGarrell, E. F. (2010). The new criminal justice. In J. M. Klofas, N. K. Hipple, & E. F. McGarrell (Eds.), The new criminal justice: American communities and the changing world of crime control (pp. 3–17). New York: Routledge.
  18. Lipsey, M. W. (2003). Those confounded moderators in meta-analysis: Good, bad, and ugly. The Annals of the American Academy of Political and Social Science, 587, 69–81.
  19. Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis. Thousand Oaks: Sage.
  20. Mears, D. P. (2007). Towards rational and evidence-based crime policy. Journal of Criminal Justice, 35, 667–682.
  21. Mears, D. P. (2010). American criminal justice policy: An evaluation approach to increasing accountability and effectiveness. New York: Cambridge University Press.
  22. National Research Council. (2008). Parole, desistance from crime, and community integration. Washington, DC: National Academies Press.
  23. Newman, O. (1972). Defensible space: Crime prevention through urban design. New York: Macmillan.
  24. Olds, D. L. (2009). In support of disciplined passion. Journal of Experimental Criminology, 5, 201–214.
  25. Painter, K. A., & Farrington, D. P. (1997). The crime reducing effect of improved street lighting: The Dudley project. In R. V. Clarke (Ed.), Situational crime prevention: Successful case studies (2nd ed., pp. 209–226). Guilderland: Harrow and Heston.
  26. Painter, K. A., & Farrington, D. P. (1999). Street lighting and crime: Diffusion of benefits in the Stoke-on-Trent project. In K. A. Painter & N. Tilley (Eds.), Surveillance of public space: CCTV, street lighting and crime prevention. Crime prevention studies, vol. 10 (pp. 77–122). Monsey: Criminal Justice Press.
  27. Petersilia, J. (2008). Influencing public policy: An embedded criminologist reflects on California prison reform. Journal of Experimental Criminology, 4, 335–356.
  28. Petrosino, A., & Soydan, H. (2005). The impact of program developers as evaluators on criminal recidivism: Results from meta-analyses of experimental and quasi-experimental research. Journal of Experimental Criminology, 1, 435–450.
  29. Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston: Houghton Mifflin.
  30. Sherman, L. W. (Ed.) (2003). Misleading evidence and evidence-led policy: Making social science more experimental. Annals of the American Academy of Political and Social Science, 589 [full issue].
  31. Sherman, L. W., Gottfredson, D. C., MacKenzie, D. L., Eck, J. E., Reuter, P., & Bushway, S. D. (1997). Preventing crime: What works, what doesn’t, what’s promising. Washington, DC: National Institute of Justice, U.S. Department of Justice.
  32. Sherman, L. W., Strang, H., Angel, C., Woods, D., Barnes, G. C., Bennett, S., et al. (2005). Effects of face-to-face restorative justice on victims of crime in four randomized, controlled trials. Journal of Experimental Criminology, 1, 367–395.
  33. Weisburd, D. (1994). Evaluating community policing: Role tensions between practitioners and evaluators. In D. P. Rosenbaum (Ed.), The challenge of community policing: Testing the promises (pp. 274–277). Thousand Oaks: Sage.
  34. Weisburd, D. (2005). Hot spots policing experiments and criminal justice research: Lessons from the field. The Annals of the American Academy of Political and Social Science, 599, 220–245.
  35. Weisburd, D., Lum, C. M., & Petrosino, A. (2001). Does research design affect study outcomes in criminal justice? The Annals of the American Academy of Political and Social Science, 578, 50–70.
  36. Weisburd, D., Lum, C. M., & Petrosino, A. (Eds.) (2003). Assessing systematic evidence in crime and justice: Methodological concerns and empirical outcomes. Annals of the American Academy of Political and Social Science, 587 [full issue].
  37. Welsh, B. C., & Farrington, D. P. (2009a). Making public places safer: Surveillance and crime prevention. New York: Oxford University Press.
  38. Welsh, B. C., & Farrington, D. P. (2009b). Public area CCTV and crime prevention: An updated systematic review and meta-analysis. Justice Quarterly, 26, 716–745.
  39. Welsh, B. C., Mudge, M. E., & Farrington, D. P. (2010). Reconceptualizing public area surveillance and crime prevention: Security guards, place managers, and defensible space. Security Journal, 23, 299–319.

Copyright information

© Springer Science+Business Media B.V. 2010

Authors and Affiliations

  • Brandon C. Welsh (1)
  • Meghan E. Peel (1)
  • David P. Farrington (2)
  • Henk Elffers (3)
  • Anthony A. Braga (4, 5)

  1. School of Criminology and Criminal Justice, Northeastern University, Boston, USA
  2. Cambridge University, Cambridge, UK
  3. Netherlands Institute for the Study of Crime and Law Enforcement, Amsterdam, The Netherlands
  4. Rutgers University, Newark, USA
  5. Harvard University, Cambridge, USA