
Is forensic science in crisis?

  • Original Research, published in Synthese

Abstract

The results of forensic science are believed to be reliable and are widely used in support of verdicts around the world. However, due to the lack of suitable empirical studies, we actually know very little about the reliability of such results. In this paper, I argue that phenomena analogous to the main culprits of the replication crisis in psychology (questionable research practices, publication bias, and funding bias) are also present in forensic science. Therefore, forensic results are significantly less reliable than is commonly believed. I conclude that in order to obtain reliable estimates of the reliability of forensic results, we need to conduct studies analogous to the large-scale replication projects in psychology. Additionally, I point to some ways of improving the reliability of forensic science, inspired by the reforms proposed in response to the replication crisis.


Notes

  1. This distinction was introduced by Ashbaugh (1999).

  2. This strategy of testing reliability is not common in meta-science. Scientific procedures are complicated and time-consuming, and it is therefore hard to set up a suitable testing situation. One study utilizing a similar approach is van Dongen et al. (2019).

  3. Objectivity is understood as follows in the report: “By objective feature-comparison methods, we mean methods consisting of procedures that are each defined with enough standardized and quantifiable detail that they can be performed by either an automated system or human examiners exercising little or no judgment.” (PCAST, 2016, p. 47).

  4. This problem was also discussed in Butler (2015, pp. 458–459).

  5. It is unknown how often such misconduct takes place. Moreover, given how forensic research is organized, it is not obvious whether it can ever be reliably established.

  6. Similarly, it is not clear to what extent similar relations between different biases are present in academic science. As far as I know, there is only anecdotal evidence in the literature. For example, Stapel (2012) claims that some of his results based on fabricated data were later replicated by independent researchers.

  7. It should be clear that I do not propose here to replicate the results of type two studies. Such an approach would suffer from difficulties described above.

References

  • Aczel, B., Szaszi, B., Sarafoglou, A., Kekecs, Z., Kucharský, S., Benjamin, D., et al. (2019). A consensus-based transparency checklist. Nature Human Behaviour, 4(1), 4–6.


  • Alexander, K. L. (2015). Prosecutors criticize D.C. crime lab’s handling of some DNA evidence. https://www.washingtonpost.com/local/crime/dc-prosecutors-criticize-city-crime-labs-handling-of-some-dna-cases/2015/03/05/b5244f88-bea4-11e4-b274-e5209a3bc9a9_story.html.

  • Andreoletti, M. (2021). Replicability crisis and scientific reforms: Overlooked issues and unmet challenges. International Studies in the Philosophy of Science, 33(3), 135–151.


  • Ashbaugh, D. R. (1999). Quantitative-qualitative friction ridge analysis: An introduction to basic and advanced ridgeology. Practical aspects of criminal and forensic investigations. Milton Park: Taylor & Francis.


  • Atkinson, K. (2016). Austin Scrambles with Fallout of closed DNA lab. https://www.texastribune.org/2016/07/30/more-questions-austin-police-department-lab/.

  • Baker, M. (2016). 1500 scientists lift the lid on reproducibility. Nature, 533, 452–454.


  • Bakker, M., van Dijk, A., & Wicherts, J. (2012). The rules of the game called psychological science. Perspectives on Psychological Science, 7, 543–554.


  • Bem, D. J. (2011). Feeling the future: Experimental evidence for anomalous retroactive influences on cognition and affect. Journal of Personality and Social Psychology, 100(3), 407–425.

  • Bennett, C. M., Baird, A., Miller, M., & Wolford, G. (2010). Neural correlates of interspecies perspective taking in the post-mortem Atlantic salmon: An argument for proper multiple comparisons correction.

  • Berthelot, J., Le Goff, B., & Maugars, Y. (2011). The Hawthorne effect: Stronger than the placebo effect? Joint, bone, spine: Revue du Rhumatisme, 78(4), 335–6.


  • Bishop, D. V. M. (1990). How to increase your chances of obtaining a significant association between handedness and disorder. Journal of Clinical and Experimental Neuropsychology, 12(5), 812–816.


  • Bishop, D. (2019). Rein in the four horsemen of irreproducibility. Nature, 568, 435.


  • Bishop, D. V. M. (2020). The psychology of experimental psychologists: Overcoming cognitive constraints to improve research: The 47th Sir Frederic Bartlett Lecture. PMID: 31724919. Quarterly Journal of Experimental Psychology, 73(1), 1–19.

  • Bush, M., Bush, P., & Sheets, H. (2011). Statistical evidence for the similarity of the human dentition. Journal of Forensic Sciences, 56.

  • Butler, J. (2009). Fundamentals of forensic DNA typing. Cambridge: Academic Press.


  • Butler, J. M. (2015). Advanced topics in forensic DNA typing: Interpretation. San Diego: Elsevier Academic Press.


  • Camerer, C., Dreber, A., Forsell, E., Ho, T.-H., Huber, J., Johannesson, M., et al. (2016). Evaluating replicability of laboratory experiments in economics. Science, 351, 1433–1436.


  • Camerer, C., Dreber, A., Holzmeister, F., Ho, T.-H., Huber, J., Johannesson, M., et al. (2018). Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2, 637–644.

  • Chin, J. (2014). Psychological science’s replicability crisis and what it means for science in the courtroom. Psychology, Public Policy and Law, 20, 225–238.


  • Chin, J. M., McFadden, R., & Edmond, G. (2020). Forensic science needs registered reports. Forensic Science International: Synergy, 2, 41–45.


  • Chin, J. M., Ribeiro, G., & Rairden, A. (2019). Open forensic science. Journal of Law and the Biosciences, 255–288.

  • Chinn, J. (2012). Fingerprint expert’s mistake leads to wrongful conviction in Indiana. Retrieved February 11, 2022, from https://californiainnocenceproject.org/2012/10/fingerprint-experts-mistake-leads-to-wrongful-conviction-inindiana/.

  • Cole, S. A. (2014). Individualization is dead, long live individualization! Reforms of reporting practices for fingerprint analysis in the United States. Law, Probability and Risk, 13(2), 117–150.


  • Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251).

  • Cooper, G., & Meterko, V. (2019). Cognitive bias research in forensic science: A systematic review. Forensic Science International, 297, 35–46.


  • Costakes, A. (2017). Department of justice to end national commission on forensic science. https://www.innocenceproject.org/department-justice-endsnational-commission-forensic-science/.

  • Dash, H., Shrivastava, P., & Das, S. (2020). Principles and practices of DNA analysis: A laboratory manual for forensic DNA typing.

  • Department of Justice (U.S.), Oversight and Review Division. (2011). A review of the FBI’s progress in responding to the recommendations in the Office of the Inspector General report on the fingerprint misidentification in the Brandon Mayfield case. U.S. Department of Justice, Office of the Inspector General, Oversight and Review Division.

  • Dror, I., & Hampikian, G. (2011). Subjectivity and bias in forensic DNA mixture interpretation. Science & Justice: Journal of the Forensic Science Society, 51(4), 204–8.


  • Dror, I., Morgan, R., Rando, C., & Nakhaeizadeh, S. (2017). The bias snowball and the bias cascade effects: Two distinct biases that may impact forensic decision making. Journal of Forensic Sciences, 62.

  • Ebersole, C. R., Atherton, O. E., Belanger, A. L., Skulborstad, H. M., Allen, J. M., Banks, J. B., et al. (2016). Many Labs 3: Evaluating participant pool quality across the academic semester via replication. Journal of Experimental Social Psychology, 67, 68–82.

  • Edmond, G., Tangen, J., Searston, R. A., & Dror, I. (2015). Contextual bias and cross-contamination in the forensic sciences: The corrosive implications for investigations, plea bargains, trials and appeals. Law, Probability and Risk, 14, 1–25.


  • Eldridge, H., De Donno, M., & Champod, C. (2020). Testing the accuracy and reliability of palmar friction ridge comparisons—A black box study. Forensic Science International, 110457.

  • Errington, T. M., Mathur, M., Soderberg, C. K., Denis, A., Perfito, N., Iorns, E. et al. (2021). Investigating the replicability of preclinical cancer biology. eLife, 10.

  • Fagert, M., & Morris, K. (2015). Quantifying the limits of fingerprint variability. Forensic Science International, 254, 87–99.


  • Federal Bureau of Investigation. (2015). FBI testimony on microscopic hair analysis contained errors in at least 90 percent of cases in ongoing review. https://www.fbi.gov/news/pressrel/press-releases/fbi-testimony-on-microscopic-hair-analysis-contained-errors-in-at-least-90-percent-of-cases-in-ongoing-review.

  • Garrett, B., & Neufeld, P. (2009). Invalid forensic science testimony and wrongful convictions. Virginia Law Review, 95.

  • Gelman, A., & Loken, E. (2019). The garden of forking paths: Why multiple comparisons can be a problem, even when there is no “fishing expedition” or “p-hacking” and the research hypothesis was posited ahead of time.

  • Giannelli, P. (2010). Scientific fraud. Criminal Law Bulletin.

  • Gigerenzer, G. (2004). Mindless statistics. The Journal of Socio-Economics, 33(5), 587–606.


  • Gill, P. (2014). Misleading DNA evidence: Reasons for miscarriages of justice. Elsevier.

  • Gould, J., Carrano, J., Leo, R., & Young, J. (2013). Predicting erroneous convictions: A social science approach to miscarriages of justice. Criminology eJournal.

  • Gutiérrez-Redomero, E., Alonso, M., Hernández-Hurtado, L., & Rodríguez-Villalba, J. (2010). Distribution of the minutiae in the fingerprints of a sample of the Spanish population. Forensic Science International, 208, 79–90.

  • Hahn, U., & Oaksford, M. (2005). How convinced should we be by negative evidence. In Proceedings of the 27th annual conference of the cognitive science society.

  • Heide, R., & Grünwald, P. (2017). Why optional stopping is a problem for Bayesians.

  • Himmelreich, C. (2009). Germany’s phantom serial killer: A DNA blunder. http://content.time.com/time/world/article/0,8599,1888126,00.html.

  • Houck, M., & Budowle, B. (2002). Correlation of microscopic and mitochondrial DNA hair comparisons. Journal of Forensic Sciences, 47(5), 964–7.


  • Hsu, A. S., Horng, A., Griffiths, T. L., & Chater, N. (2017). When absence of evidence is evidence of absence: Rational inferences from absent data. Cognitive Science, 41(S5), 1155–1167.


  • Iannelli, J. (2016). BSO crime lab could be mishandling crucial DNA evidence, whistleblower says. https://www.browardpalmbeach.com/news/bso-crime-labcould-be-mishandling-crucial-dna-evidence-whistleblower-says-7881208.

  • Innocence Project (IP) Website. (2020). Retrieved October 21, 2020 from https://www.innocenceproject.org/allcases/.

  • Ioannidis, J. P. A. (2005). Why most published research findings are false. PLOS Medicine, 2(8), 55–69.


  • Johnson, D., Cheung, F., & Donnellan, M. (2014). Does cleanliness influence moral judgments? A direct replication of Schnall, Benton, and Harvey (2008). Social Psychology, 45, 209.


  • Jones, C. (2010). A reason to doubt: The suppression of evidence and the inference of innocence. Journal of Criminal Law & Criminology, 100, 415–474.


  • Kasper, S. P. (2015). Latent print processing guide. Elsevier.

  • Kedron, P., Li, W., Fotheringham, A. S., & Goodchild, M. F. (2021). Reproducibility and replicability: Opportunities and challenges for geospatial research. International Journal of Geographical Information Science, 35, 427–445.

  • Kimpton, C., Oldroyd, N., Watson, S., Frazier, R. R., Johnson, P. E., Millican, E., et al. (1996). Validation of highly discriminating multiplex short tandem repeat amplification systems for individual identification. Electrophoresis, 17.

  • Klein, R. A., Vianello, M., Hasselman, F., Adams, B. G., Adams, R. B., Alper, S., et al. (2018). Many labs 2: Investigating variation in replicability across samples and settings. Advances in Methods and Practices in Psychological Science, 1(4), 443–490.

  • Kloosterman, A., Sjerps, M., & Quak, A. (2014). Error rates in forensic DNA analysis: Definition, numbers, impact and communication. Forensic Science International. Genetics, 12, 77–85.


  • Koehler, J. (2016a). Forensics or Fauxrensics? Ascertaining accuracy in the forensic sciences.

  • Koehler, J. (2016b). Intuitive error rate estimates for the forensic sciences.

  • Krane, D. E., Ford, S., Gilder, J. R., Inman, K., Jamieson, A., Koppl, R. G. et al. (2008). Sequential unmasking: A means of minimizing observer effects in forensic DNA interpretation. Journal of Forensic Sciences, 53.

  • Krimsky, S. (2006). Science in the private interest: Has the lure of profits corrupted biomedical research? IEEE Technology and Society Magazine, 25, 10–11.


  • Kruse, C. (2013). The Bayesian approach to forensic evidence: Evaluating, communicating, and distributing responsibility. Social Studies of Science, 43(5), 657–680.


  • Kücken, M., & Champod, C. (2013). Merkel cells and the individuality of friction ridge skin. Journal of Theoretical Biology, 317, 229–237.


  • Kukucka, J., & Kassin, S. (2014). Do confessions taint perceptions of handwriting evidence? An empirical test of the forensic confirmation bias. Law and human behavior, 38(3), 256–70.


  • Langenberg, G. (2009). A Performance study of the ACE-V process: A pilot study to measure the accuracy, precision, reproducibility, repeatability, and biasability of conclusions resulting from the ACE-V process. Journal of Forensic Identification, 59, 219–257.


  • Linden, A. H. (2019). Heterogeneity of research results: New perspectives on psychological science.

  • Ling, S., Kaplan, J., & Berryessa, C. M. (2021). The importance of forensic evidence for decisions on criminal guilt. Science & Justice, 61(2), 142–149.


  • Manna, N. (2020). A scientist in Fort Worth’s crime lab says rules were broken. Now a judge wants answers. https://www.star-telegram.com/news/local/fortworth/article245756430.html.

  • Merton, R. K. (1973). The sociology of science: Theoretical and empirical investigations. University of Chicago Press.

  • Miller, L. S. (1987). Procedural bias in forensic science examinations of human hair. Law and Human Behavior.

  • Moretti, T., Baumstark, A., Defenbaugh, D., Keys, K., & Smerick, J. (2001). Validation of short tandem repeats (STRs) for forensic usage: Performance testing of fluorescent multiplex STR systems and analysis of authentic and simulated forensic samples. Journal of Forensic Sciences, 46, 647–60.


  • Murrie, D., Boccaccini, M., Guarnera, L., & Rufino, K. (2013). Are forensic experts biased by the side that retained them? Psychological Science, 24.

  • Murrie, D., Gardner, B. O., Kelley, S., & Dror, I. (2019). Perceptions and estimates of error rates in forensic science: A survey of forensic analysts. Forensic Science International, 302, 109887.


  • Nakhaeizadeh, S., Dror, I., & Morgan, R. (2014). Cognitive bias in forensic anthropology: Visual assessment of skeletal remains is susceptible to confirmation bias. Science & Justice: Journal of the Forensic Science Society, 54(3), 208–14.


  • National Academies of Sciences, Engineering, and Medicine. (2018). Open science by design: Realizing a vision for 21st century research. The National Academies Press.

  • National Academies of Sciences, Engineering, and Medicine. (2019). Reproducibility and replicability in science. The National Academies Press.

  • National Commission on Forensic Science. (2016). Ensuring that forensic analysis is based upon task-relevant information.

  • National Institute of Justice (U.S.). (2011). The fingerprint sourcebook.

  • National Research Council. (2009). Strengthening forensic science in the United States: A path forward. 1–328.

  • Neumann, C., Evett, I., & Skerrett, J. E. (2012). Quantifying the weight of evidence from a forensic fingerprint comparison: A new paradigm. Journal of The Royal Statistical Society Series A-statistics in Society, 175, 371–415.


  • Neumann, C., Champod, C., Yoo, M., Genessay, T., & Langenburg, G. (2014). Improving the understanding and the reliability of the concept of “sufficiency” in friction ridge examination.

  • Nosek, B., & Lakens, D. (2014). Registered reports a method to increase the credibility of published results. Social Psychology, 45, 137.


  • Oaksford, M., & Hahn, U. (2004). A Bayesian approach to the argument from ignorance. Canadian Journal of Experimental Psychology, 58, 75–85.


  • Organization of Scientific Area Committees for Forensic Science. (2017). Guideline for the articulation of the decision-making process leading to an expert opinion of source identification in friction ridge examinations.

  • Osborne, N., Woods, S., Kieser, J., & Zajac, R. (2014). Does contextual information bias bitemark comparisons? Science & Justice, 54.

  • Pacheco, I., Cerchiai, B., & Stoiloff, S. (2014). Miami-Dade research study for the reliability of the ACE-V process: Accuracy & precision in latent fingerprint examinations.

  • Pashler, H., & Wagenmakers, E.-J. (2012). Editors’ introduction to the special section on replicability in psychological science: A crisis of confidence? Perspectives on Psychological Science, 7, 528–530.


  • Passalacqua, N. V., Pilloud, M. A., & Belcher, W. R. (2019). Scientific integrity in the forensic sciences: Consumerism, conflicts of interest, and transparency. Science & Justice, 59(5), 573–579.


  • Peels, R. (2019). Replicability and replication in the humanities. Research Integrity and Peer Review, 4(1), 2.


  • Perry, B., Neltner, M., & Allen, T. S. (2013). A paradox of bias: Racial differences in forensic psychiatric diagnosis and determinations of criminal responsibility. Race and Social Problems, 5, 239–249.


  • Possley, M. (2019). Richard Jackson. Retrieved February 11, 2022, from https://www.law.umich.edu/special/exoneration/Pages/casedetail.aspx?caseid=3318.

  • President’s Council of Advisors on Science and Technology. (2016). Forensic science in criminal courts: Ensuring scientific validity of feature-comparison methods.

  • Protzko, J., Krosnick, J., Nelson, L., Nosek, B., Axt, J., Berent, M., et al. (2020). High replicability of newly-discovered social-behavioral findings is achievable.

  • Rawson, R. D., Ommen, R. K., Kinard, G., Johnson, J., & Yfantis, A. (1984). Statistical evidence for the individuality of the human dentition. Journal of Forensic Sciences, 29(1), 245–53.

  • Reich, D. (2018). Who we are and how we got here: Ancient DNA and the new science of the human past. Oxford University Press.

  • Robertson, C. T., & Kesselheim, A. (2016). Blinding as a solution to bias: Strengthening biomedical science, forensic science, and law.

  • Romero, F. (2018). Who should do replication labor? Advances in Methods and Practices in Psychological Science, 1(4), 516–537.


  • Romero, F., & Sprenger, J. (2019). Scientific self-correction: The Bayesian way.

  • Scargle, J. (2000). Publication bias: The “File-Drawer” problem in scientific inference. Journal of Scientific Exploration, 14, 91–106.

  • Schauer, J. M., & Hedges, L. V. (2020). Assessing heterogeneity and power in replications of psychological experiments. Psychological Bulletin.

  • Scientific Working Group on DNA Analysis Methods. (2017). Interpretation guidelines for autosomal STR typing by forensic DNA testing laboratories.

  • Scientific Working Group on Friction Ridge Analysis. (2002). Friction ridge examination methodology for latent print examiners.

  • Serra-Garcia, M., & Gneezy, U. (2021). Nonreplicable publications are cited more than replicable ones. Science Advances, 7(21), eabd1705.

  • Shaer, M. (2015). The false promise of DNA testing. https://www.theatlantic.com/magazine/archive/2016/06/a-reasonable-doubt/480747/.

  • Shaw, M., Cloos, L., Luong, R., Elbaz, S., & Flake, J. (2020). Measurement practices in large-scale replications: Insights from Many Labs 2. Canadian Psychology/Psychologie canadienne, 61.

  • Shea, B., Niezgoda, S., & Chakraborty, R. (2001). CODIS STR loci data from 41 sample populations. Journal of Forensic Sciences, 46, 453–89.


  • Sheets, H., Bush, P., & Bush, M. (2012). Bitemarks: Distortion and covariation of the maxillary and mandibular dentition as impressed in human skin. Forensic Science International, 223(1–3), 202–7.


  • Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22(11), 1359–1366.


  • Smalarz, L., Madon, S., Yang, Y., Guyll, M., & Buck, S. E. (2016). The perfect match: Do criminal stereotypes bias forensic evidence analysis? Law and Human Behavior, 40(4), 420–9.

  • Smaldino, P. E., & McElreath, R. (2016). The natural selection of bad science. Royal Society Open Science, 3(9), 160384.


  • Smit, N., Morgan, R., & Lagnado, D. (2018). A systematic analysis of misleading evidence in unsafe rulings in England and Wales. Science & Justice: Journal of the Forensic Science Society, 58(2), 128–137.


  • Stanley, T. D., Carter, E. C., & Doucouliagos, H. (2018). What meta-analyses reveal about the replicability of psychological research. Psychological Bulletin, 144, 1325–1346.


  • Stapel, D. (2012). Ontsporing. Prometheus Amsterdam.

  • Sui, D. Z., & Kedron, P. (2020). Reproducibility and replicability in the context of the contested identities of geography. Annals of the American Association of Geographers, 111, 1275–1283.


  • Swazey, J., Anderson, M., & Lewis, K. (1993). Ethical problems in academic research. American Scientist, 81, 542–553.


  • Tangen, J., Thompson, M. B., & McCarthy, D. J. (2011). Identifying fingerprint expertise. Psychological Science, 22, 995–997.


  • Taroni, F., Bozza, S., Hicks, T., & Garbolino, P. (2019). More on the question ‘When does absence of evidence constitute evidence of absence?’ How Bayesian confirmation theory can logically support the answer. Forensic Science International, 301, e59–e63.

  • The National Registry of Exonerations (NRE) website. (2020). Retrieved October 21 2020, from https://www.law.umich.edu/special/exoneration/Pages/about.aspx.

  • Thompson, W. (2005). Subjective interpretation, laboratory error and the value of forensic DNA evidence: Three case studies. Genetica, 96, 153–168.


  • Thompson, W. (2009). Painting the target around the matching profile: The Texas sharpshooter fallacy in forensic DNA interpretation. Law, Probability and Risk, 8, 257–276.


  • Thompson, W. C., & Scurich, N. (2018). When does absence of evidence constitute evidence of absence? Forensic Science International, 291, 291.


  • Thompson, W. C., Taroni, F., & Aitken, C. G. G. (2003). How the probability of a false positive affects the value of DNA evidence. Journal of Forensic Sciences, 48(1), 47–54.


  • Thornton, S. (2018). Karl Popper. https://plato.stanford.edu/entries/popper/.

  • Todd, D. M., Beatty, L. G., & Zeng, Z. (2021). Correctional populations in the United States, 2019—Statistical tables. https://bjs.ojp.gov/library/publications/correctional-populations-united-states-2019-statistical-tables.

  • Ulery, B., Hicklin, R., Buscaglia, J., & Roberts, M. (2011). Accuracy and reliability of forensic latent fingerprint decisions. Proceedings of the National Academy of Sciences of the United States of America, 108, 7733–8.

  • van Dongen, N., Van Doorn, J., Gronau, Q., van Ravenzwaaij, D., Hoekstra, R., Haucke, M., et al. (2019). Multiple perspectives on inference for two simple statistical scenarios. The American Statistician, 73, 328–339.

  • Vandenbroucke, J. (1988). Passive smoking and lung cancer: A publication bias? British Medical Journal (Clinical research ed.), 296, 391–392.


  • Vazire, S. (2016). Editorial. Social Psychological and Personality Science, 7(1), 3–7.

  • Walsh, K., et al. (2017). Estimating the prevalence of wrongful convictions. Retrieved October 21, 2020, from https://www.ncjrs.gov/pdffiles1/nij/grants/251115.pdf.

  • Whittaker, D. (1975). Some laboratory studies on the accuracy of bite mark comparison. International Dental Journal, 25, 166–71.


  • Wicherts, J., Veldkamp, C. L. S., Augusteijn, H., Bakker, M., van Aert, R. C. M., & van Assen, M. V. (2016). Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-Hacking. Frontiers in Psychology, 7.

  • Wilholt, T. (2008). Bias and values in scientific research. Studies in History and Philosophy of Science Part A, 40(1), 92–101.


  • Witte, E., & Zenker, F. (2017). From discovery to justification: Outline of an ideal research program in empirical psychology. Frontiers in Psychology, 8.

  • Yarkoni, T. (2020). The generalizability crisis. Behavioral and Brain Sciences, 1–37.


Acknowledgements

I would like to thank Rafał Urbaniak, Mattia Andreoletti, Jan Sprenger, Gustavo Cevolani, Davide Coraci, Weronika Majek, Patryk Dziurosz-Serafinowicz, Pavel Janda, Paweł Pawłowski, Robert Różański, and the anonymous referees for their useful comments.

Funding

This research has been funded by the Polish National Science Centre [grant number 2016/22/E/HS1/00304].

Author information

Correspondence to Michał Sikorski.

Ethics declarations

Conflict of interest

There are no conflicts of interest.



Cite this article

Sikorski, M. Is forensic science in crisis? Synthese 200, 188 (2022). https://doi.org/10.1007/s11229-022-03685-z
