Detecting Racial Bias in Jury Selection

Abstract

To support the 2019 U.S. Supreme Court case “Flowers v. Mississippi”, APM Reports collated historical court records to assess whether the State exhibited racial bias in striking potential jurors. That analysis used backward stepwise logistic regression to conclude that race was a significant factor; however, this method for selecting relevant features is only a heuristic, and it cannot account for interactions between features. We apply Optimal Feature Selection to identify the globally optimal subset of features and affirm that there is significant evidence of racial bias in the strike decisions. We also use Optimal Classification Trees to segment the juror population into subgroups with similar characteristics and similar probability of being struck, and find that three of these subgroups exhibit significant racial disparity in strike rate, pinpointing specific areas of bias in the dataset.
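As a rough illustration of the pipeline described above, the sketch below uses off-the-shelf stand-ins rather than the paper's methods: a plain logistic regression in place of backward stepwise or Optimal Feature Selection, a shallow CART-style tree from scikit-learn in place of Optimal Classification Trees, and Fisher's exact test for the within-subgroup strike-rate comparison. The file name and column names ("jurors.csv", "struck", "is_black", and the other features) are assumptions for illustration only, not the APM Reports data schema.

```python
# Illustrative sketch only: reproduces the *shape* of the analysis with
# standard libraries (statsmodels, scikit-learn, scipy) rather than the
# Optimal Feature Selection / Optimal Classification Trees methods used
# in the paper. All file and column names below are assumed.
import pandas as pd
import statsmodels.api as sm
from sklearn.tree import DecisionTreeClassifier
from scipy.stats import fisher_exact

# Juror-level data: one row per potential juror considered by the State.
df = pd.read_csv("jurors.csv")           # hypothetical file
y = df["struck"]                          # 1 if the State struck the juror
feature_cols = ["is_black", "knows_defendant", "prior_jury_service",
                "expressed_death_penalty_concerns"]   # assumed features
X = sm.add_constant(df[feature_cols].astype(float))

# Step 1: logistic regression including the race indicator.
# The paper selects features by globally optimal subset selection; here we
# simply fit the full model and inspect the race coefficient.
fit = sm.Logit(y, X).fit(disp=0)
print(fit.summary())
print("Race coefficient p-value:", fit.pvalues["is_black"])

# Step 2: segment jurors into subgroups with a shallow tree (a stand-in
# for Optimal Classification Trees), then test whether the strike rate
# differs by race within each leaf using Fisher's exact test.
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50, random_state=0)
tree.fit(df[feature_cols], y)
df["leaf"] = tree.apply(df[feature_cols])

for leaf, grp in df.groupby("leaf"):
    table = [
        [((grp["is_black"] == 1) & (grp["struck"] == 1)).sum(),
         ((grp["is_black"] == 1) & (grp["struck"] == 0)).sum()],
        [((grp["is_black"] == 0) & (grp["struck"] == 1)).sum(),
         ((grp["is_black"] == 0) & (grp["struck"] == 0)).sum()],
    ]
    odds, p = fisher_exact(table)
    print(f"leaf {leaf}: n={len(grp)}, odds ratio={odds:.2f}, p={p:.4f}")
```

With multiple subgroups tested, the per-leaf p-values would also need a multiple-comparison correction (e.g. Holm's procedure) before any leaf is flagged as exhibiting significant disparity.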



Data Availability

Data will be made available on reasonable request.


Author information

Corresponding author

Correspondence to Ying Daisy Zhuo.

Ethics declarations

Conflict of Interest

The authors declare no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This article is part of the Topical Collection on Analytics and Artificial Intelligence for Social Goods

Appendix

See Table 2.

Table 2 A list of the top features identified by Optimal Feature Selection and their definitions

About this article

Cite this article

Dunn, J., Zhuo, Y.D. Detecting Racial Bias in Jury Selection. Oper. Res. Forum 3, 38 (2022). https://doi.org/10.1007/s43069-022-00151-x

  • DOI: https://doi.org/10.1007/s43069-022-00151-x

Keywords

  • Interpretable machine learning
  • Bias detection
  • Decision trees
  • Sparse regression