Tree-Based Forecasting Methods
The past two chapters provided the technical background needed to consider statistical procedures that can be especially effective in criminal justice forecasting. The joint probability distribution model, data partitioning, and asymmetric costs should now be familiar. Together, these features make tree-based methods of recursive partitioning the fundamental building blocks for the machine learning procedures discussed here. The main focus is random forests. Stochastic gradient boosting and Bayesian trees are discussed briefly as worthy competitors to random forests. Although neural nets and deep learning are not tree-based, they are also considered: current claims of remarkable performance need to be addressed dispassionately, especially in comparison with tree-based methods.
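The core idea linking recursive partitioning to asymmetric costs can be sketched in a few lines: a split is scored not by raw error but by misclassification cost, with false negatives weighted more heavily than false positives. The sketch below is a minimal, single-predictor, single-split illustration; the data and the 5-to-1 cost ratio are hypothetical, and real tree procedures search many predictors and recurse on the resulting nodes.

```python
# Minimal sketch of a cost-sensitive split search, the building block of
# recursive partitioning. Data and the 5:1 cost ratio are hypothetical.

def split_cost(y_left, y_right, cost_fn=5.0, cost_fp=1.0):
    """Cost of a split: each node predicts the class minimizing total cost."""
    total = 0.0
    for node in (y_left, y_right):
        if not node:
            continue
        pos = sum(node)            # e.g., reoffenders (label 1)
        neg = len(node) - pos      # non-reoffenders (label 0)
        # Predicting 0 costs cost_fn per positive missed;
        # predicting 1 costs cost_fp per negative flagged.
        total += min(cost_fn * pos, cost_fp * neg)
    return total

def best_split(x, y, cost_fn=5.0, cost_fp=1.0):
    """Try every threshold on one predictor; keep the lowest-cost split."""
    best = None
    for t in sorted(set(x)):
        left = [yi for xi, yi in zip(x, y) if xi <= t]
        right = [yi for xi, yi in zip(x, y) if xi > t]
        c = split_cost(left, right, cost_fn, cost_fp)
        if best is None or c < best[1]:
            best = (t, c)
    return best

# Hypothetical data: number of prior arrests (x), reoffense indicator (y).
x = [0, 1, 2, 3, 4, 5, 6, 7]
y = [0, 0, 0, 1, 0, 1, 1, 1]
print(best_split(x, y))  # prints the chosen threshold and its cost: (2, 1.0)
```

Making false negatives five times as costly pushes the tree toward splits that isolate likely reoffenders even at the price of more false alarms, which is exactly how asymmetric costs enter the tree-based procedures of this chapter.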