
A Proposed Gradient Tree Boosting with Different Loss Function in Crime Forecasting and Analysis

  • Alif Ridzuan Khairuddin (Email author)
  • Razana Alwee
  • Habibollah Haron
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1073)

Abstract

Gradient tree boosting (GTB) is an emerging artificial intelligence technique in crime forecasting. GTB is a stage-wise additive framework that adopts numerical optimisation methods to minimise the loss function of the predictive model, which in turn enhances its predictive capability. The applied loss function plays a critical role in determining GTB's predictive capability and performance. By default, GTB uses the least squares function as its loss function. Motivated by this limitation, this study observes and identifies a potential replacement for the current loss function in GTB by applying different existing standard mathematical functions. Crime models are developed based on GTB with each candidate loss function, and their forecasting performance is compared. From this case study, it is found that, among the tested loss functions, the least absolute deviation function outperforms the others, including GTB's standard least squares loss function, in all developed crime models.
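The loss-function comparison described above can be reproduced in outline with an off-the-shelf GTB implementation. The sketch below is not the authors' experimental setup: it uses scikit-learn's GradientBoostingRegressor on a synthetic stand-in dataset and simply cross-validates the built-in squared-error, least-absolute-deviation, Huber, and quantile losses (the exact parameter names depend on the scikit-learn version; older releases use 'ls' and 'lad').

```python
# Minimal sketch: comparing GTB loss functions on a regression task.
# The dataset here is a synthetic placeholder; any numeric target
# (e.g. a yearly crime rate with economic predictors) fits the same pattern.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a multivariate crime dataset.
X, y = make_regression(n_samples=300, n_features=8, noise=10.0, random_state=0)

# Candidate loss functions; older scikit-learn releases name these
# 'ls', 'lad', 'huber', 'quantile' instead.
losses = ["squared_error", "absolute_error", "huber", "quantile"]

for loss in losses:
    model = GradientBoostingRegressor(
        loss=loss,
        n_estimators=300,
        learning_rate=0.05,
        max_depth=3,
        random_state=0,
    )
    # 5-fold cross-validated mean absolute error (lower is better).
    mae = -cross_val_score(
        model, X, y, cv=5, scoring="neg_mean_absolute_error"
    ).mean()
    print(f"{loss:>15s}: MAE = {mae:.3f}")
```

On the crime-rate data used in the paper, the authors report that the least absolute deviation loss performed best across all developed crime models; the script above only illustrates how such a comparison is set up.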

Keywords

Loss function · Gradient tree boosting · Artificial intelligence · Crime forecasting and multivariate crime analysis

Notes

Acknowledgement

This work was supported by an FRGS research grant (grant number R.J130000.7828.4F825) awarded by the Malaysian Ministry of Education to the School of Computing, Universiti Teknologi Malaysia (UTM).


Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  • Alif Ridzuan Khairuddin (1) (Email author)
  • Razana Alwee (1)
  • Habibollah Haron (1)

  1. Applied Industrial Analytics Research Group (ALIAS), School of Computing, Faculty of Engineering, Universiti Teknologi Malaysia, Johor Bahru, Malaysia
