
CrimeForecaster: Crime Prediction by Exploiting the Geographical Neighborhoods’ Spatiotemporal Dependencies

Part of the Lecture Notes in Computer Science book series (LNAI,volume 12461)

Abstract

Crime prediction in urban areas can improve the allocation of resources (e.g., police patrols) towards a safer society. Recently, researchers have been using deep learning frameworks for urban crime forecasting with better accuracies as compared to previous work. However, these studies typically partition a metropolitan area into synthetic regions, e.g., grids, which neglects the geographical semantics of a region and fails to capture the spatial correlation across regions such as precincts, neighborhoods, blocks, and postal divisions. In this paper, we design and implement an end-to-end spatiotemporal deep learning framework, dubbed CrimeForecaster, which captures both the temporal recurrence and the spatial dependency simultaneously within and across regions. We model temporal dependencies with a gated recurrent network, while diffusion convolution modules capture the cross-region dependencies at the same time. Empirical experiments on two real-world datasets showcase the effectiveness of CrimeForecaster, which outperforms the current state-of-the-art algorithm by up to 21%. We also collect and publish a ten-year crime dataset in Los Angeles for future use by the research community.
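To make the abstract's architectural claim concrete, the following is a minimal sketch (not the authors' implementation) of a single diffusion-convolution step in the style of DCRNN [22]: region features are propagated over the neighborhood adjacency graph via powers of the random-walk transition matrix \(D^{-1}W\). All names and shapes here are illustrative assumptions.

```python
import numpy as np

def diffusion_conv(X, W, thetas):
    """One truncated diffusion-convolution step.

    X:      (N, F) feature matrix, one row per region.
    W:      (N, N) adjacency matrix over regions.
    thetas: filter weights, one per diffusion step k = 0, 1, ...
    """
    # Random-walk transition matrix P = D^{-1} W (row-normalized adjacency).
    P = W / np.maximum(W.sum(axis=1, keepdims=True), 1e-12)
    out = np.zeros_like(X, dtype=float)
    Pk_X = X.astype(float)          # P^0 X = X
    for theta in thetas:            # out = sum_k theta_k * P^k X
        out += theta * Pk_X
        Pk_X = P @ Pk_X
    return out
```

In the full model, a step like this replaces the dense matrix multiplications inside the gated recurrent unit, so each hidden-state update mixes information from adjacent regions.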

Keywords

  • Social good
  • Crime prediction
  • Spatiotemporal learning

Z. Lin—The work was conducted and completed while the author was an intern at USC.


Notes

  1. https://github.com/sunjiao123sun/CrimeForecaster.

  2. https://drive.google.com/open?id=1nbqr0gdp_bO2QaIRN6qPx-7Tvd5pKa0P.

  3. https://drive.google.com/open?id=1nbqr0gdp_bO2QaIRN6qPx-7Tvd5pKa0P.

References

  1. Crime data of 2015 in The City of Chicago. https://data.cityofchicago.org/Public-Safety/Crimes-2015/vwwp-7yr9

  2. Boundary information of Chicago (2020). https://data.cityofchicago.org/widgets/bbvz-uum9

  3. Census Bureau (2020). https://www.census.gov/

  4. Crosstown Los Angeles (2020). https://xtown.la/

  5. Abadi, M., et al.: TensorFlow: a system for large-scale machine learning. In: 12th USENIX Symposium on Operating Systems Design and Implementation (OSDI 16), pp. 265–283 (2016)

  6. Chang, C.C., Lin, C.J.: LibSVM: a library for support vector machines. ACM Trans. Intell. Syst. Technol. (TIST) 2(3), 1–27 (2011)


  7. Chung, J., Gulcehre, C., Cho, K., Bengio, Y.: Empirical evaluation of gated recurrent neural networks on sequence modeling. arXiv preprint arXiv:1412.3555 (2014)

  8. Contreras, J., Espinola, R., Nogales, F.J., Conejo, A.J.: Arima models to predict next-day electricity prices. IEEE Trans. Power Syst. 18(3), 1014–1020 (2003)


  9. Covington, P., Adams, J., Sargin, E.: Deep neural networks for YouTube recommendations. In: Proceedings of the 10th ACM Conference on Recommender Systems, pp. 191–198 (2016)


  10. Featherstone, C.: Identifying vehicle descriptions in microblogging text with the aim of reducing or predicting crime. In: 2013 International Conference on Adaptive Science and Technology, pp. 1–8. IEEE (2013)


  11. Gers, F.A., Schmidhuber, J., Cummins, F.: Learning to forget: Continual prediction with LSTM (1999)


  12. Gorman, D.M., Speer, P.W., Gruenewald, P.J., Labouvie, E.W.: Spatial dynamics of alcohol availability, neighborhood structure and violent crime. J. Stud. Alcohol 62(5), 628–636 (2001)


  13. Grover, A., Leskovec, J.: node2vec: scalable feature learning for networks. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 855–864 (2016)


  14. Hamilton, W., Ying, Z., Leskovec, J.: Inductive representation learning on large graphs. In: Advances in Neural Information Processing Systems, pp. 1024–1034 (2017)


  15. He, J., Qi, J., Ramamohanarao, K.: A joint context-aware embedding for trip recommendations. In: 2019 IEEE 35th International Conference on Data Engineering (ICDE), pp. 292–303. IEEE (2019)


  16. Hosmer Jr., D.W., Lemeshow, S., Sturdivant, R.X.: Applied Logistic Regression, vol. 398. Wiley, Hoboken (2013)


  17. Huang, C., Zhang, C., Zhao, J., Wu, X., Chawla, N.V., Yin, D.: Mist: a multiview and multimodal spatial-temporal learning framework for citywide abnormal event forecasting. In: WWW 2019 (2019)


  18. Huang, C., Zhang, C., Zhao, J., Wu, X., Yin, D., Chawla, N.: Mist: a multiview and multimodal spatial-temporal learning framework for citywide abnormal event forecasting. In: The World Wide Web Conference, pp. 717–728 (2019)


  19. Huang, C., Zhang, J., Zheng, Y., Chawla, N.V.: Deepcrime: attentive hierarchical recurrent networks for crime prediction. In: Proceedings of the 27th ACM International Conference on Information and Knowledge Management, pp. 1423–1432, CIKM 2018, Association for Computing Machinery, New York, NY, USA (2018). https://doi.org/10.1145/3269206.3271793

  20. Kingma, D.P., Ba, J.: Adam: a method for stochastic optimization. arXiv preprint arXiv:1412.6980 (2014)

  21. Kipf, T.N., Welling, M.: Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907 (2016)

  22. Li, Y., Yu, R., Shahabi, C., Liu, Y.: Diffusion convolutional recurrent neural network: data-driven traffic forecasting. arXiv preprint arXiv:1707.01926 (2017)

  23. Li, Y., Yu, R., Shahabi, C., Liu, Y.: Diffusion convolutional recurrent neural network: Data-driven traffic forecasting. In: International Conference on Learning Representations (ICLR 2018) (2018)


  24. Morenoff, J.D., Sampson, R.J.: Violent crime and the spatial dynamics of neighborhood transition: Chicago, 1970–1990. Social Forces 76(1), 31–64 (1997)


  25. Tang, J., Qu, M., Wang, M., Zhang, M., Yan, J., Mei, Q.: Line: large-scale information network embedding. In: Proceedings of the 24th International Conference on World Wide Web, pp. 1067–1077 (2015)


  26. Vaswani, A., et al.: Attention is all you need. In: Advances in Neural Information Processing Systems, pp. 5998–6008 (2017)


  27. Wang, H., Kifer, D., Graif, C., Li, Z.: Crime rate inference with big data. In: Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 635–644 (2016)


  28. Wang, J., Cong, G., Zhao, W.X., Li, X.: Mining user intents in Twitter: a semi-supervised approach to inferring intent categories for tweets. In: AAAI (2015)


  29. Zhao, L., et al.: T-GCN: a temporal graph convolutional network for traffic prediction. IEEE Trans. Intell. Transp. Syst. (2019)



Acknowledgement

This work has been supported in part by Annenberg Leadership Initiative, the USC Integrated Media Systems Center, and unrestricted cash gifts from Microsoft and Google. The opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the sponsors.

Author information


Corresponding author

Correspondence to Jiao Sun .


Appendix A Impact of the Adjacent Neighborhoods’ Information


Fig. 6. Forecasting results for all crimes on three different neighborhoods in Los Angeles and Chicago. (Los Angeles: SO – Sherman Oaks; EP – Echo Park; SC – Studio City. Chicago: AG – Auburn Gresham; FP – Fuller Park; RD – Riverdale)

We observed in Sect. 1 that spatial correlation can help with crime prediction. Here we examine experimentally whether neighborhood information helps to predict crime for an individual target neighborhood. Formally, for each neighborhood \(r_i\), we would like to know whether its neighborhoods’ records help to predict the crime event in the next time slot given the previous records. Therefore, we train the same classification models under two different feature settings: (a) using the previous crime occurrences of \(r_i\) itself as features, and (b) using the previous crime occurrences of both \(r_i\) itself and the aggregates of \(r_i\)’s neighborhoods.

$$\begin{aligned} f\big (y_{i,l}^{K+1}\mid (y_{i,1}^{1}, \dots , y_{i,L}^{K})\big ) \quad \text {vs.}\quad f\big (y_{i,l}^{K+1}\mid \text {agg}(\{Y_{\mathcal {N}(1)}, \dots , Y_{\mathcal {N}(M)}\})\big ) \end{aligned}$$

where \(Y_{\mathcal {N}(1)}=(y_{\mathcal {N}(1),1}^{1}, \dots , y_{\mathcal {N}(1),L}^{K})\), and L is the number of crime categories. Here f is a prediction function, e.g., a classical classifier or a deep learning model. \(\mathcal {N}(m)\) denotes an adjacent neighbor of \(r_i\), i.e., \(W_{i,{\mathcal {N}(m)}} = 1\). In this experiment, we use classical machine learning classifiers: Logistic Regression (LR), K-Nearest Neighbors (KNN), Extra Trees (ET), Decision Tree (DT), and Random Forest (RF).
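The two feature settings can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the tensor shapes, the toy adjacency matrix, and the use of a mean as the aggregation function agg are all assumptions made for the example.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
T, N, L, K = 200, 5, 3, 4    # time slots, neighborhoods, crime categories, history window
# Y[t, i, l] = 1 iff a category-l crime occurred in neighborhood i at slot t (toy data).
Y = rng.integers(0, 2, size=(T, N, L))
# Toy adjacency matrix W: here every pair of neighborhoods is adjacent.
W = (np.ones((N, N)) - np.eye(N)).astype(int)

def make_dataset(i, l, use_neighbors):
    """Build (X, y) for neighborhood i, category l, from the K previous slots."""
    nbrs = np.flatnonzero(W[i])
    feats, labels = [], []
    for t in range(K, T):
        x = Y[t - K:t, i, :].ravel()                 # setting (a): r_i's own history
        if use_neighbors:                            # setting (b): + neighbor aggregate
            x = np.concatenate([x, Y[t - K:t, nbrs, :].mean(axis=1).ravel()])
        feats.append(x)
        labels.append(Y[t, i, l])
    return np.array(feats), np.array(labels)

for use_nbrs in (False, True):
    X, y = make_dataset(i=0, l=0, use_neighbors=use_nbrs)
    clf = LogisticRegression(max_iter=1000).fit(X[:150], y[:150])
    print(f"neighbors={use_nbrs}: test accuracy {clf.score(X[150:], y[150:]):.2f}")
```

On the real datasets, the same comparison is run for each classifier (LR, KNN, ET, DT, RF) and each target neighborhood; on this random toy data the two settings are of course indistinguishable.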

In both Fig. 6 (a) and Fig. 6 (b), the performance of these simple models improves substantially once the neighborhoods’ information is appended to the features. In the following sections, we explore other ways of incorporating the spatial information so that the model can predict crime at multiple locations, such as MiST*, which adds a network embedding learned with LINE [25], and CrimeForecaster, which uses diffusion graph convolution. CrimeForecaster shows clear superiority over the other methods; we report the experimental results in Sect. 5.


Copyright information

© 2021 Springer Nature Switzerland AG

About this paper


Cite this paper

Sun, J. et al. (2021). CrimeForecaster: Crime Prediction by Exploiting the Geographical Neighborhoods’ Spatiotemporal Dependencies. In: Dong, Y., Ifrim, G., Mladenić, D., Saunders, C., Van Hoecke, S. (eds) Machine Learning and Knowledge Discovery in Databases. Applied Data Science and Demo Track. ECML PKDD 2020. Lecture Notes in Computer Science(), vol 12461. Springer, Cham. https://doi.org/10.1007/978-3-030-67670-4_4


  • DOI: https://doi.org/10.1007/978-3-030-67670-4_4


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-67669-8

  • Online ISBN: 978-3-030-67670-4

  • eBook Packages: Computer Science (R0)