Unsupervised Footwear Impression Analysis and Retrieval from Crime Scene Data
Footwear impressions are one of the most frequently secured types of evidence at crime scenes. For the investigation of crime series, they are among the most important investigative leads. In this paper, we introduce an unsupervised footwear retrieval algorithm that copes with unconstrained noise conditions and is invariant to rigid transformations. A main challenge for automated impression analysis is separating the actual shoe sole information from the structured background noise. We address this issue by analyzing periodic patterns: given unconstrained noise conditions, the redundancy within periodic patterns makes them the most reliable information source in the image. This work presents four main contributions. First, we robustly measure local periodicity by fitting a periodic pattern model to the image. Second, based on the model, we normalize the orientation of the image and compute the window size for a local Fourier transform; in this way, we avoid distortions of the frequency spectrum caused by other structures or boundary artefacts. Third, we segment the pattern through robust point-wise classification, exploiting the property that the amplitudes of the frequency spectrum are constant for each position within a periodic pattern. Finally, the similarity between footwear impressions is measured by comparing the Fourier representations of the periodic patterns. We demonstrate robustness against severe noise distortions as well as rigid transformations on a database with real crime scene impressions. Moreover, we make our database available to the public, thus enabling standardized benchmarking for the first time.
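The final contribution above rests on a standard property of the Fourier transform: discarding phase and keeping only the magnitude spectrum yields a descriptor that is unchanged when a periodic pattern is translated. The following minimal sketch illustrates this idea with NumPy; the function names and the synthetic sinusoidal "sole pattern" are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def magnitude_spectrum(patch):
    """L2-normalized magnitude of the 2-D DFT.
    Dropping the phase makes the descriptor invariant to
    (circular) translations of a periodic pattern."""
    mag = np.abs(np.fft.fftshift(np.fft.fft2(patch)))
    return mag / (np.linalg.norm(mag) + 1e-12)

def spectral_similarity(patch_a, patch_b):
    """Cosine similarity between normalized magnitude spectra (in [0, 1] here,
    since magnitudes are non-negative)."""
    return float(np.sum(magnitude_spectrum(patch_a) * magnitude_spectrum(patch_b)))

# Illustrative data: a periodic "sole pattern" with period 8 pixels,
# and the same pattern shifted by half a period.
x = np.arange(64)
grid = np.sin(2 * np.pi * x / 8)[:, None] * np.sin(2 * np.pi * x / 8)[None, :]
shifted = np.roll(grid, 4, axis=0)

# The magnitude spectra are identical despite the shift, so the
# similarity is ~1.0; an unstructured noise patch scores much lower.
noise = np.random.default_rng(0).normal(size=(64, 64))
```

In the paper's setting, such spectra are computed per local window (after orientation normalization) rather than globally, and rotation invariance comes from the orientation normalization step rather than from the descriptor itself.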
Keywords: Image Retrieval · Interest Point · Periodic Pattern · Translational Symmetry · Crime Scene
This project was supported by the Swiss Commission for Technology and Innovation (CTI), project 13932.1 PFES-ES. The authors thank the German State Criminal Police Offices of Niedersachsen and Bayern and the company forensity ag for their valuable support.