Abstract
Neural networks have shown good results for detecting a given pattern in an image. In this paper, faster neural networks for pattern detection are presented. Such processors are designed based on cross-correlation in the frequency domain between the input matrix and the input weights of the neural networks. This approach reduces the number of computation steps required by these faster neural networks during the search process. The divide-and-conquer principle is applied through matrix decomposition: each matrix is divided into smaller sub-matrices, and each sub-matrix is then tested separately by a single faster neural processor. Furthermore, even faster pattern detection is obtained by using parallel processing techniques to test the resulting sub-matrices simultaneously, using the same number of faster neural networks. In contrast to faster neural networks alone, the speed-up ratio increases with the size of the input matrix when faster neural networks are combined with matrix decomposition. Moreover, the problem of local sub-matrix normalization in the frequency domain is solved, and the effect of matrix normalization on the speed-up ratio of pattern detection is discussed. Simulation results show that local sub-matrix normalization through weight normalization is faster than sub-matrix normalization in the spatial domain. The overall speed-up ratio of the detection process increases further because the weight normalization is performed offline.
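The frequency-domain computation underlying this approach can be sketched in one dimension: cross-correlation is obtained as the inverse transform of the product of one signal's transform with the complex conjugate of the other's. The following pure-Python sketch uses a naive O(n²) DFT purely for illustration (the paper assumes an FFT) and toy signals of our own choosing:

```python
import cmath

def dft(x, sign=-1):
    # Naive O(n^2) discrete Fourier transform; an FFT would be used in practice.
    n = len(x)
    return [sum(x[t] * cmath.exp(sign * 2j * cmath.pi * f * t / n)
                for t in range(n)) for f in range(n)]

def circular_cross_correlation(x, w):
    # r[k] = sum_t x[(t + k) % n] * w[t], computed in the frequency domain:
    # DFT(r) = DFT(x) * conj(DFT(w)) when w is real-valued.
    n = len(x)
    X, W = dft(x), dft(w)
    R = [a * b.conjugate() for a, b in zip(X, W)]
    return [v.real / n for v in dft(R, sign=+1)]  # inverse DFT with 1/n scaling

x = [1.0, 2.0, 3.0, 4.0]
w = [0.0, 1.0, 0.0, 0.0]
freq = circular_cross_correlation(x, w)
direct = [sum(x[(t + k) % 4] * w[t] for t in range(4)) for k in range(4)]
print([round(v, 6) for v in freq])  # same values as the direct computation
```

With an FFT, the transform step costs O(n log n) rather than the O(n²) of direct sliding correlation, which is the source of the speed-up the paper builds on.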
Appendices
Appendix 1
1.1 An example proving that the cross-correlation between two matrices is not commutative
Then, the cross-correlation between X and W can be obtained as follows:
On the other hand, the cross-correlation between W and X can be computed as follows:
which proves that X ⊗ W ≠ W ⊗ X.
Also, when only one of the two matrices is symmetric, the cross-correlation between them is still non-commutative, as shown in the following example:
Then, the cross-correlation between X and W can be obtained as follows:
On the other hand, the cross-correlation between W and X can be computed as follows:
which proves that X ⊗ W ≠ W ⊗ X.
The cross-correlation between two matrices is commutative only when both matrices are symmetric, as shown in the following example.
Then, the cross-correlation between X and W can be obtained as follows:
On the other hand, the cross-correlation between W and X can be computed as follows:
which proves that the cross-correlation is commutative (X ⊗ W = W ⊗ X) only when both matrices X and W are symmetric.
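Since the numeric matrices of these examples are not reproduced here, both properties can be checked with any small matrices. The pure-Python sketch below uses illustrative matrices of our own (not those of the original appendix): full 2-D cross-correlation is not commutative in general, but becomes commutative when both matrices are symmetric under a 180° rotation:

```python
def corr2d(X, W):
    # Full 2-D cross-correlation of X with W, zero-padded outside X.
    hx, wx, hw, ww = len(X), len(X[0]), len(W), len(W[0])
    out = [[0] * (wx + ww - 1) for _ in range(hx + hw - 1)]
    for i in range(hx + hw - 1):
        for j in range(wx + ww - 1):
            s = 0
            for u in range(hw):
                for v in range(ww):
                    r, c = i + u - (hw - 1), j + v - (ww - 1)
                    if 0 <= r < hx and 0 <= c < wx:
                        s += X[r][c] * W[u][v]
            out[i][j] = s
    return out

# W is symmetric under 180-degree rotation but X is not:
# one symmetric operand is not enough for commutativity.
X = [[1, 2], [3, 4]]
W = [[0, 1], [1, 0]]
print(corr2d(X, W) != corr2d(W, X))  # True

# Both operands 180-degree symmetric: cross-correlation commutes.
Xs = [[1, 2, 1], [2, 3, 2], [1, 2, 1]]
Ws = [[0, 1, 0], [1, 2, 1], [0, 1, 0]]
print(corr2d(Xs, Ws) == corr2d(Ws, Xs))  # True
```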
Appendix 2
2.1 An example proving that the cross-correlation between two matrices differs from their convolution
The result of their cross-correlation can be computed as illustrated by the first example in Appendix 1. The convolution between W and X can be obtained as follows:
which proves that W ⊗ X ≠ W⋄X.
When the second matrix W is symmetric, the cross-correlation between W and X can be computed as follows:
while the convolution between W and X can be obtained as follows:
which proves that, under the condition that the second matrix is symmetric (or both matrices are symmetric), the cross-correlation between the two matrices equals their convolution.
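This relationship can be checked numerically. In the sketch below (the matrices are our own illustrative choices), convolution is implemented as cross-correlation with a 180°-rotated kernel, so the two operations coincide exactly when the kernel is unchanged by that rotation:

```python
def corr2d(X, W):
    # Full 2-D cross-correlation of X with W, zero-padded outside X.
    hx, wx, hw, ww = len(X), len(X[0]), len(W), len(W[0])
    out = [[0] * (wx + ww - 1) for _ in range(hx + hw - 1)]
    for i in range(hx + hw - 1):
        for j in range(wx + ww - 1):
            s = 0
            for u in range(hw):
                for v in range(ww):
                    r, c = i + u - (hw - 1), j + v - (ww - 1)
                    if 0 <= r < hx and 0 <= c < wx:
                        s += X[r][c] * W[u][v]
            out[i][j] = s
    return out

def conv2d(X, W):
    # 2-D convolution = cross-correlation with the kernel rotated 180 degrees.
    return corr2d(X, [row[::-1] for row in W[::-1]])

X = [[1, 2], [3, 4]]
W = [[0, 1], [0, 0]]                    # changed by a 180-degree rotation
print(corr2d(X, W) != conv2d(X, W))     # True: the operations differ
Ws = [[0, 1, 0], [1, 2, 1], [0, 1, 0]]  # unchanged by a 180-degree rotation
print(corr2d(X, Ws) == conv2d(X, Ws))   # True: they coincide
```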
Appendix 3
3.1 A cross-correlation example between a normalized matrix and a non-normalized one, and vice versa
Then the normalized matrices \( \overline{X} \) and \( \overline{W} \) can be computed as:
Now, the cross-correlation between a normalized matrix and the other non-normalized one can be computed as follows:
which means that \( \overline{X} \otimes W \ne X \otimes \overline{W} \).
However, the two results are equal only at the center element, which equals the dot product between the two matrices. The value of the center element (2, 2) is 6, as shown above and also in Appendix 4.
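Assuming, as elsewhere in the paper, that normalization means subtracting the matrix mean, this behaviour can be reproduced with small matrices of our own (the specific values of the original example, including the center value 6, are not reproduced here):

```python
def corr2d(X, W):
    # Full 2-D cross-correlation of X with W, zero-padded outside X.
    hx, wx, hw, ww = len(X), len(X[0]), len(W), len(W[0])
    out = [[0] * (wx + ww - 1) for _ in range(hx + hw - 1)]
    for i in range(hx + hw - 1):
        for j in range(wx + ww - 1):
            s = 0
            for u in range(hw):
                for v in range(ww):
                    r, c = i + u - (hw - 1), j + v - (ww - 1)
                    if 0 <= r < hx and 0 <= c < wx:
                        s += X[r][c] * W[u][v]
            out[i][j] = s
    return out

def normalize(M):
    # Zero-mean normalization: subtract the matrix mean from every element.
    n = sum(len(row) for row in M)
    mean = sum(map(sum, M)) / n
    return [[v - mean for v in row] for row in M]

X = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
W = [[1, 1, 1], [0, 0, 0], [0, 0, 0]]

a = corr2d(normalize(X), W)  # Xbar (x) W
b = corr2d(X, normalize(W))  # X (x) Wbar
print(a != b)                         # True: the full results differ
print(abs(a[2][2] - b[2][2]) < 1e-9)  # True: the center elements agree
# The common center value (index [2][2] of the 5x5 full output) is the
# dot product Xbar.W = X.Wbar, as discussed in Appendix 4.
```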
Appendix 4
4.1 A dot product example between a normalized matrix and a non-normalized one, and vice versa
This is to validate the correctness of Eq. 51. The left-hand side of Eq. 51 can be expressed as follows:
and the right-hand side of the same equation can be represented as:
\( \overline{X} \) and \( \overline{W} \) are defined as follows:
By substituting from Eq. 60 into Eqs. 58 and 59 and then simplifying the results, we can easily conclude that \( \overline{X}_{rc} W_{i} = X_{rc} \overline{W}_{i} . \)
A practical numerical example follows:
Then the normalized matrices \( \overline{X} \) and \( \overline{W} \) can be computed as:
Now, the dot product between a normalized matrix and the other non-normalized one can be performed as follows:
which means that, in general, the dot product between the normalized matrix \( \overline{X} \) and the non-normalized matrix W equals the dot product between the normalized matrix \( \overline{W} \) and the non-normalized matrix X. The cross-correlation results, on the other hand, differ, as proved in Appendix 3.
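Assuming mean-subtraction normalization, the identity holds because both sides reduce to X·W − sum(X)·sum(W)/n. A quick numeric check with matrices of our own:

```python
def dot(A, B):
    # Element-wise dot product of two equally sized matrices.
    return sum(a * b for ra, rb in zip(A, B) for a, b in zip(ra, rb))

def normalize(M):
    # Zero-mean normalization: subtract the matrix mean from every element.
    n = sum(len(row) for row in M)
    mean = sum(map(sum, M)) / n
    return [[v - mean for v in row] for row in M]

X = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
W = [[2, 0, 1], [1, 3, 0], [0, 1, 1]]

lhs = dot(normalize(X), W)    # Xbar . W
rhs = dot(X, normalize(W))    # X . Wbar
print(abs(lhs - rhs) < 1e-9)  # True: both equal X.W - sum(X)*sum(W)/9
```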
El-Bakry, H.M. A novel high-speed neural model for fast pattern recognition. Soft Comput 14, 647–666 (2010). https://doi.org/10.1007/s00500-009-0433-1