Gender Classification Using a New Pyramidal Neural Network

  • S. L. Phung
  • A. Bouzerdoum
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4233)


We propose a novel neural network for the classification of visual patterns. The new network, called the pyramidal neural network or PyraNet, has a hierarchical structure with two types of processing layers: pyramidal layers and 1-D layers. The PyraNet is motivated by two concepts: image pyramids and local receptive fields. In the new network, nonlinear 2-D neurons are trained to perform both 2-D analysis and data reduction. In this paper, we present a fast training method for the PyraNet based on resilient back-propagation and weight decay, and apply the new network to gender classification from facial images.
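To make the idea of a pyramidal layer concrete, the sketch below shows one possible reading of it: each 2-D neuron computes a weighted sum over a local receptive field of the preceding 2-D map and passes it through a nonlinearity, so the map is analysed and shrunk at the same time. The receptive-field size, overlap amount, per-pixel weight layout, and tanh activation are illustrative assumptions made for this sketch, not details taken from this page.

```python
import numpy as np

def pyramidal_layer(image, weights, rf_size=4, overlap=2, bias=0.0):
    """Illustrative pyramidal (2-D) layer: each output neuron sums a local
    receptive field of the input, weighted element-wise, then applies a
    nonlinearity. Overlapping receptive fields with a stride smaller than
    the field size give both 2-D analysis and data reduction.

    Assumption: `weights` has the same shape as `image`, i.e. one adaptive
    weight per input position, shared by every receptive field covering it.
    """
    h, w = image.shape
    step = rf_size - overlap                       # distance between receptive-field centres
    out_h = (h - overlap) // step
    out_w = (w - overlap) // step
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            r, c = i * step, j * step
            patch = image[r:r + rf_size, c:c + rf_size]
            w_patch = weights[r:r + rf_size, c:c + rf_size]
            out[i, j] = np.tanh(np.sum(patch * w_patch) + bias)  # nonlinear 2-D neuron
    return out

# Toy usage: a 20x20 input map reduces to a 9x9 map, which a further
# pyramidal layer or the final 1-D layers could then process.
x = np.random.rand(20, 20)
w = np.random.randn(20, 20) * 0.1
y = pyramidal_layer(x, w)
print(y.shape)  # (9, 9) with rf_size=4, overlap=2
```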


Keywords: Support Vector Machine, Facial Image, Convolutional Neural Network, Pyramidal Layer, Image Pyramid





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • S. L. Phung (1)
  • A. Bouzerdoum (1)

  1. University of Wollongong, Wollongong, Australia
