
HSR: L1/2-regularized sparse representation for fast face recognition using hierarchical feature selection

  • Part of the collection: Extreme Learning Machine and Applications
  • Published in: Neural Computing and Applications

Abstract

In this paper, we propose a novel method for fast face recognition called L1/2-regularized sparse representation using hierarchical feature selection (HSR). By employing hierarchical feature selection, we compress the scale and dimension of the global dictionary, which directly reduces the computational cost of the sparse representation on which our approach is built. The hierarchy consists of Gabor wavelets and an extreme learning machine auto-encoder (ELM-AE). In the Gabor wavelet stage, local features are extracted at multiple scales and orientations to form Gabor-feature-based images, which improves the recognition rate. Moreover, for occluded face images, the scale of the Gabor-feature-based global dictionary can be compressed because the Gabor-feature-based occlusion dictionary contains redundancies. In the ELM-AE stage, the dimension of the Gabor-feature-based global dictionary can be compressed because high-dimensional face images can be rapidly represented by low-dimensional features. By introducing L1/2 regularization, our approach produces a sparser and more robust representation than L1-regularized sparse representation-based classification (SRC), which further reduces the computational cost of the sparse representation step. Compared with related work such as SRC and Gabor-feature-based SRC, experimental results on a variety of face databases demonstrate that our method has a large advantage in computational cost while achieving a comparable or even better recognition rate.
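To make the pipeline concrete, the sketch below illustrates the two compression stages described above and the L1/2-regularized coding step in Python. It is a minimal illustration under stated assumptions, not the authors' implementation: the dimensionality reduction follows the standard ridge-regression formulation of ELM-AE, the L1/2 step uses an iterative half-thresholding solver of the kind proposed by Xu et al., the Gabor filtering stage and the SRC-style classification by class-wise residuals are omitted, and all function names and parameters are illustrative.

```python
import numpy as np

def elm_ae_features(X, n_hidden=200, C=1e3, seed=0):
    """ELM auto-encoder dimensionality reduction (minimal sketch).

    Random orthogonal input weights, sigmoid hidden layer, and output
    weights beta obtained by ridge regression; the low-dimensional code
    is X @ beta.T, as in the usual ELM-AE formulation.
    Assumes n_hidden <= X.shape[1]."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W, _ = np.linalg.qr(rng.standard_normal((d, n_hidden)))  # orthogonal input weights
    b = rng.standard_normal(n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                   # hidden activations, n x L
    beta = np.linalg.solve(H.T @ H + np.eye(n_hidden) / C, H.T @ X)  # L x d output weights
    return X @ beta.T                                         # n x n_hidden features

def half_threshold(t, lam):
    """Component-wise half-thresholding operator for L1/2 regularization
    (the closed-form operator used by iterative half-thresholding solvers)."""
    out = np.zeros_like(t)
    thr = (54.0 ** (1.0 / 3.0) / 4.0) * lam ** (2.0 / 3.0)
    mask = np.abs(t) > thr
    tm = t[mask]
    phi = np.arccos((lam / 8.0) * (np.abs(tm) / 3.0) ** (-1.5))
    out[mask] = (2.0 / 3.0) * tm * (1.0 + np.cos(2.0 * np.pi / 3.0 - 2.0 * phi / 3.0))
    return out

def l_half_sparse_code(D, y, lam=0.01, n_iter=200):
    """Approximately solve  min_x ||y - D x||_2^2 + lam * ||x||_{1/2}^{1/2}
    by a gradient step on the data term followed by half-thresholding."""
    mu = 1.0 / np.linalg.norm(D, 2) ** 2      # step size bounded by 1 / ||D||_2^2
    x = np.zeros(D.shape[1])
    for _ in range(n_iter):
        x = half_threshold(x + mu * (D.T @ (y - D @ x)), lam * mu)
    return x
```

In this sketch the columns of D would hold the ELM-AE-compressed Gabor features of the training images (the compressed global dictionary) and y the feature vector of a test image; as in SRC, the test image would then be assigned to the class whose coefficients yield the smallest reconstruction residual.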





Acknowledgments

The authors would like to thank Dr. Jun Zhou at Griffith University for his helpful and excellent discussions and comments. This work is partially supported by the Natural Science Foundation of China (Grants 41176076, 31202036 and 51075377).

Author information


Corresponding authors

Correspondence to Bo He or Tianhong Yan.


About this article


Cite this article

Han, B., He, B., Sun, T. et al. HSR: L1/2-regularized sparse representation for fast face recognition using hierarchical feature selection. Neural Comput & Applic 27, 305–320 (2016). https://doi.org/10.1007/s00521-015-1907-y

