RIECNN: real-time image enhanced CNN for traffic sign recognition

  • Original Article
  • Published in: Neural Computing and Applications

Abstract

Traffic sign recognition plays a crucial role in the development of autonomous cars, reducing accident rates and promoting road safety. Two persistent challenges must be addressed: traffic signs that are significantly degraded by environmental conditions, and the poor real-time performance of state-of-the-art deep-learning algorithms. In this paper, we introduce the Real-Time Image Enhanced CNN (RIECNN) for traffic sign recognition. RIECNN is a novel, real-time approach that handles multiple, diverse traffic sign datasets and outperforms state-of-the-art architectures in both recognition rate and execution time. Experiments are conducted on the German Traffic Sign Recognition Benchmark (GTSRB), the Belgium Traffic Sign Classification (BTSC) benchmark, and the Croatian Traffic Sign (rMASTIF) benchmark. Experimental results show that our approach achieves the highest recognition rate on all benchmarks: 99.75% on GTSRB, 99.25% on BTSC, and 99.55% on rMASTIF. In terms of latency, the combined pre-processing and inference time does not exceed 1.3 ms per image, meeting the real-time constraint. Our proposed approach not only achieves remarkably high accuracy with real-time performance, but also demonstrates robustness against traffic sign recognition challenges such as brightness and contrast variations in the environment.
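The abstract describes the pipeline at a high level: each sign image is first passed through an image-enhancement step to compensate for brightness and contrast variation, and the enhanced image is then classified by a CNN. As a rough illustration only, the following Python sketch swaps in a generic CLAHE-based contrast enhancement and a small placeholder classifier; the enhance and SignCNN names, the layer sizes, and the choice of CLAHE are assumptions for illustration, not the actual RIECNN design, which is available in the authors' repository linked in the Notes below.

import cv2
import numpy as np
import torch
import torch.nn as nn

def enhance(img_bgr: np.ndarray) -> np.ndarray:
    # Illustrative enhancement step (assumption): CLAHE on the luminance
    # channel to reduce brightness/contrast variation between sign crops.
    lab = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2LAB)
    lum, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(4, 4))
    return cv2.cvtColor(cv2.merge((clahe.apply(lum), a, b)), cv2.COLOR_LAB2BGR)

class SignCNN(nn.Module):
    # Placeholder classifier (assumption); num_classes=43 matches GTSRB.
    def __init__(self, num_classes: int = 43):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(128 * 4 * 4, num_classes)  # 32x32 input -> 4x4 feature map

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x).flatten(1))

# Usage: enhance a 32x32 sign crop, then classify it.
crop = (np.random.rand(32, 32, 3) * 255).astype(np.uint8)  # stand-in for a sign crop
x = torch.from_numpy(enhance(crop)).permute(2, 0, 1).float().unsqueeze(0) / 255.0
print(SignCNN()(x).argmax(dim=1))  # predicted class index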


Notes

  1. The source code for the developed models and experiments is available at: https://github.com/RanaMostafaAbdElMohsen/Traffic_Sign_Recognition.


Author information

Corresponding author

Correspondence to Ahmed H. Abdel-Gawad.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Abdel-Salam, R., Mostafa, R. & Abdel-Gawad, A.H. RIECNN: real-time image enhanced CNN for traffic sign recognition. Neural Comput & Applic 34, 6085–6096 (2022). https://doi.org/10.1007/s00521-021-06762-5

