Optimising Deep Learning by Hyper-heuristic Approach for Classifying Good Quality Images

  • Muneeb ul Hassan
  • Nasser R. Sabar
  • Andy Song
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10861)

Abstract

The Deep Convolutional Neural Network (CNN), one of the most prominent deep learning methods, has shown remarkable success in a variety of computer vision tasks, especially image classification. However, tuning CNN hyper-parameters requires expert knowledge and a large amount of manual trial and error. In this work, we apply CNNs to classifying good-quality images versus bad-quality images without understanding the image content, using well-known datasets for performance evaluation. More importantly, we propose a hyper-heuristic approach for tuning CNN hyper-parameters. The proposed hyper-heuristic comprises a high-level strategy and various low-level heuristics. The high-level strategy uses search performance to decide how to apply the low-level heuristics so as to automatically find an appropriate set of CNN hyper-parameters. Our experiments show the effectiveness of this hyper-heuristic approach, which achieves high accuracy even when the training set is significantly reduced and conventional CNNs can no longer perform well. In short, the proposed hyper-heuristic approach does enhance CNN deep learning.
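The abstract's core idea, a high-level strategy that rewards low-level heuristics based on their past search performance, can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the paper's actual heuristics, acceptance rule, and objective are not given here, and the `evaluate` function is a toy stand-in for training a CNN and measuring validation accuracy.

```python
import random

# Candidate CNN hyper-parameters (names and values are hypothetical).
params = {"learning_rate": 0.01, "batch_size": 64, "num_filters": 32}

def evaluate(p):
    """Toy stand-in for 'train a CNN, return validation accuracy'.
    Higher is better; the optimum of this surrogate is lr=0.001,
    batch_size=32, num_filters=64."""
    return -(abs(p["learning_rate"] - 0.001)
             + abs(p["batch_size"] - 32) / 100.0
             + abs(p["num_filters"] - 64) / 100.0)

# Low-level heuristics: each one perturbs a single hyper-parameter.
def tweak_lr(p):
    p = dict(p); p["learning_rate"] *= random.choice([0.5, 2.0]); return p

def tweak_batch(p):
    p = dict(p); p["batch_size"] = max(8, p["batch_size"] + random.choice([-16, 16])); return p

def tweak_filters(p):
    p = dict(p); p["num_filters"] = max(8, p["num_filters"] + random.choice([-16, 16])); return p

heuristics = [tweak_lr, tweak_batch, tweak_filters]
credit = [1.0] * len(heuristics)  # running reward per heuristic

random.seed(0)
best, best_fit = params, evaluate(params)
for _ in range(200):
    # High-level strategy: select a heuristic in proportion to its past success.
    i = random.choices(range(len(heuristics)), weights=credit)[0]
    candidate = heuristics[i](best)
    fit = evaluate(candidate)
    if fit > best_fit:            # greedy acceptance (an assumed choice)
        best, best_fit = candidate, fit
        credit[i] += 1.0          # reward the heuristic that improved the search
```

In a real setting `evaluate` would be the expensive step (a full CNN training run), which is exactly why the high-level strategy matters: it spends those evaluations on the perturbations that have historically paid off.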

Keywords

Hyper-heuristics · Deep learning · CNN · Optimisation


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. RMIT University, Melbourne, Australia
  2. La Trobe University, Melbourne, Australia
