Abstract
Artificial intelligence (AI) has been a buzzword for quite a long time now; the advances in this field have found their way into our pockets in the form of smartphones. Google Lens, Siri, Alexa, and many other AI assistants have become part of our lives. Key features such as identifying an image or text are a necessity for such programs; in fact, heavy investment is being poured into building better models for image recognition and contextual speech recognition through neural networks. In this chapter, the architectures of various neural networks are explored. Fundamentally, convolutional neural networks (ConvNets or CNNs) and recurrent neural networks (RNNs) are explained with a few examples and their implementation in Python. The chapter pairs intuitive explanations with easily understandable mathematical interpretation.
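As a taste of the kind of intuition the chapter builds, the core operation of a ConvNet is a small kernel slid over an image. The sketch below is a minimal, illustrative NumPy implementation of a valid 2-D cross-correlation (the names `conv2d` and `vertical_edge` are chosen here for illustration and are not from the chapter):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid 2-D cross-correlation: slide the kernel over the image,
    taking an elementwise product and sum at each position."""
    kh, kw = kernel.shape
    oh = image.shape[0] - kh + 1
    ow = image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A toy 5x5 "image" with a dark-to-bright vertical edge between columns 2 and 3
image = np.array([
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
    [0, 0, 0, 1, 1],
], dtype=float)

# A classic vertical-edge kernel: responds where left and right columns differ
vertical_edge = np.array([
    [1, 0, -1],
    [1, 0, -1],
    [1, 0, -1],
], dtype=float)

feature_map = conv2d(image, vertical_edge)
print(feature_map)  # nonzero entries mark the edge columns
```

In a trained CNN, the kernel values are not hand-picked like this edge filter; they are learned from data, and deep-learning frameworks replace these Python loops with heavily optimized equivalents.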
Copyright information
© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Bharadwaj, Y.S.S. (2021). Advanced Deep Learning Techniques. In: Prakash, K.B., Kannan, R., Alexander, S., Kanagachidambaresan, G.R. (eds) Advanced Deep Learning for Engineers and Scientists. EAI/Springer Innovations in Communication and Computing. Springer, Cham. https://doi.org/10.1007/978-3-030-66519-7_6
DOI: https://doi.org/10.1007/978-3-030-66519-7_6
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-66518-0
Online ISBN: 978-3-030-66519-7
eBook Packages: Engineering (R0)