Abstract
As mobile devices and personal computers are used ever more frequently over the Internet, data generated not only by people but also by devices keeps accumulating. This endlessly growing data is called Big Data, and deep learning algorithms trained on Big Data have been introducing the next level of artificial intelligence. In general, the more data deep learning algorithms are trained on, the more accurate they become. An important problem, then, is how much data is enough for deep learning algorithms to train on. In many cases it is not practical to wait for the data to grow large enough, and thus we need a new learning model that can reduce this latency and deliver learning results with useful accuracy in a timely manner. In this paper, we propose novel block-incremental learning models for deep learning and experimentally show that the proposed models deliver timely learning results with useful accuracy, and that their final accuracy is even better than that of traditional deep learning algorithms trained on the same amount of data.
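To make the idea concrete, below is a minimal sketch, not the authors' implementation, of how block-incremental training can be organized: the network is updated each time a new block of data becomes available, so intermediate results with useful accuracy are obtainable long before the full dataset exists. The block size, network architecture, and synthetic data generator are illustrative assumptions.

```python
# Minimal block-incremental training sketch (illustrative only, not the
# authors' method). Each newly arrived data block continues training the
# same model, so results can be reported after every block.
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(0)

def make_block(n=1000, dim=20):
    """Hypothetical stand-in for a newly arrived block of labeled data."""
    X = rng.normal(size=(n, dim)).astype("float32")
    y = (X.sum(axis=1) > 0).astype("int32")
    return X, y

model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

X_test, y_test = make_block(2000)

# Block-incremental loop: each call to fit() resumes from the current
# weights, so knowledge accumulates block by block and the model can be
# evaluated (and used) after every block rather than only at the end.
for block_id in range(5):
    X_block, y_block = make_block()
    model.fit(X_block, y_block, epochs=3, batch_size=32, verbose=0)
    _, acc = model.evaluate(X_test, y_test, verbose=0)
    print(f"after block {block_id + 1}: test accuracy = {acc:.3f}")
```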
Keywords
- Incremental learning
- Learning model
- Machine learning
Acknowledgments
This research was supported by the Support Program for Women in Science, Engineering and Technology through the National Research Foundation of Korea (NRF), funded by the Ministry of Science, ICT and Future Planning (No. 2016H1C3A1903202).
Copyright information
© 2018 Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Lee, G., Ryu, S., Kim, C. (2018). Block-Incremental Deep Learning Models for Timely Up-to-Date Learning Results. In: Lee, W., Choi, W., Jung, S., Song, M. (eds) Proceedings of the 7th International Conference on Emerging Databases. Lecture Notes in Electrical Engineering, vol 461. Springer, Singapore. https://doi.org/10.1007/978-981-10-6520-0_24
DOI: https://doi.org/10.1007/978-981-10-6520-0_24
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-6519-4
Online ISBN: 978-981-10-6520-0