
A-DBNF: adaptive deep belief network framework for regression and classification tasks

Published in: Applied Intelligence

Abstract

Many machine learning methods have been proposed for multivariate data regression and classification in recent years. Most are supervised learning methods that require large amounts of labeled data. Moreover, current methods demand extensive human labor and supervision to fine-tune model hyperparameters. In this paper, we propose an adaptive deep belief network framework (A-DBNF) that can adapt to different datasets with minimal human labor. The proposed framework employs a deep belief network (DBN) to extract representative features from the data in an unsupervised learning phase and then fine-tunes the network parameters using a small amount of labeled data in a supervised learning phase. We integrate the DBN with a genetic algorithm (GA) to select and optimize the model hyperparameters and further improve network performance. We validate the proposed framework on several benchmark datasets, comparing its regression and classification accuracy with state-of-the-art methods. A-DBNF achieves a noticeable performance improvement on three regression tasks using only 40–50% of the labeled data, and outperforms most related methods on classification tasks using 23–48% of the labeled data.
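The GA-based hyperparameter selection described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the search space, the GA operators, and in particular the toy fitness surrogate are assumptions made for demonstration. In A-DBNF, the fitness of a candidate would instead be the validation performance of a DBN trained with those hyperparameters.

```python
import random

# Hypothetical hyperparameter space for a DBN: two hidden-layer sizes
# and a learning rate. The names and ranges are illustrative only.
SPACE = {
    "hidden1": [64, 128, 256, 512],
    "hidden2": [32, 64, 128],
    "lr": [0.1, 0.01, 0.001],
}

def random_individual(rng):
    # One candidate = one value picked from each hyperparameter's range.
    return {k: rng.choice(v) for k, v in SPACE.items()}

def fitness(ind):
    # Toy surrogate with a known optimum (higher is better, max = 0).
    # In the real framework this would be DBN validation accuracy.
    return (-abs(ind["hidden1"] - 256)
            - abs(ind["hidden2"] - 64)
            - 100 * abs(ind["lr"] - 0.01))

def crossover(a, b, rng):
    # Uniform crossover: each gene comes from either parent.
    return {k: rng.choice([a[k], b[k]]) for k in SPACE}

def mutate(ind, rng, p=0.2):
    # With probability p, resample a gene from its allowed range.
    return {k: (rng.choice(SPACE[k]) if rng.random() < p else v)
            for k, v in ind.items()}

def run_ga(generations=30, pop_size=20, seed=0):
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # truncation selection (elitist)
        children = [mutate(crossover(rng.choice(parents),
                                     rng.choice(parents), rng), rng)
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = run_ga()
```

Because the top half of each generation is carried over unchanged, the best candidate found so far is never lost, so the search improves monotonically toward well-performing hyperparameter combinations.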



Acknowledgments

This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2019R1A2C1008048).


Corresponding author

Correspondence to Sanggil Kang.



Cite this article

Ibrokhimov, B., Hur, C., Kim, H. et al. A-DBNF: adaptive deep belief network framework for regression and classification tasks. Appl Intell 51, 4199–4213 (2021). https://doi.org/10.1007/s10489-020-02050-2
