
N-semble: neural network based ensemble approach

  • Rishith Rayal
  • Divya Khanna
  • Jasminder Kaur Sandhu
  • Nishtha Hooda
  • Prashant Singh Rana
Original Article

Abstract

Output can be predicted from experimental or archived data using machine learning models such as random forest, artificial neural network, and decision tree. Each model has its own advantages and limitations. To improve accuracy, the predictions of multiple models can be combined, and the way these predictions are combined is the key to increasing overall accuracy. In this work, a new approach, N-semble, is presented for building an ensemble model on a regression dataset that overcomes the limitations of the classical ensemble approach. An artificial neural network is trained in a special way to combine the predictions of multiple models. N-semble and the classical models are compared on various evaluation measures, and it is concluded that N-semble outperforms them.
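The idea of training a neural network to combine base-model predictions can be sketched as a stacked ensemble. The snippet below is a minimal illustration in scikit-learn, not the authors' implementation: the synthetic dataset, model choices, and hyperparameters are all assumptions.

```python
# Sketch of an N-semble-style ensemble: the base regressors' predictions
# become the input features of a neural network meta-learner.
# Dataset and model choices are illustrative, not the paper's setup.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.neural_network import MLPRegressor
from sklearn.metrics import r2_score

X, y = make_regression(n_samples=600, n_features=10, noise=5.0, random_state=0)
X_train, X_hold, y_train, y_hold = train_test_split(
    X, y, test_size=0.5, random_state=0)

# 1. Fit the base models on the training split.
bases = [RandomForestRegressor(n_estimators=50, random_state=0),
         DecisionTreeRegressor(max_depth=6, random_state=0)]
for m in bases:
    m.fit(X_train, y_train)

# 2. Their held-out predictions form the meta-features.
meta_X = np.column_stack([m.predict(X_hold) for m in bases])

# 3. A small neural network learns how to weight and combine
#    the base predictions, instead of a fixed average.
net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
net.fit(meta_X, y_hold)

print(r2_score(y_hold, net.predict(meta_X)))
```

In practice the meta-learner should be evaluated on data unseen by both stages (e.g. via k-fold validation, as the keywords suggest); this sketch only shows the mechanics of the two-stage fit.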

Keywords

Supervised machine learning · N-semble · Regression · Neural network ensemble · Correlation · Validation


Copyright information

© Springer-Verlag GmbH Germany 2017

Authors and Affiliations

  1. Department of Information and Communication Technology, ABV-Indian Institute of Information Technology, Gwalior, India
  2. Computer Science and Engineering Department, Thapar University, Patiala, India
