Implementation and Analysis of Contextual Neural Networks in H2O Framework

  • Krzysztof Wołk
  • Erik Burnell
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11432)


Contextual neural networks that use conditional multi-step aggregation functions have several useful properties. For example, their ability to deactivate selected internal neuron connections can reduce computational cost, while their built-in automatic selection of the attributes needed for correct classification can simplify problem setup. This research was motivated by the limited number of satisfactory machine learning solutions that provide these features. To validate the method, the CxNN model was implemented in the H2O machine learning framework. In this article we explain the relevant terms and describe the implementation of contextual neural networks and of conditional multi-step aggregation functions. The solution is validated with experiments on selected UCI benchmarks and on Cancer Gene Expression Microarray data.
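The core idea of conditional multi-step aggregation can be illustrated with a small sketch. The following is a simplified, hypothetical illustration (not the paper's H2O implementation): a neuron's inputs are partitioned into ordered groups, the weighted sums are accumulated group by group, and aggregation stops as soon as the partial activation reaches a threshold, so the remaining connections are never read. The function name, the grouping scheme, and the threshold `phi` are illustrative assumptions.

```python
import numpy as np

def conditional_aggregate(x, w, groups, phi=0.6):
    """Sketch of conditional multi-step (Sigma-if style) aggregation.

    Inputs are scanned group by group in a fixed order; once the
    partial weighted sum reaches the aggregation threshold `phi`,
    the remaining groups of connections are skipped entirely.
    Returns the activation and the number of inputs actually read.
    """
    acc = 0.0
    inputs_read = 0
    for idx in groups:                      # each group: array of input indices
        acc += float(np.dot(w[idx], x[idx]))
        inputs_read += len(idx)
        if acc >= phi:                      # condition met: stop aggregating
            break
    return acc, inputs_read

# Toy usage: the first group already exceeds the threshold,
# so the second group's connections are never evaluated.
x = np.array([1.0, 1.0, 0.0, 0.0])
w = np.array([0.5, 0.5, 0.9, 0.9])
groups = [np.array([0, 1]), np.array([2, 3])]
activation, used = conditional_aggregate(x, w, groups, phi=0.6)
```

In this toy run only two of the four inputs are read, which mirrors the cost-reduction and attribute-selection properties described above: attributes in groups that are never reached effectively do not influence the neuron's output.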


Keywords: Aggregation functions · GBP · Scan-paths · Sigma-if



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Wroclaw University of Science and Technology, Wroclaw, Poland
