
A New Supervised Classification of Credit Approval Data via the Hybridized RBF Neural Network Model Using Information Complexity

  • Oguz Akbilgic
  • Hamparsum Bozdogan
Part of the Studies in Classification, Data Analysis, and Knowledge Organization book series (STUDIES CLASS)

Abstract

In this paper, we introduce a new approach to supervised classification that handles mixed (i.e., categorical, binary, and continuous) data structures using a hybrid radial basis function neural network (HRBF-NN). HRBF-NN supervised classification combines regression trees, ridge regression, and the genetic algorithm (GA) with radial basis function (RBF) neural networks (NN), using the information complexity (ICOMP) criterion as the fitness function, to carry out both classification and selection of the subset of best predictors that discriminate between the classes. In this manner, we reduce the dimensionality of the data and at the same time improve the classification accuracy of the fitted predictive model. We apply HRBF-NN supervised classification to a real benchmark credit approval mixed data set to classify customers into good/bad classes for credit approval. Our results show the excellent performance of the HRBF-NN method in supervised classification tasks.
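Since the chapter itself is not reproduced on this page, the following is a minimal sketch of the tree/ridge/RBF pipeline the abstract describes, not the authors' implementation. It assumes a numeric design matrix `X` (categorical and binary predictors one-hot encoded) and a 0/1 target `y`; the helper names (`rbf_design`, `fit_hrbf`, `icomp_score`) are illustrative, the ICOMP term is a rough stand-in built from Bozdogan's C1 complexity measure rather than the exact ICOMP(IFIM) criterion of the paper, and the GA-driven subset search is omitted.

```python
# Sketch of an HRBF-NN-style classifier: a regression tree proposes RBF
# centers, ridge regression fits the output weights, and an ICOMP-like
# score measures fit plus complexity. Assumptions: X is a numeric (n, p)
# array, y is a 0/1 vector; names here are illustrative, not the paper's.
import numpy as np
from sklearn.tree import DecisionTreeRegressor


def rbf_design(X, centers, width):
    """Gaussian RBF design matrix: one basis function per center."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))


def icomp_score(H, y, w, ridge):
    """Rough stand-in for ICOMP: Gaussian lack of fit plus Bozdogan's C1
    complexity of the inverse (ridge-regularized) information matrix,
    C1(S) = (s/2) log(tr(S)/s) - (1/2) log|S|. Lower is better."""
    n, s = H.shape
    rss = ((y - H @ w) ** 2).sum()
    Finv = np.linalg.inv(H.T @ H + ridge * np.eye(s))
    c1 = 0.5 * s * np.log(np.trace(Finv) / s) - 0.5 * np.linalg.slogdet(Finv)[1]
    return n * np.log(rss / n) + 2.0 * c1


def fit_hrbf(X, y, ridge=1e-2, max_leaves=8):
    # Step 1: a regression tree partitions the input space; each leaf
    # contributes one RBF center (the mean of its training points).
    tree = DecisionTreeRegressor(max_leaf_nodes=max_leaves).fit(X, y)
    leaf_ids = tree.apply(X)
    centers = np.array([X[leaf_ids == l].mean(axis=0)
                        for l in np.unique(leaf_ids)])

    # Step 2: one global width from the mean inter-center distance.
    d = np.sqrt(((centers[:, None] - centers[None, :]) ** 2).sum(-1))
    width = d[d > 0].mean() if len(centers) > 1 else 1.0

    # Step 3: ridge regression stabilizes the output-layer weights.
    H = rbf_design(X, centers, width)
    w = np.linalg.solve(H.T @ H + ridge * np.eye(H.shape[1]), H.T @ y)
    return centers, width, w, icomp_score(H, y, w, ridge)


def predict_hrbf(X, centers, width, w, cut=0.5):
    """Threshold the network output to a good/bad (1/0) class label."""
    return (rbf_design(X, centers, width) @ w >= cut).astype(int)
```

In the full method, the GA would search over subsets of predictors (columns of `X`), refitting this model on each candidate subset with ICOMP as the fitness function; the subset minimizing ICOMP yields both the reduced predictor set and the final classifier.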

Keywords

Radial Basis Function · Classification Tree · Radial Basis Function Neural Network · Supervised Classification · Saturated Model

Notes

Acknowledgements

This paper was invited as a keynote presentation by Prof. Bozdogan at the European Conference on Data Analysis (ECDA-2013) at the University of Luxembourg in Luxembourg during July 10–12, 2013. Prof. Bozdogan extends his gratitude to the conference organizers: Professors Sabine Krolak-Schwerdt, Matthias Böhmer, and Berthold Lausen.


Copyright information

© Springer-Verlag Berlin Heidelberg 2015

Authors and Affiliations

  1. Department of Business Analytics and Statistics, University of Tennessee, Knoxville, USA
  2. Department of Quantitative Methods, Istanbul University School of Business, Istanbul, Turkey
