An Overall Performance Comparative of GA-PARSIMONY Methodology with Regression Algorithms

  • Conference paper

Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 299)

Abstract

This paper presents a performance comparison of the GA-PARSIMONY methodology with five well-known regression algorithms and with different genetic algorithm (GA) configurations. The approach is mainly based on combining GA and feature selection (FS) during the model tuning process to achieve better overall parsimonious models that ensure good generalization capacity. For this purpose, individuals, already sorted by their fitness function, are rearranged in each iteration according to model complexity. The main objective is to analyze the overall model performance achieved with this methodology for each regression algorithm on several real-world databases while varying the GA setting parameters. Our preliminary results show that two algorithms, a multilayer perceptron (MLP) trained with the Broyden-Fletcher-Goldfarb-Shanno method and support vector machines for regression (SVR) with a radial basis function kernel, perform better, with a similar degree of feature reduction, when the database has a low number of input attributes (\(\lesssim 32\)) and low GA population sizes are used.
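
To picture the complexity-based rearrangement step, the sketch below (Python, our own illustration) shows one way such a parsimony-driven re-ranking could be implemented. The fitness tolerance tol and the use of the number of selected features as the complexity measure are assumptions made for illustration, not details taken from the paper.

    # Minimal sketch of the parsimony-based re-ranking described above.
    # Names and parameters (Individual, n_features, tol) are illustrative assumptions.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Individual:
        fitness: float      # validation error (lower is better)
        n_features: int     # complexity proxy: number of selected input attributes

    def rerank_by_parsimony(population: List[Individual], tol: float) -> List[Individual]:
        """Sort by fitness, then promote the less complex of two neighbours
        whose fitness values differ by no more than tol."""
        ranked = sorted(population, key=lambda ind: ind.fitness)
        for i in range(len(ranked) - 1):
            a, b = ranked[i], ranked[i + 1]
            if abs(a.fitness - b.fitness) <= tol and b.n_features < a.n_features:
                ranked[i], ranked[i + 1] = b, a
        return ranked

In this reading, models with nearly identical fitness are ordered by parsimony, so simpler models are favoured in selection and tend to survive across generations.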

Keywords

  • Genetic Algorithm
  • Model Tuning
  • Feature Selection
  • Parsimony Criterion
  • Model Comparison



Author information

Correspondence to Rubén Urraca-Valle.


Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Urraca-Valle, R., Sodupe-Ortega, E., Antoñanzas Torres, J., Antoñanzas-Torres, F., Martínez-de-Pisón, F.J. (2014). An Overall Performance Comparative of GA-PARSIMONY Methodology with Regression Algorithms. In: International Joint Conference SOCO’14-CISIS’14-ICEUTE’14. Advances in Intelligent Systems and Computing, vol 299. Springer, Cham. https://doi.org/10.1007/978-3-319-07995-0_6


  • DOI: https://doi.org/10.1007/978-3-319-07995-0_6

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-07994-3

  • Online ISBN: 978-3-319-07995-0

  • eBook Packages: Engineering (R0)