Optimization of Evolutionary Instance Selection

  • Mirosław Kordos
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10245)

Abstract

Evolutionary instance selection is more accurate than the classical distance-based approaches, such as the instance selection methods built on k-NN. However, the drawback of evolutionary methods is their very high computational cost. We compare the performance of evolutionary and classical methods and discuss how to minimize the computational cost by optimizing the genetic algorithm parameters, combining them with classical instance selection methods, and caching the information used by k-NN.
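The caching idea mentioned in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical Python/NumPy implementation of binary-encoded evolutionary instance selection, not the authors' code: each chromosome is a mask over the training instances, the fitness combines leave-one-out 1-NN accuracy with the achieved reduction (the alpha weight, population size and operator rates are assumed, not taken from the paper), and the full pairwise distance matrix is computed once so that every fitness evaluation only indexes it instead of recomputing k-NN distances.

```python
import numpy as np

def fitness(mask, dist, y, alpha=0.95):
    """Leave-one-out 1-NN accuracy of the selected subset plus a reward
    for stronger reduction; alpha is an assumed trade-off weight."""
    if not mask.any():
        return 0.0
    nn = np.argmin(dist[:, mask], axis=1)      # nearest selected neighbour of each instance
    acc = np.mean(y[mask][nn] == y)            # 1-NN accuracy on the full training set
    reduction = 1.0 - mask.mean()              # fraction of instances removed
    return alpha * acc + (1.0 - alpha) * reduction

def select_instances(X, y, pop_size=20, generations=50, p_mut=0.01, seed=0):
    rng = np.random.default_rng(seed)
    n = len(X)
    # Cache all pairwise distances once; every fitness evaluation then
    # only indexes into this matrix instead of recomputing distances.
    dist = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    np.fill_diagonal(dist, np.inf)             # an instance never votes for itself
    pop = rng.random((pop_size, n)) < 0.5      # random binary chromosomes
    for _ in range(generations):
        fit = np.array([fitness(m, dist, y) for m in pop])
        children = []
        for _ in range(pop_size):
            # binary tournament selection of two parents
            i, j = rng.integers(pop_size, size=2)
            a = pop[i] if fit[i] >= fit[j] else pop[j]
            i, j = rng.integers(pop_size, size=2)
            b = pop[i] if fit[i] >= fit[j] else pop[j]
            child = np.where(rng.random(n) < 0.5, a, b)   # uniform crossover
            child ^= rng.random(n) < p_mut                # bit-flip mutation
            children.append(child)
        pop = np.array(children)
    fit = np.array([fitness(m, dist, y) for m in pop])
    return pop[np.argmax(fit)]                 # best instance mask found

if __name__ == "__main__":
    # Tiny synthetic two-class demo of the sketch.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    mask = select_instances(X, y)
    print(f"kept {mask.sum()} of {len(X)} instances")
```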

Keywords

Genetic algorithm · Instance selection · Genetic algorithm parameters · Generational genetic algorithm · Fitness function evaluation

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Department of Computer Science and Automatics, University of Bielsko-Biala, Bielsko-Biała, Poland