
On the Evolutionary Search for Data Reduction Method

  • Hanna Lacka
  • Maciej Grzenda
Part of the Advances in Intelligent and Soft Computing book series (AINSC, volume 151)

Introduction

One of the key applications of statistical analysis and data mining is the development of classification and prediction models. In both cases, significant improvements can be attained by limiting the number of model inputs. This can be done at two levels, namely by eliminating unnecessary attributes [3] and by reducing the dimensionality of the data [12]. A variety of methods has been proposed in both fields.
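
To illustrate these two levels, the sketch below compares a nearest neighbour classifier [1] trained on all attributes of a UCI data set [2] with the same classifier trained on a reduced attribute subset and on a low-dimensional projection. This is only an illustrative example, not the method studied in this paper; the use of scikit-learn, the wine data set, and all parameter values (6 selected attributes, 6 principal components, 5 neighbours) are assumptions made for the sketch.

# Minimal sketch of the two levels of input reduction discussed above:
# attribute elimination (feature selection) and dimensionality reduction.
# All choices below (data set, scoring, numbers of attributes/components)
# are illustrative assumptions, not taken from the paper.
from sklearn.datasets import load_wine
from sklearn.decomposition import PCA
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)          # 13 original attributes
knn = KNeighborsClassifier(n_neighbors=5)  # nearest neighbour classifier [1]

# Baseline: all original attributes.
baseline = make_pipeline(StandardScaler(), knn)
# Level 1: eliminate unnecessary attributes (keep the 6 highest-scoring ones).
selection = make_pipeline(StandardScaler(), SelectKBest(f_classif, k=6), knn)
# Level 2: reduce dimensionality by projecting onto 6 principal components.
reduction = make_pipeline(StandardScaler(), PCA(n_components=6), knn)

for name, model in [("all attributes", baseline),
                    ("attribute selection", selection),
                    ("PCA projection", reduction)]:
    accuracy = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name:>19}: mean CV accuracy = {accuracy:.3f}")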


References

  1. Cover, T.M., Hart, P.E.: Nearest neighbor pattern classification. IEEE Transactions on Information Theory 13, 21–27 (1967)
  2. Frank, A., Asuncion, A.: UCI Machine Learning Repository. University of California, School of Information and Computer Science, Irvine, CA (2010), http://archive.ics.uci.edu/ml
  3. de Haro-García, A., Pérez-Rodríguez, J., García-Pedrajas, N.: Feature Selection for Translation Initiation Site Recognition. In: Mehrotra, K.G., Mohan, C.K., Oh, J.C., Varshney, P.K., Ali, M. (eds.) IEA/AIE 2011, Part II. LNCS, vol. 6704, pp. 357–366. Springer, Heidelberg (2011)
  4. Grzenda, M.: Prediction-Oriented Dimensionality Reduction of Industrial Data Sets. In: Mehrotra, K.G., Mohan, C.K., Oh, J.C., Varshney, P.K., Ali, M. (eds.) IEA/AIE 2011, Part I. LNCS, vol. 6703, pp. 232–241. Springer, Heidelberg (2011)
  5. Haykin, S.: Neural Networks and Learning Machines. Pearson Education (2009)
  6. Haykin, S.: Neural Networks: A Comprehensive Foundation. Prentice Hall Inc. (1999)
  7. Holland, J.H.: Adaptation in Natural and Artificial Systems. University of Michigan Press, Ann Arbor (1975)
  8. Ishibuchi, H., Nakashima, T., Nii, M.: Learning of Neural Networks with GA-based Instance Selection. In: Proc. of 9th IFSA World Congress and 20th NAFIPS International Conference, Vancouver, Canada, July 25-28, pp. 2102–2107 (2001)
  9. Larose, D.T.: Data Mining Methods and Models. John Wiley & Sons (2006)
  10. Lattin, J.M., Carroll, J.D., Green, P.E.: Analyzing Multivariate Data. Thomson (2003)
  11.
  12. Lee, J., Verleysen, M.: Nonlinear Dimensionality Reduction. Springer (2010)
  13. Kohavi, R., John, G.H.: Wrappers for feature subset selection. Artificial Intelligence 97(1-2), 273–324 (1997)
  14. Raymer, M.L., Punch, W.F., Goodman, E.D., Kuhn, L.A., Jain, A.K.: Dimensionality reduction using genetic algorithms. IEEE Transactions on Evolutionary Computation 4(2), 164–171 (2000)

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  1. Institute of Computer Science, Polish Academy of Sciences, Warszawa, Poland
  2. Faculty of Mathematics and Information Science, Warsaw University of Technology, Warszawa, Poland
