
Generalisation Enhancement via Input Space Transformation: A GP Approach

  • Ahmed Kattan
  • Michael Kampouridis
  • Alexandros Agapitos
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8599)

Abstract

This paper proposes a new approach to improving the generalisation of standard regression techniques when there are hundreds or thousands of input variables. The input space X is composed of observational data of the form (x_i, y(x_i)), i = 1, ..., n, where each x_i denotes a k-dimensional vector of design variables and y is the response. Genetic Programming (GP) is used to transform the original input space X into a new input space Z = (z_i, y(z_i)) whose input vectors are smaller and easier to map to their corresponding responses. GP evolves a function that receives each original input vector x_i and returns a new vector z_i as output. Each element of the new z_i vector is generated by an evolved mathematical formula that extracts statistical features from the original input space. To achieve this, we designed GP trees that produce multiple outputs. Empirical evaluation on 20 different problems reveals that the new approach significantly reduces the dimensionality of the original input space and improves the performance of standard approximation models such as Kriging, Radial Basis Function Networks, Linear Regression, and GP (as a regression technique). In addition, the results demonstrate that the new approach outperforms standard dimensionality reduction techniques such as Principal Component Analysis (PCA). Moreover, the results show that the proposed approach improves the performance of standard Linear Regression enough to make it competitive with other stochastic regression techniques.
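As a minimal sketch of the idea (not the authors' implementation), the Python snippet below replaces the GP-evolved multi-output tree with a fixed, hand-written transform, `evolved_transform` (a hypothetical name), whose outputs are statistical features of each input vector, and fits ordinary linear regression on both the original space X and the reduced space Z. In the paper the transform itself is evolved by GP, with fitness driven by the downstream model's approximation error; the toy data and scikit-learn model below are assumptions made only for illustration.

    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    def evolved_transform(x):
        # Stand-in for one evolved multi-output GP tree: each output element
        # is a formula built from statistical features of the k-dimensional
        # input vector x (hand-written here, evolved by GP in the paper).
        return np.array([
            x.mean(),                 # evolved formula 1: central tendency
            x.std(),                  # evolved formula 2: spread
            x.max() - x.min(),        # evolved formula 3: range
            np.median(x) * x.mean(),  # evolved formula 4: a composed feature
        ])

    # Toy high-dimensional regression problem: k = 200 design variables,
    # n = 300 observations (x_i, y(x_i)).
    k, n = 200, 300
    X = rng.uniform(-1.0, 1.0, size=(n, k))
    y = np.sin(3.0 * X.mean(axis=1)) + 0.05 * rng.normal(size=n)

    # Map every x_i to its reduced vector z_i, giving Z with shape (n, 4).
    Z = np.apply_along_axis(evolved_transform, 1, X)

    for name, features in (("original X", X), ("transformed Z", Z)):
        tr_x, te_x, tr_y, te_y = train_test_split(features, y, random_state=0)
        model = LinearRegression().fit(tr_x, tr_y)
        print(f"Linear regression on {name}: R^2 = {model.score(te_x, te_y):.3f}")

Unlike PCA's linear projections, the evolved formulas can be arbitrarily non-linear compositions of the design variables, which is what lets the reduced space remain easy for simple models such as Linear Regression to fit.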

Keywords

Genetic Programming · Symbolic Regression · Approximation Models · Surrogate · Dimensionality Reduction



Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  • Ahmed Kattan, AI Real-World Applications Lab, Department of Computer Science, Um Al Qura University, Kingdom of Saudi Arabia
  • Michael Kampouridis, School of Computing, University of Kent, UK
  • Alexandros Agapitos, Complex and Adaptive Systems Laboratory, School of Computer Science and Informatics, University College Dublin, Ireland
