A Novel GA-Taguchi-Based Feature Selection Method

  • Cheng-Hong Yang
  • Chi-Chun Huang
  • Kuo-Chuan Wu
  • Hsin-Yun Chang
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5326)


This work presents a novel GA-Taguchi-based feature selection method. A genetic algorithm performs a randomized "global search" over the entire space of this intractable combinatorial problem, and its genetic operations, including crossover, mutation, selection, and replacement, help the search escape sub-optimal solutions. In each iteration of the proposed nature-inspired method, the Taguchi method performs a systematic "local search" that guides the exploration of better feature subsets for the next iteration. A two-level orthogonal array provides a well-organized, balanced comparison of the two levels of each feature (a feature is either selected or not selected for pattern classification) and of the interactions among features. The signal-to-noise ratio (SNR) is then used to determine the robustness of each feature. As a result, feature subset evaluation effort is significantly reduced and a superior feature subset with high classification performance can be obtained. Experiments on several application domains demonstrate the performance of the proposed method. The hybrid GA-Taguchi-based approach, a wrapper method, yields superior performance and improves classification accuracy in pattern classification.
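The abstract's loop can be sketched in code. This is a minimal illustration, not the authors' implementation: it assumes a 1-NN wrapper evaluated by leave-one-out accuracy, an L8(2^7) two-level orthogonal array, and the larger-the-better SNR computed on accuracy. All function names and parameter values are illustrative, and for brevity the orthogonal-array step is applied globally rather than around the current best chromosome as the paper's local search would.

```python
import math
import random

# L8(2^7) two-level orthogonal array: 8 balanced runs over up to 7 factors.
# Level 1 = feature selected, level 2 = feature not selected.
L8 = [
    [1, 1, 1, 1, 1, 1, 1],
    [1, 1, 1, 2, 2, 2, 2],
    [1, 2, 2, 1, 1, 2, 2],
    [1, 2, 2, 2, 2, 1, 1],
    [2, 1, 2, 1, 2, 1, 2],
    [2, 1, 2, 2, 1, 2, 1],
    [2, 2, 1, 1, 2, 2, 1],
    [2, 2, 1, 2, 1, 1, 2],
]

def one_nn_accuracy(X, y, mask):
    """Leave-one-out accuracy of a 1-NN classifier on the selected features."""
    idx = [i for i, m in enumerate(mask) if m]
    if not idx:
        return 0.0
    correct = 0
    for i in range(len(X)):
        best_label, best_d = None, float("inf")
        for j in range(len(X)):
            if i == j:
                continue
            d = sum((X[i][k] - X[j][k]) ** 2 for k in idx)
            if d < best_d:
                best_d, best_label = d, y[j]
        correct += best_label == y[i]
    return correct / len(X)

def snr(acc):
    """Larger-the-better SNR, -10*log10(1/acc^2), for a single observation."""
    return 20.0 * math.log10(max(acc, 1e-6))

def taguchi_refine(X, y, n_feat):
    """Pick each feature's better level by comparing mean SNR across OA runs."""
    snrs = []
    for row in L8:
        mask = [1 if row[f] == 1 else 0 for f in range(n_feat)]
        snrs.append(snr(one_nn_accuracy(X, y, mask)))
    refined = []
    for f in range(n_feat):
        s1 = [s for row, s in zip(L8, snrs) if row[f] == 1]
        s2 = [s for row, s in zip(L8, snrs) if row[f] == 2]
        refined.append(1 if sum(s1) / len(s1) >= sum(s2) / len(s2) else 0)
    return refined

def ga_taguchi(X, y, n_feat, pop_size=10, gens=10, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_feat)] for _ in range(pop_size)]
    for _ in range(gens):
        # GA step: keep the fitter half, then one-point crossover + mutation.
        scored = sorted(pop, key=lambda m: one_nn_accuracy(X, y, m), reverse=True)
        parents = scored[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents) - 1:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_feat)
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:          # bit-flip mutation
                k = rng.randrange(n_feat)
                child[k] ^= 1
            children.append(child)
        # Taguchi step: inject the OA/SNR-refined subset into the population.
        pop = parents + children + [taguchi_refine(X, y, n_feat)]
    return max(pop, key=lambda m: one_nn_accuracy(X, y, m))

# Toy demo: 7 features, only feature 0 is informative for the class label.
rng = random.Random(1)
y = [i % 2 for i in range(30)]
X = [[rng.gauss(2 * c - 1, 0.3)] + [rng.gauss(0, 1) for _ in range(6)] for c in y]
best = ga_taguchi(X, y, n_feat=7)
print(best, one_nn_accuracy(X, y, best))
```

Because every orthogonal-array run is balanced (each level of each factor appears equally often), the per-feature SNR comparison evaluates only 8 subsets instead of all 2^7, which is the evaluation-effort reduction the abstract refers to.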


Genetic Algorithm · Taguchi Method · Orthogonal Array · Feature Subset Selection · Pattern Classification





Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Cheng-Hong Yang (1)
  • Chi-Chun Huang (2)
  • Kuo-Chuan Wu (1)
  • Hsin-Yun Chang (3, 4)
  1. Department of Computer Science and Information Engineering, National Kaohsiung University of Applied Sciences, Kaohsiung, Taiwan
  2. Department of Information Management, National Kaohsiung Marine University, Kaohsiung, Taiwan
  3. Department of Business Administration, Chin-Min Institute of Technology, Tou-Fen, Taiwan
  4. Department of Industrial Technology Education, National Kaohsiung Normal University, Kaohsiung, Taiwan
