
Single and Ensemble Fault Classifiers Based on Features Selected by Multi-Objective Genetic Algorithms

  • Chapter in: Advances in Mathematical and Statistical Modeling
  • Part of the book series: Statistics for Industry and Technology (SIT)


Abstract

The problem of identifying faults in systems and processes can be formulated as one of partitioning objects (i.e., the measured data patterns representing the symptoms) into classes (i.e., the types of faults causing the symptoms). In this view, two main steps must be carried out to perform the fault identification effectively: i) the selection of the features carrying information relevant to the identification; ii) the classification of the measured feature patterns into the different fault types. In this work, the two tasks are tackled by combining a multi-objective genetic algorithm search with a Fuzzy K-Nearest Neighbors classifier. Two approaches to the development of the fault classification model are considered: a single classifier based on a feature subset chosen a posteriori from the Pareto front identified by the multi-objective genetic search, and an ensemble of classifiers, each built on a different feature subset taken from the genetic algorithm population at convergence. Examples of application of the proposed approaches are given with reference to two industrial processes: the classification of simulated nuclear transients in the feedwater system of a Boiling Water Reactor and the classification of multiple faults in rotating machinery.
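To make the combination described in the abstract concrete, the sketch below illustrates, in Python, how a fuzzy K-Nearest Neighbors classifier returns class membership values rather than hard labels, and how an ensemble can be formed by averaging the memberships of classifiers built on different feature subsets (e.g., subsets taken from the Pareto front or from the converged genetic algorithm population). This is a minimal illustration, not the chapter's implementation: the function names, the use of crisp training labels, Euclidean distance, simple membership averaging, and the toy data are all assumptions made here for readability.

import numpy as np


def fuzzy_knn_memberships(X_train, y_train, x, k=5, m=2.0, n_classes=None, eps=1e-9):
    """Class membership vector for pattern x under the fuzzy K-NN rule.

    Illustrative sketch: assumes crisp training labels y_train in
    {0, ..., n_classes-1} and Euclidean distance; 2/(m-1) is the usual
    fuzzifier exponent.
    """
    if n_classes is None:
        n_classes = int(y_train.max()) + 1
    d = np.linalg.norm(X_train - x, axis=1)                 # distances to all training patterns
    nn = np.argsort(d)[:k]                                  # indices of the k nearest neighbours
    w = 1.0 / np.maximum(d[nn], eps) ** (2.0 / (m - 1.0))   # inverse-distance weights
    u = np.zeros(n_classes)
    for weight, idx in zip(w, nn):
        u[y_train[idx]] += weight                           # accumulate weight on the neighbour's class
    return u / u.sum()


def ensemble_predict(X_train, y_train, x, feature_subsets, k=5):
    """Average the fuzzy memberships of one classifier per feature subset.

    Each subset stands in for one chromosome of the multi-objective genetic
    search; here the subsets are simply given as arrays of column indices.
    """
    memberships = [
        fuzzy_knn_memberships(X_train[:, s], y_train, x[s], k=k)
        for s in feature_subsets
    ]
    avg = np.mean(memberships, axis=0)
    return int(np.argmax(avg)), avg


# Toy usage with synthetic data and two hypothetical GA-selected feature subsets.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + X[:, 3] > 0).astype(int)
subsets = [np.array([0, 3]), np.array([0, 2, 3, 7])]
x_new = rng.normal(size=10)
label, u = ensemble_predict(X, y, x_new, subsets)
print("predicted class:", label, "memberships:", np.round(u, 3))

In the single-classifier approach, only one feature subset would be retained a posteriori from the Pareto front and passed to fuzzy_knn_memberships directly.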

Copyright information

© 2008 Birkhäuser Boston

About this chapter

Cite this chapter

Zio, E., Baraldi, P., Gola, G., Pedroni, N. (2008). Single and Ensemble Fault Classifiers Based on Features Selected by Multi-Objective Genetic Algorithms. In: Advances in Mathematical and Statistical Modeling. Statistics for Industry and Technology. Birkhäuser Boston. https://doi.org/10.1007/978-0-8176-4626-4_24
