
Finite-sensor fault-diagnosis simulation study of gas turbine engine using information entropy and deep belief networks

  • De-long Feng
  • Ming-qing Xiao
  • Ying-xi Liu
  • Hai-fang Song
  • Zhao Yang
  • Ze-wen Hu

Abstract

Precise fault diagnosis is an important part of prognostics and health management. It can prevent accidents, extend the service life of a machine, and reduce maintenance costs. For gas turbine engine fault diagnosis, only a limited number of sensors can be installed in the engine because its operating environment is harsh: sensors do not survive high temperature, high rotation speed, or high pressure. As a result, the sensory data available from a working engine are insufficient for diagnosing potential failures with existing approaches. In this paper, we consider engine fault diagnosis from finite sensory data under complicated operating conditions, and propose information-entropy-based deep belief networks (IE-DBNs) for this task. We first introduce several information entropies and propose a joint complexity entropy built on single-signal entropies. Second, deep belief networks (DBNs) are analyzed and a logistic regression layer is added to the output of the DBNs. Then, the information entropies are applied to fault diagnosis and used as the input to the DBNs. A comparison between the proposed IE-DBNs method and state-of-the-art machine learning approaches shows that IE-DBNs achieve higher accuracy.
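As a rough, non-authoritative sketch of the pipeline the abstract describes (not the authors' implementation), the example below computes a histogram-based Shannon entropy feature per sensor channel and feeds the resulting feature vectors to a small DBN-like stack of two restricted Boltzmann machines with a logistic regression output layer. The synthetic data, layer sizes, hyperparameters, and the use of scikit-learn's BernoulliRBM and LogisticRegression are illustrative assumptions; the paper's joint complexity entropy and its fine-tuning procedure are not reproduced here.

```python
# Minimal sketch (assumptions only): entropy features from finite sensor signals
# feeding a DBN-like stack (two RBMs + logistic regression output layer).
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import MinMaxScaler

def shannon_entropy(signal, bins=32):
    """Histogram-based Shannon entropy (bits) of a 1-D sensor signal."""
    hist, _ = np.histogram(signal, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins to avoid log(0)
    return -np.sum(p * np.log2(p))

def entropy_features(signals):
    """One entropy value per sensor channel -> one feature vector per sample."""
    return np.array([[shannon_entropy(ch) for ch in sample] for sample in signals])

# Synthetic stand-in for finite-sensor engine data:
# 200 samples, 4 sensor channels, 1024 points per channel, 3 hypothetical fault classes.
rng = np.random.default_rng(0)
X_raw = rng.normal(size=(200, 4, 1024)) * rng.uniform(0.5, 2.0, size=(200, 4, 1))
y = rng.integers(0, 3, size=200)

X = entropy_features(X_raw)           # shape (200, 4)

# RBMs expect inputs in [0, 1], so scale the entropy features first.
model = Pipeline([
    ("scale", MinMaxScaler()),
    ("rbm1", BernoulliRBM(n_components=16, learning_rate=0.05, n_iter=30, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=8, learning_rate=0.05, n_iter=30, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),   # logistic regression output layer
])
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```

In this greedy stack, each RBM is trained on the output of the previous stage and only the logistic regression layer is fit discriminatively, which mirrors layer-wise DBN pre-training without the joint fine-tuning step described in the paper.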

Key words

Deep belief networks (DBNs); Fault diagnosis; Information entropy; Engine

CLC number

TP391; V267.3

Copyright information

© Journal of Zhejiang University Science Editorial Office and Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  • De-long Feng (1)
  • Ming-qing Xiao (1)
  • Ying-xi Liu (2)
  • Hai-fang Song (1)
  • Zhao Yang (1)
  • Ze-wen Hu (1)

  1. Aeronautics and Astronautics Engineering College, Air Force Engineering University, Xi'an, China
  2. Air Force Xi'an Flight Academy, Xi'an, China
