Finite-sensor fault-diagnosis simulation study of gas turbine engine using information entropy and deep belief networks

  • De-long Feng
  • Ming-qing Xiao
  • Ying-xi Liu
  • Hai-fang Song
  • Zhao Yang
  • Ze-wen Hu


Precise fault diagnosis is an important part of prognostics and health management: it helps avoid accidents, extends the service life of the machine, and reduces maintenance costs. For gas turbine engine fault diagnosis, only a limited number of sensors can be installed in the engine, because the operating environment is harsh and sensors cannot work at high temperature, high rotational speed, or high pressure. Thus, there is not enough sensory data from a working engine to diagnose potential failures using existing approaches. In this paper, we consider the problem of engine fault diagnosis using finite sensory data under complicated circumstances, and propose deep belief networks based on information entropy (IE-DBNs) for engine fault diagnosis. We first introduce several information entropies and propose a joint complexity entropy based on single-signal entropy. Second, deep belief networks (DBNs) are analyzed and a logistic regression layer is added to the output of the DBNs. Then, information entropy is used in fault diagnosis and serves as the input to the DBNs. A comparison between the proposed IE-DBNs method and state-of-the-art machine learning approaches shows that IE-DBNs achieve higher accuracy.
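The pipeline the abstract describes (entropy features extracted from finite sensor signals, a DBN for representation learning, a logistic regression layer for classification) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the Shannon-entropy feature, the synthetic two-class "engine run" data, and the single-RBM network standing in for a full DBN are all simplifying assumptions.

```python
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

def shannon_entropy(signal, n_bins=16, value_range=(-6.0, 6.0)):
    """Shannon entropy (bits) of a 1-D signal, from a fixed-range histogram estimate."""
    counts, _ = np.histogram(signal, bins=n_bins, range=value_range)
    p = counts / counts.sum()
    p = p[p > 0]                     # drop empty bins (0 * log 0 := 0)
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
# Toy data: 40 "engine runs", 4 sensor channels of 256 samples each;
# the (hypothetical) fault class doubles the vibration noise level.
feats, labels = [], []
for fault in (0, 1):
    for _ in range(20):
        channels = rng.normal(scale=1.0 + fault, size=(4, 256))
        feats.append([shannon_entropy(c) for c in channels])  # one entropy per channel
        labels.append(fault)
feats, labels = np.array(feats), np.array(labels)

# Scale features to [0, 1]: BernoulliRBM expects inputs in that range.
X = (feats - feats.min(axis=0)) / np.ptp(feats, axis=0)

# One RBM layer learns a representation; logistic regression classifies on top,
# mirroring the abstract's "logistic regression layer added to the DBN output".
model = Pipeline([
    ("rbm", BernoulliRBM(n_components=8, learning_rate=0.05, n_iter=30, random_state=0)),
    ("clf", LogisticRegression(max_iter=1000)),
])
model.fit(X, labels)
```

A full IE-DBN would stack several RBM layers and fine-tune them jointly with backpropagation; the one-layer pipeline above only shows how entropy features, rather than raw finite sensor signals, become the network input.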

Key words

Deep belief networks (DBNs) · Fault diagnosis · Information entropy · Engine

CLC number

TP391; V267.3





Copyright information

© Journal of Zhejiang University Science Editorial Office and Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  • De-long Feng (1)
  • Ming-qing Xiao (1)
  • Ying-xi Liu (2)
  • Hai-fang Song (1)
  • Zhao Yang (1)
  • Ze-wen Hu (1)

  1. Aeronautics and Astronautics Engineering College, Air Force Engineering University, Xi'an, China
  2. Air Force Xi'an Flight Academy, Xi'an, China
