
Predicting Forced Blower Failures Using Machine Learning Algorithms and Vibration Data for Effective Maintenance Strategies

  • Original Research Article

Journal of Failure Analysis and Prevention

Abstract

The emergence of Industry 4.0, the fourth industrial revolution, has made prognostics and health management (PHM) an inevitable trend in industrial big data and smart manufacturing. This study presents a proof of concept showing how machine learning can be used to analyze industrial facility data and anticipate the condition of industrial machines, through a comprehensive case study on vibration monitoring. The proposed models predict maintenance requirements for the forced blower of a chemical plant from vibration data collected during the manufacturing process. To validate the methodology, five machine learning algorithms are employed: logistic regression (LR), support vector machine (SVM), K-nearest neighbor (KNN), extreme gradient boosting (XGBoost), and random forest (RF). Performance is evaluated with the Matthews correlation coefficient (MCC) and the receiver operating characteristic (ROC) curve. The study establishes a relationship between vibration-induced machine failures and the classification of healthy versus faulty bearings using these machine learning approaches. The findings indicate that XGBoost outperforms the other approaches, with an MCC of 0.800 and the highest area under the ROC curve.
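The evaluation strategy described in the abstract, training several classifier families and comparing them by MCC and area under the ROC curve, can be sketched as follows. This is an illustrative sketch, not the authors' code: scikit-learn's GradientBoostingClassifier stands in for XGBoost, and synthetic data stands in for the plant's vibration features.

```python
# Illustrative sketch of the model-comparison pipeline (not the authors' code).
# Synthetic two-class data stands in for vibration features; labels are
# healthy (0) vs faulty (1) bearings.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier, GradientBoostingClassifier
from sklearn.metrics import matthews_corrcoef, roc_auc_score

X, y = make_classification(n_samples=600, n_features=8, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "SVM": SVC(probability=True),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "GB (XGBoost stand-in)": GradientBoostingClassifier(),
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    mcc = matthews_corrcoef(y_te, model.predict(X_te))            # in [-1, 1]
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])    # in [0, 1]
    print(f"{name}: MCC={mcc:.3f}, AUC={auc:.3f}")
```

MCC is well suited to this task because it remains informative when the healthy and faulty classes are imbalanced, which is typical of condition-monitoring data.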



Abbreviations

\( \ell \left( y_{3(i)} ,\Phi (x_{i} ) \right) \) :

The loss function, which quantifies the dissimilarity between the true label \( y_{3(i)} \) and the output \( \Phi (x_{i} ) \) predicted by the model

v r.m.s(mm/s):

The root-mean-square (r.m.s.) vibration velocity

v t(mm/s):

The time-dependent vibration velocity

b:

The bias term, which determines the offset of the hyperplane from the origin

f 1(x) :

The predicted outcome for a given input x

f 2(x) :

The predicted output for the input vector x

f 3(x) :

The predicted output for the input vector x

F 4(x) :

The predicted output for the input vector x

F 5(x) :

The predicted output for the input vector x

h i(x) :

The prediction of decision tree \( h_{i} \) for the input vector x

K(x i,x):

The kernel function, which quantifies the similarity between the training instance \( x_{i} \) and the input x

L :

The objective function that each decision tree \( h_{i}(x) \) is built iteratively to minimize

M :

The number of decision trees in the XGBoost classifier

N :

The number of decision trees in the random forest classifier

n 1 :

The number of training instances

n 2 :

The number of training examples

Nk(x):

The set of the k nearest neighbors of x within the training set

\( p\left( {y = 1|x,w} \right) \) :

The probability that the output variable y equals 1 given the input vector x and the weight vector w

\( T_{i} \left( x \right) \) :

The prediction of decision tree \( T_{i} \) for the input vector x

W :

The weight vector, which determines the importance of each input feature

x i :

The feature vector of a training instance

\( y_{{1(i)}} \) :

The true label of the training instance (− 1 or 1)

\( y_{{2(i)}} \) :

The class label of the training instance

\( y_{{3(i)}} \) :

The true label of the training instance

\( \alpha _{i} \) :

The Lagrange multiplier associated with the training instance

\( \sigma (z) \) :

The sigmoid function, defined as \( \sigma (z) = 1/(1 + \exp ( - z)) \)

\( \Phi (x_{i} ) \) :

The predicted output of the decision trees trained so far for the input vector \( x_{i} \)

\( T(s) \) :

The sampling time, which exceeds the period of all dominant frequency components of v(t)
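The r.m.s. velocity defined in this nomenclature can be estimated from a sampled time signal v(t). The sketch below is illustrative, not the authors' code; the signal and sampling rate are synthetic assumptions chosen so that the sampling time T covers whole periods of every component.

```python
import numpy as np

# Sketch: r.m.s. vibration velocity from a sampled time signal,
# v_rms = sqrt((1/T) * integral over T of v(t)^2 dt),
# approximated discretely as the square root of the mean of squared samples.
fs = 1000.0                      # assumed sampling rate, Hz
t = np.arange(0.0, 1.0, 1.0 / fs)  # sampling time T = 1 s, longer than the
                                   # period of the dominant components of v(t)

# Synthetic velocity signal (mm/s): a 50 Hz component of amplitude 4
# plus a smaller 120 Hz component of amplitude 1
v_t = 4.0 * np.sin(2 * np.pi * 50 * t) + 1.0 * np.sin(2 * np.pi * 120 * t)

v_rms = np.sqrt(np.mean(v_t ** 2))
print(f"v_rms = {v_rms:.3f} mm/s")
```

For sinusoids averaged over whole periods, each component of amplitude A contributes A²/2 to the mean square, so the signal above gives v_rms = √(4²/2 + 1²/2) ≈ 2.915 mm/s.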


Author information

Correspondence to Khaled Salem.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Salem, K., AbdelGwad, E. & Kouta, H. Predicting Forced Blower Failures Using Machine Learning Algorithms and Vibration Data for Effective Maintenance Strategies. J Fail. Anal. and Preven. 23, 2191–2203 (2023). https://doi.org/10.1007/s11668-023-01765-x
