
A Holistic Approach to Part Quality Prediction in Injection Molding Based on Machine Learning

  • Alexander Schulze Struchtrup
  • Dimitri Kvaktun
  • Reinhard Schiffers
Conference paper

Abstract

All plastics processing companies have to meet targets for time, cost and quality. Against this background, companies producing in high-wage countries are particularly challenged, because superior part quality is often their only way to remain competitive. Since this leads to high expenditure on quality assurance, efforts have been made for some time to predict the quality of injection molded parts from process data using machine learning algorithms. However, these approaches have not yet prevailed in industry, mainly for two reasons: first, the inevitable learning effort required to set up a quality prediction model, and second, the complexity of applying it. Current research in the field of transfer learning, which aims to shorten learning phases, addresses the first challenge. In this paper, we present a holistic approach to the data analysis steps that are necessary once process and quality data have been generated, with the aim of minimizing the application effort for the operator. This includes the development and application of suitable algorithms for the automatic selection of data, process features and machine learning algorithms, including hyper-parameter optimization and model adaptation. Combining the two approaches could bring quality prediction a significant step closer to successful industrial application. Beyond this, the presented approach is universally applicable and can therefore also be used for other plastics processing methods.
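The full paper is not reproduced on this page, so the following Python sketch is only an illustration of the kind of pipeline the abstract describes: wrapper-based feature selection combined with algorithm selection and hyper-parameter optimization in a single cross-validated search on process data. The data, feature count, candidate estimators and search grids are assumptions for the example, not the authors' implementation.

```python
# Illustrative sketch only: automatic feature selection, algorithm selection and
# hyper-parameter optimization for part-quality prediction from process data.
# Feature names, candidate models and grids are assumptions, not the paper's method.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVR
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor

# Hypothetical data: one row per molding cycle (process features such as injection
# time, cavity pressure integral, melt temperature, ...) and one quality value per
# part (e.g. a critical dimension).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))          # 200 cycles, 12 process features
y = 0.5 * X[:, 0] - 0.2 * X[:, 3] + rng.normal(scale=0.05, size=200)

pipeline = Pipeline([
    ("scale", StandardScaler()),
    # Wrapper-based sequential forward selection of process features.
    ("select", SequentialFeatureSelector(SVR(), n_features_to_select=4)),
    ("model", SVR()),
])

# Algorithm selection and hyper-parameter optimization in one cross-validated search:
# each dict swaps a different regressor into the final pipeline step.
param_grid = [
    {"model": [SVR()], "model__C": [1, 10, 100], "model__epsilon": [0.01, 0.1]},
    {"model": [RandomForestRegressor(random_state=0)], "model__n_estimators": [100, 300]},
    {"model": [KNeighborsRegressor()], "model__n_neighbors": [3, 5, 7]},
]
search = GridSearchCV(pipeline, param_grid, cv=5, scoring="neg_mean_absolute_error")
search.fit(X, y)
print("Best model:", search.best_params_)
print("CV MAE:", -search.best_score_)
```

Listing all candidate estimators inside one cross-validated grid search is one common way to automate the model-selection and hyper-parameter-optimization step the abstract refers to, since the same resampling scheme then governs feature selection, algorithm choice and tuning.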

Keywords

Injection molding · Quality prediction · Machine learning


Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2020

Authors and Affiliations

  • Alexander Schulze Struchtrup (1)
  • Dimitri Kvaktun (1)
  • Reinhard Schiffers (1)

  1. Universität Duisburg-Essen, Duisburg, Germany
