Long Term Prediction of Product Quality in a Glass Manufacturing Process Using a Kernel Based Approach

  • Tobias Jung
  • Luis Herrera
  • Bernhard Schoelkopf
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3512)

Abstract

In this paper we report the results obtained with a kernel-based approach for predicting the temporal development of four response signals in the process control of a glass melting tank with 16 input parameters. The data set is a revised version of the one used in the EUNITE-2003 modelling challenge. The central difficulties are the large time delays between changes in the inputs and their effect on the outputs, the large amount of data, and a general lack of knowledge about which variables are actually relevant to the process. The methodology proposed here comprises Support Vector Machines (SVM) and Regularization Networks (RN). We use sparse approximation both as a means of regularization and as a way to reduce the computational complexity. Furthermore, we use an incremental approach to add new training examples to the kernel-based model and efficiently update the current solution. This allows us to apply a sophisticated learning scheme, in which we iterate between prediction and training, with good computational efficiency and satisfactory results.
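The scheme outlined above can be pictured with a short sketch. The Python snippet below is a minimal illustration under stated assumptions, not the authors' implementation: it assumes an RBF kernel, uses a crude novelty test in place of the approximate-linear-dependence criterion of sparse online kernel methods such as KRLS [2], and simply refits on the small dictionary where the paper instead updates the solution incrementally. All names, thresholds, and the toy data stream are illustrative.

```python
# Sketch of a sparse, incrementally grown kernel regressor and a loop that
# alternates between prediction and training, as described in the abstract.
# Kernel choice, novelty test, and all parameters are illustrative assumptions.
import numpy as np


def rbf_kernel(a, b, gamma=0.1):
    """Gaussian RBF kernel between two sets of row vectors."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)


class SparseKernelRegressor:
    """Kernel ridge regression restricted to a small dictionary of inputs.

    A new sample enters the dictionary only if it is sufficiently novel
    (a crude stand-in for the approximate-linear-dependence test used in
    sparse online kernel methods such as KRLS [2]).
    """

    def __init__(self, gamma=0.1, lam=1e-2, novelty=0.1):
        self.gamma, self.lam, self.novelty = gamma, lam, novelty
        self.X, self.y, self.alpha = None, None, None

    def update(self, x, y):
        x = np.atleast_2d(x)
        if self.X is None:
            self.X, self.y = x, np.array([y], dtype=float)
        else:
            # Admit x only if it is not already well covered by the dictionary.
            k = rbf_kernel(self.X, x, self.gamma).ravel()
            if 1.0 - k.max() > self.novelty:
                self.X = np.vstack([self.X, x])
                self.y = np.append(self.y, y)
        # Refit on the (small) dictionary; the paper instead updates the
        # current solution incrementally, which is what keeps it efficient.
        K = rbf_kernel(self.X, self.X, self.gamma)
        self.alpha = np.linalg.solve(K + self.lam * np.eye(len(K)), self.y)

    def predict(self, x):
        k = rbf_kernel(np.atleast_2d(x), self.X, self.gamma)
        return float(k @ self.alpha)


# Iterate between prediction and training on a toy stream of
# (input, response) pairs, mimicking the scheme sketched in the abstract.
rng = np.random.default_rng(0)
model = SparseKernelRegressor()
for t in range(200):
    u = rng.normal(size=3)        # stand-in for the 16 process inputs
    target = np.sin(u.sum())      # stand-in for one response signal
    if model.X is not None:
        _ = model.predict(u)      # prediction step
    model.update(u, target)       # incremental training step
```

Keeping the dictionary small is what keeps both the refit and each prediction cheap; this is the role sparse approximation plays in the scheme described in the abstract.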

Keywords

Support Vector Machine · Support Vector Regression · Reproducing Kernel Hilbert Space · Gaussian Process Regression · Sparse Approximation

References

  1. EUNITE Competition 2003: Prediction of product quality in glass manufacturing (2003), http://www.eunite.org
  2. Engel, Y., Mannor, S., Meir, R.: The Kernel Recursive Least-Squares Algorithm. IEEE Transactions on Signal Processing 52(8), 2275–2285 (2004)
  3. Engel, Y., Mannor, S., Meir, R.: Sparse online greedy support vector regression. In: Proc. of the 13th European Conference on Machine Learning. Springer, Heidelberg (2002)
  4. Schoelkopf, B., Smola, A.: Learning with Kernels. MIT Press, Cambridge (2002)
  5. Smola, A., Schoelkopf, B.: Sparse greedy matrix approximation for machine learning. In: Proc. of the 17th International Conference on Machine Learning. Morgan Kaufmann, San Francisco (2000)
  6. Cao, L.J., Tay, F.E.H.: Support Vector Machine with Adaptive Parameters in Financial Time Series Forecasting. IEEE Transactions on Neural Networks 14(6), 1506–1518 (2003)
  7. Cristianini, N., Shawe-Taylor, J.: An Introduction to Support Vector Machines. Cambridge University Press, Cambridge (2000)
  8. Chang, M.-W., Chen, B.-J., Lin, C.-J.: EUNITE Network Competition: Electricity Load Forecasting, November 2001. Winning entry of the EUNITE worldwide competition on electricity load prediction
  9. Csato, L., Opper, M.: Sparse on-line Gaussian processes. Neural Computation 14(3), 641–669 (2002)

Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Tobias Jung, Fachbereich Mathematik & Informatik, Johannes Gutenberg-Universitaet, Mainz, Germany
  • Luis Herrera, Dpt. of Computer Architecture and Technology, University of Granada, Granada, Spain
  • Bernhard Schoelkopf, M.P.I. for Biological Cybernetics, Tuebingen, Germany