Comparison of Neural Networks Incorporating Partial Monotonicity by Structure

  • Alexey Minin
  • Bernhard Lang
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5164)

Abstract

Neural networks applied in control loops and safety-critical domains must meet more requirements than merely the best overall function approximation. On the one hand, a small approximation error is required; on the other hand, the smoothness and the monotonicity of selected input-output relations have to be guaranteed, since otherwise the stability of most control laws is lost. Three approaches for partially monotonic models are compared in this article, namely the Bounded Derivative Network (BDN) [1], the Monotonic Multi-Layer Perceptron Network (MONMLP) [2], and Constrained Linear Regression (CLR). The authors investigate the advantages and disadvantages of these approaches with respect to approximation performance, model training, and convergence.
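The idea behind monotonicity by structure can be illustrated with a minimal sketch (not the paper's or reference [2]'s actual implementation): if the weights connected to a selected input are forced to be non-negative, e.g. by squaring, and the activation function is monotonically increasing, the network output is guaranteed to be non-decreasing in that input regardless of the trained parameter values. All names and the network size below are illustrative assumptions.

```python
import numpy as np

# Minimal MONMLP-style sketch: monotonicity in selected inputs is enforced
# structurally by squaring the corresponding first-layer weights (making
# them non-negative) and using an increasing activation (tanh). The second
# layer is also constrained non-negative so monotonicity propagates.

rng = np.random.default_rng(0)

n_in, n_hidden = 2, 4
monotone_mask = np.array([True, False])  # require monotonicity in input 0 only

W1 = rng.normal(size=(n_hidden, n_in))
b1 = rng.normal(size=n_hidden)
w2 = rng.normal(size=n_hidden)
b2 = 0.0

def forward(x):
    # Effective first-layer weights: squared (hence >= 0) on the monotone
    # input, unconstrained on the other input.
    W_eff = np.where(monotone_mask, W1**2, W1)
    h = np.tanh(W_eff @ x + b1)
    # Non-negative output weights preserve the monotone direction.
    return (w2**2) @ h + b2

# The output is non-decreasing along input 0 for any fixed input 1,
# for arbitrary (e.g. randomly initialized or trained) parameters:
xs = np.linspace(-3.0, 3.0, 50)
ys = [forward(np.array([x, 0.7])) for x in xs]
assert all(y2 >= y1 for y1, y2 in zip(ys, ys[1:]))
```

Training such a model with gradient descent needs no extra penalty terms or post-hoc checks: the constraint holds by construction, which is exactly what distinguishes structural approaches like BDN and MONMLP from hint- or penalty-based methods.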



References

  1. Turner, P., Guiver, J., Brian, L.: Introducing The State Space Bounded Derivative Network For Commercial Transition Control. In: Proceedings of the American Control Conference, Denver, Colorado, June 4-6 (2003)
  2. Lang, B.: Monotonic Multi-layer Perceptron Networks as Universal Approximators. In: Duch, W., Kacprzyk, J., Oja, E., Zadrożny, S. (eds.) ICANN 2005. LNCS, vol. 3697, pp. 31–37. Springer, Heidelberg (2005)
  3. Zhang, H., Zhang, Z.: Feedforward networks with monotone constraints. In: IEEE International Joint Conference on Neural Networks IJCNN 1999, Washington, DC, USA, vol. 3, pp. 1820–1823 (1999)
  4. Sill, J.: Monotonic Networks. In: Advances in Neural Information Processing Systems, Cambridge, MA, vol. 10, pp. 661–667 (1998)
  5. Sill, J., Abu-Mostafa, Y.S.: Monotonicity Hints. In: Advances in Neural Information Processing Systems, Cambridge, MA, vol. 9, pp. 634–640 (1997)
  6. Kay, H., Ungar, L.H.: Estimating monotonic functions and their bounds. AIChE J. 46, 2426
  7. Tarca, L.A., Grandjean, B.P.A., Larachi, F.: Embedding monotonicity and concavity information in the training of multiphase flow neural network correlations by means of genetic algorithms. Computers and Chemical Engineering 28(9), 1701–1713 (2004)

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Alexey Minin (1)
  • Bernhard Lang (2)
  1. Saint-Petersburg State University
  2. OOO Siemens, Fault Analysis and Prevention group, 191186, Russia, Saint-Petersburg, Volynskiy per. dom 3A liter A
