Approximating Noncontinuous Functions

  • Igor Livshin


This chapter will discuss the neural network approximation of noncontinuous functions. Currently, this is a problematic area for neural networks because network training is based on calculating partial function derivatives (using the gradient descent algorithm), and calculating them for noncontinuous functions at the points where the function value suddenly jumps or drops leads to questionable results. You will dig deeper into this issue in this chapter. The chapter also includes a method I developed that solves this issue.
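To make the difficulty concrete, here is a minimal sketch of a noncontinuous function (the function, class name, and jump point are illustrative assumptions, not taken from the chapter). The value leaps at a single point, so the derivative is undefined there, which is exactly what troubles gradient-based training.

```java
// Illustrative sketch only: a step-like noncontinuous function with an
// assumed jump at x = 1.0, where the value leaps from x to x + 5.
public class JumpFunction {

    // f(x) = x for x < 1.0, and f(x) = x + 5 for x >= 1.0.
    // The derivative does not exist at x = 1.0, so gradient descent
    // behaves unpredictably for inputs near the jump.
    public static double f(double x) {
        return (x < 1.0) ? x : x + 5.0;
    }

    public static void main(String[] args) {
        System.out.println(f(0.999)); // just before the jump
        System.out.println(f(1.0));   // just after the jump
    }
}
```

Approaching the jump from the left and from the right gives values about 5 apart, even though the inputs differ by an arbitrarily small amount.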

Copyright information

© Igor Livshin 2019

Authors and Affiliations

  • Igor Livshin
    • Chicago, USA