
Approximating Noncontinuous Functions


Abstract

This chapter discusses the neural network approximation of noncontinuous functions. This is currently a problematic area for neural networks because network training is based on calculating partial derivatives of the error function (using the gradient descent algorithm), and calculating them for noncontinuous functions at points where the function value suddenly jumps or drops leads to questionable results. You will dig deeper into this issue in this chapter. The chapter also includes a method I developed that resolves this issue.
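The derivative problem described above can be illustrated with a short numerical sketch in Java. The function, the location of the jump, and the step size below are illustrative choices of mine, not taken from the chapter: a finite-difference slope estimate behaves normally away from a jump discontinuity but blows up when the difference quotient straddles it, which is the kind of questionable derivative value the abstract refers to.

```java
public class JumpDerivativeDemo {

    // An illustrative noncontinuous function with a jump of size 5 at x = 1:
    // f(x) = x for x < 1, and f(x) = x + 5 for x >= 1.
    static double f(double x) {
        return x < 1.0 ? x : x + 5.0;
    }

    // Symmetric finite-difference approximation of f'(x) with step h.
    static double slope(double x, double h) {
        return (f(x + h) - f(x - h)) / (2.0 * h);
    }

    public static void main(String[] args) {
        double h = 1e-4;
        // Away from the jump the estimate matches the true slope of 1.
        System.out.println("slope at x=0.5: " + slope(0.5, h));
        // Straddling the jump, the jump size (5) divided by 2h dominates,
        // so the estimate grows without bound as h shrinks.
        System.out.println("slope at x=1.0: " + slope(1.0, h));
    }
}
```

Shrinking `h` makes the second estimate arbitrarily large (for `h = 1e-4` it is about 25001), so a gradient-based trainer sees an enormous, step-size-dependent "gradient" at the discontinuity rather than anything resembling the function's local behavior.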



Copyright information

© 2019 Igor Livshin

About this chapter


Cite this chapter

Livshin, I. (2019). Approximating Noncontinuous Functions. In: Artificial Neural Networks with Java. Apress, Berkeley, CA. https://doi.org/10.1007/978-1-4842-4421-0_8

