Learning Activation Functions from Data Using Cubic Spline Interpolation

Part of the book series: Smart Innovation, Systems and Technologies (SIST, volume 102)

Abstract

Neural networks require careful design in order to perform properly on a given task. In particular, selecting a good activation function (possibly in a data-dependent fashion) is a crucial step, which remains an open problem in the research community. Despite a large body of investigation, most current implementations simply select one fixed function from a small set of candidates, which is not adapted during training and is shared among all neurons throughout the different layers. However, neither of these assumptions can be supposed optimal in practice. In this paper, we present a principled way to perform data-dependent adaptation of the activation functions, carried out independently for each neuron. This is achieved by leveraging past and present advances in cubic spline interpolation, allowing for local adaptation of the functions around their regions of use. The resulting algorithm is relatively cheap to implement, and overfitting is counterbalanced by the inclusion of a novel damping criterion, which penalizes unwanted oscillations from a predefined shape. Preliminary experimental results validate the proposal.
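The core idea of the abstract can be made concrete with a small sketch of a spline activation function (SAF). The sketch below assumes Catmull-Rom cubic splines over uniformly spaced control points, a common choice in the spline adaptive filtering literature, and a damping term that measures deviation of the control points from a fixed reference shape (here a sampled tanh). Names such as `SplineActivation` and `damping_penalty` are illustrative, not taken from the chapter.

```python
import numpy as np

# Catmull-Rom cubic spline basis matrix (one common choice for SAFs).
CR = 0.5 * np.array([[-1.0,  3.0, -3.0,  1.0],
                     [ 2.0, -5.0,  4.0, -1.0],
                     [-1.0,  0.0,  1.0,  0.0],
                     [ 0.0,  2.0,  0.0,  0.0]])

class SplineActivation:
    """One neuron's activation, parameterized by spline control points."""

    def __init__(self, x_min=-2.0, x_max=2.0, n_points=21):
        self.knots = np.linspace(x_min, x_max, n_points)
        self.dx = self.knots[1] - self.knots[0]
        # Initialize control points by sampling a base shape (here tanh);
        # in the full algorithm these become trainable parameters.
        self.q0 = np.tanh(self.knots)   # fixed reference shape
        self.q = self.q0.copy()         # adaptable control points

    def __call__(self, s):
        s = np.asarray(s, dtype=float)
        # Locate the spline span containing s and the local coordinate u.
        i = np.clip(((s - self.knots[0]) / self.dx).astype(int),
                    1, len(self.knots) - 3)
        u = (s - self.knots[i]) / self.dx
        U = np.stack([u**3, u**2, u, np.ones_like(u)], axis=-1)
        Q = np.stack([self.q[i - 1], self.q[i],
                      self.q[i + 1], self.q[i + 2]], axis=-1)
        # Evaluate U @ CR @ Q element-wise over any batch shape.
        return np.einsum('...k,kj,...j->...', U, CR, Q)

    def damping_penalty(self):
        # Penalize deviation of the control points from the reference
        # shape, discouraging spurious oscillations of the learned curve.
        return np.sum((self.q - self.q0) ** 2)
```

In the full algorithm, the control points `q` would be updated by gradient descent together with the network weights, with `damping_penalty` added (suitably weighted) to the training loss; local adaptation arises because each input only touches the four control points around it.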


Notes

  1. We note that the following treatment can be extended easily to the case of a network with more than one hidden layer. However, restricting it to a single layer allows us to keep the discussion focused on the problems and advantages arising in the use of SAFs. We leave this extension to future work.

  2. http://www.dcc.fc.up.pt/~ltorgo/Regression/cal_housing.html.

  3. http://learning.eng.cam.ac.uk/carl/code/minimize/.

  4. https://bitbucket.org/ispamm/spline-nn.


Author information

Correspondence to Simone Scardapane.


Copyright information

© 2019 Springer International Publishing AG, part of Springer Nature

About this chapter

Cite this chapter

Scardapane, S., Scarpiniti, M., Comminiello, D., Uncini, A. (2019). Learning Activation Functions from Data Using Cubic Spline Interpolation. In: Esposito, A., Faundez-Zanuy, M., Morabito, F., Pasero, E. (eds) Neural Advances in Processing Nonlinear Dynamic Signals. WIRN 2017. Smart Innovation, Systems and Technologies, vol 102. Springer, Cham. https://doi.org/10.1007/978-3-319-95098-3_7
