Learning Activation Functions from Data Using Cubic Spline Interpolation

  • Simone Scardapane
  • Michele Scarpiniti
  • Danilo Comminiello
  • Aurelio Uncini
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 102)


Neural networks require careful design to perform well on a given task. In particular, selecting a good activation function (possibly in a data-dependent fashion) is a crucial step, which remains an open problem in the research community. Despite a large body of research, most current implementations simply select one fixed function from a small set of candidates, which is not adapted during training and is shared among all neurons throughout the different layers. However, neither of these assumptions can be considered optimal in practice. In this paper, we present a principled way to adapt the activation functions to the data, independently for each neuron. This is achieved by leveraging past and present advances in cubic spline interpolation, allowing local adaptation of the functions around their regions of use. The resulting algorithm is relatively cheap to implement, and overfitting is counterbalanced by the inclusion of a novel damping criterion, which penalizes unwanted oscillations away from a predefined shape. Preliminary experimental results validate the proposal.


Keywords: Neural network · Activation function · Spline interpolation
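To make the idea in the abstract concrete, the sketch below evaluates a spline activation function from a vector of control points, which would be the per-neuron parameters adapted during training. The Catmull-Rom basis, the knot spacing `dx`, the tanh initialization, and the function name are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

# Catmull-Rom basis matrix, a common choice for spline activation
# functions in the adaptive-spline literature (illustrative assumption).
CR = 0.5 * np.array([[-1.,  3., -3.,  1.],
                     [ 2., -5.,  4., -1.],
                     [-1.,  0.,  1.,  0.],
                     [ 0.,  2.,  0.,  0.]])

def spline_activation(s, q, dx=0.2):
    """Evaluate a uniform cubic Catmull-Rom spline at the scalar input s.

    q  : control-point ordinates on a uniform grid centered at 0
         (the adaptable per-neuron parameters).
    dx : knot spacing of the uniform grid.
    """
    n = len(q)
    # Map s to a span index i and a local abscissa u in [0, 1).
    z = s / dx + (n - 1) / 2.0
    i = int(np.clip(np.floor(z), 1, n - 3))  # keep 4 control points in range
    u = z - i                                # outside the grid this extrapolates
    u_vec = np.array([u**3, u**2, u, 1.0])
    return u_vec @ CR @ q[i - 1:i + 3]

# Initializing q by sampling tanh gives a standard sigmoidal shape that
# training (and the paper's damping criterion) can then deform locally.
q_init = np.tanh(0.2 * (np.arange(21) - 10))
```

Since a Catmull-Rom spline interpolates its control points, the function reproduces the initial shape exactly at the knots, and gradient updates to `q` only bend it locally between them.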



Copyright information

© Springer International Publishing AG, part of Springer Nature 2019

Authors and Affiliations

  • Simone Scardapane (1)
  • Michele Scarpiniti (1)
  • Danilo Comminiello (1)
  • Aurelio Uncini (1)

  1. Department of Information Engineering, Electronics and Telecommunications (DIET), "Sapienza" University of Rome, Rome, Italy
