## 1 Correction to: Neural Computing and Applications https://doi.org/10.1007/s00521-020-05182-1

### 1.1 Introduction

Here we provide an addendum on SPOCU fitting and an erratum to the article “SPOCU”: scaled polynomial constant unit activation function, https://doi.org/10.1007/s00521-020-05182-1.

Following the publication of our article [1], we have become aware that the legend of Fig. 8b should read as follows: generator *h*(·) of the activation function *S*. Moreover, the following Fig. 1 provides the graph of the activation function *S* (*c* = ∞).

The SPOCU activation function is given by

$$S(x) = \alpha\, h\!\left(\frac{x}{\gamma} + \beta\right) - \alpha\, h(\beta),$$

where *β* ∈ (0,1)*, α, γ* > 0 and

$$h(x) = \begin{cases} r(c), & x \ge c, \\ r(x), & 0 \le x < c, \\ 0, & x < 0, \end{cases}$$

with *r*(*x*) = *x*^{3}(*x*^{5} − 2*x*^{4} + 2) and 1 ≤ *c* < ∞ (we also admit *c* → ∞, in which case *r*(*c*) → ∞).

Shortly after publication of our article [1], we received questions from the community on how to implement SPOCU, since it is not included in current software packages, e.g. in Matlab or Python (Keras). Here we provide a short guide on how to select the parameters *α, β, γ* and *c*.
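SPOCU is straightforward to implement directly from its definition. The following is a minimal NumPy sketch (not the authors' reference implementation), assuming *S*(*x*) = *α h*(*x*/*γ* + *β*) − *α h*(*β*) with *h*(*x*) = *r*(min(*x*, *c*)) for *x* ≥ 0 and *h*(*x*) = 0 for *x* < 0, as in [1]; the default parameter values are illustrative only:

```python
import numpy as np

def r(x):
    """Polynomial r(x) = x^3 (x^5 - 2x^4 + 2) from [1]."""
    return x**3 * (x**5 - 2 * x**4 + 2)

def spocu(x, alpha=1.0, beta=0.5, gamma=1.0, c=1.0):
    """SPOCU activation: S(x) = alpha * h(x/gamma + beta) - alpha * h(beta)."""
    def h(u):
        u = np.minimum(u, c)                  # constant part r(c) for u >= c
        return np.where(u >= 0.0, r(u), 0.0)  # zero on the negative half-line
    x = np.asarray(x, dtype=float)
    return alpha * h(x / gamma + beta) - alpha * h(beta)
```

By construction *S*(0) = 0, and passing `c=np.inf` reproduces the unbounded variant *S* (*c* = ∞) shown in Fig. 1.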

The parametric space of SPOCU is (*α, β, γ, c*) ∈ (0*,*∞) × (0*,*1) × (0*,*∞) × (1*,*∞] =: P. For a proper choice of the parameters, it is sufficient that they belong to P and satisfy several conditions. One set of conditions has been outlined in [1], which requires that^{Footnote 1}

hold. Here *f*(*z*) is the pdf of the underlying distribution, *J* is the Jacobian matrix with respect to the mean and variance parameters, and ||·|| is an appropriate matrix norm, e.g. *L*_{1} or *L*_{2}, as considered in the application.

However, other sets of conditions may be more attractive to the reader; it all depends on the aims and the type of network. For example, one can, similarly to Section 5.1 of [1], fix *c* = *γ* = 1. Then, to compute *α* and *β*, one can consider the Pareto density *g*(*x*) = *a*(*x* + 1)^{−a−1}*, x* > 0*, a* > 1, and require for some *a* > 1

with the additional condition \(S'(0)=1\), which reads \(2\alpha\beta^{2}(4\beta^{5} - 7\beta^{4} + 3) = \gamma\). Thus, for the Pareto tail parameter *a* = 4 we obtained in the Maple software *α* = 0*.*8874425243 and *β* = 0*.*4495811364*.*
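The slope condition can be cross-checked numerically. The sketch below (again assuming *S*(*x*) = *α h*(*x*/*γ* + *β*) − *α h*(*β*) with *h*(*u*) = *r*(min(*u*, *c*)) for *u* ≥ 0 and 0 otherwise; the helper names are ours) compares the closed-form expression for *S*′(0), obtained from *r*′(*x*) = 2*x*^{2}(4*x*^{5} − 7*x*^{4} + 3), with a central finite-difference estimate at the reported parameter values:

```python
alpha, beta, gamma, c = 0.8874425243, 0.4495811364, 1.0, 1.0

def r(x):
    return x**3 * (x**5 - 2 * x**4 + 2)

def S(x):
    # SPOCU with h(u) = r(min(u, c)) for u >= 0 and 0 otherwise
    h = lambda u: r(min(u, c)) if u >= 0.0 else 0.0
    return alpha * h(x / gamma + beta) - alpha * h(beta)

# Closed form: S'(0) = (alpha / gamma) * r'(beta), r'(x) = 2x^2 (4x^5 - 7x^4 + 3)
closed_form = 2 * alpha * beta**2 * (4 * beta**5 - 7 * beta**4 + 3) / gamma
# Central finite difference around 0
eps = 1e-6
numeric = (S(eps) - S(-eps)) / (2 * eps)
```

Both values come out ≈ 1 = *γ*, consistent with the choice of *α* and *β* for *a* = 4.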

## Notes

Standard normality implies a pdf of the form \(f(z) = \frac{1}{\sqrt{2\pi}}\,\mathrm{e}^{-z^{2}/2}\).

## Reference

1. Kiseľák J, Lu Y, Švihra J, Szépe P, Stehlík M (2020) “SPOCU”: scaled polynomial constant unit activation function. Neural Comput Appl. https://doi.org/10.1007/s00521-020-05182-1

## Acknowledgements

The authors acknowledge the professional support of the Editor-in-Chief, Professor John Macintyre.

## Additional information

### Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

## About this article

### Cite this article

Kiseľák, J., Lu, Y., Švihra, J. *et al.* Correction to: “SPOCU”: scaled polynomial constant unit activation function.
*Neural Comput & Applic* **33**, 1749–1750 (2021). https://doi.org/10.1007/s00521-020-05412-6
