
A least third-order cumulants objective function


Abstract

A novel Least Cumulants Method is proposed for fitting an underlying function to small data sets with high noise levels, exploiting the unique ability of higher-order statistics to suppress Gaussian noise processes of unknown spectral characteristics. The standard backpropagation algorithm is in effect a Least Squares Method, which does not perform well on noisy data sets. The proposed method is more robust to noise because it introduces a completely new objective function based on higher-order statistics. The objective function was validated on the benchmark sunspot prediction task, where it yielded excellent results: very low training error together with excellent generalization. Our results indicate that a network trained with the proposed objective function achieves up to a 73% reduction in normalized test error on the benchmark.
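
The noise-suppression property invoked above is a standard result in higher-order statistics: for a zero-mean stationary sequence e(k), the third-order cumulant C3(t1, t2) = E[e(k) e(k+t1) e(k+t2)] vanishes identically when e(k) is Gaussian, whatever its power spectrum, so a criterion built from third-order cumulants of the residuals is insensitive to additive Gaussian noise. The short sketch below illustrates only this property, not the paper's training algorithm: the function third_order_cumulant, the simulated misfit and noise sequences, and all numeric choices are hypothetical, and the authors' actual objective function is not reproduced here.

import numpy as np

rng = np.random.default_rng(0)

def third_order_cumulant(e, tau1=0, tau2=0):
    # Empirical third-order cumulant C3(tau1, tau2) =
    # E[e(k) e(k + tau1) e(k + tau2)] of a mean-removed sequence.
    e = np.asarray(e, dtype=float)
    e = e - e.mean()
    n = e.size - max(tau1, tau2)
    return float(np.mean(e[:n] * e[tau1:tau1 + n] * e[tau2:tau2 + n]))

# A skewed (non-Gaussian) misfit component plus additive Gaussian noise
# of unspecified variance: the mean-square error absorbs both, while the
# third-order cumulant responds only to the misfit.
size = 20000
misfit = rng.exponential(1.0, size) - 1.0    # zero-mean, skewed
noise = rng.normal(0.0, 2.0, size)           # Gaussian, variance "unknown" to the learner

for label, e in [("misfit only", misfit), ("misfit + Gaussian noise", misfit + noise)]:
    print(f"{label:>24}: MSE = {np.mean(e**2):6.3f}, C3(0,0) = {third_order_cumulant(e):6.3f}")

For this toy misfit the theoretical values are MSE = 1 and C3(0,0) = 2; adding the Gaussian noise raises the MSE to about 5 but leaves C3(0,0) approximately unchanged, which is precisely the behaviour a cumulant-based objective exploits.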




Cite this article

Leung, C.T., Chow, T.W.S. & Yam, Y.F. A least third-order cumulants objective function. Neural Process Lett 3, 91–99 (1996). https://doi.org/10.1007/BF00571682
