Pattern Classification with the Probabilistic Neural Networks Based on Orthogonal Series Kernel

Conference paper

DOI: 10.1007/978-3-319-40663-3_58

Part of the Lecture Notes in Computer Science book series (LNCS, volume 9719)
Cite this paper as:
Savchenko A.V. (2016) Pattern Classification with the Probabilistic Neural Networks Based on Orthogonal Series Kernel. In: Cheng L., Liu Q., Ronzhin A. (eds) Advances in Neural Networks – ISNN 2016. ISNN 2016. Lecture Notes in Computer Science, vol 9719. Springer, Cham

Abstract

The probabilistic neural network (PNN) is a well-known instance-based learning algorithm that is widely used in various pattern classification and regression tasks when only a rather small number of instances per class is available. A known disadvantage of this network is the high computational complexity of classification. A common way to overcome this drawback is to apply reduction techniques that select the most typical instances. Such an approach biases the estimates of the class probability distributions and, in turn, decreases the classification accuracy. In this paper we examine another possible solution: we replace the Gaussian Parzen window kernel with the orthogonal series Fejér kernel and use the naïve assumption of feature independence. It is shown that our approach achieves much better runtime complexity than either the original PNN or its modification with preliminary clustering of the training set.
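To illustrate the idea, the sketch below shows a minimal orthogonal-series density classifier in the spirit of the abstract: per-class, per-feature cosine-series coefficients with Fejér (Cesàro) weights are precomputed from the training set, so classification cost depends on the number of series terms rather than the number of stored instances. This is only an assumed toy reconstruction (basis choice, scaling of features to [0, 1], and the helper names fejer_series_coeffs, fejer_density, classify are illustrative), not the authors' implementation.

```python
import numpy as np

def fejer_series_coeffs(samples, m):
    """Per-feature cosine-series coefficients with Fejér (Cesàro) weights.

    `samples` is a 1-D array of feature values scaled to [0, 1].
    Returns weighted coefficients c_k, k = 1..m (the constant term c_0 = 1).
    """
    k = np.arange(1, m + 1)
    # Empirical coefficients of the orthonormal basis sqrt(2) * cos(pi * k * x).
    c = np.sqrt(2) * np.cos(np.pi * np.outer(k, samples)).mean(axis=1)
    # Fejér (Cesàro) weights damp the higher-order terms of the series.
    return (1.0 - k / (m + 1)) * c

def fejer_density(x, coeffs):
    """Evaluate the orthogonal-series density estimate at scalar x in [0, 1]."""
    k = np.arange(1, len(coeffs) + 1)
    value = 1.0 + np.sqrt(2) * np.dot(coeffs, np.cos(np.pi * k * x))
    return max(value, 1e-12)  # clip: truncated series can dip below zero

def classify(x, class_coeffs):
    """Naive-independence decision: product of per-feature series densities."""
    scores = {
        label: np.prod([fejer_density(xj, cj) for xj, cj in zip(x, coeffs)])
        for label, coeffs in class_coeffs.items()
    }
    return max(scores, key=scores.get)

# Toy usage: two classes, two features, values already scaled to [0, 1].
rng = np.random.default_rng(0)
train = {
    0: rng.beta(2, 5, size=(30, 2)),  # class 0 instances
    1: rng.beta(5, 2, size=(30, 2)),  # class 1 instances
}
m = 8  # number of series terms; decision cost grows with m, not with n
class_coeffs = {
    label: [fejer_series_coeffs(X[:, j], m) for j in range(X.shape[1])]
    for label, X in train.items()
}
print(classify(np.array([0.2, 0.3]), class_coeffs))
```

Unlike the Parzen-window PNN, which sums a Gaussian kernel over every stored instance at decision time, this estimate folds the whole training sample into a fixed set of coefficients, which is the source of the runtime advantage claimed in the abstract.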

Keywords

Pattern classification · Small sample size problem · Probabilistic neural network (PNN) · Orthogonal series kernel · Nonparametric density estimates

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  1. Laboratory of Algorithms and Technologies for Network Analysis, National Research University Higher School of Economics, Nizhny Novgorod, Russia