The present chapter describes applications of error-entropy (and entropy-inspired) risks to a variety of classification tasks performed by more sophisticated machines than those considered in the preceding chapters. These include multi-layer perceptrons (MLPs), recurrent neural networks (RNNs), complex-valued neural networks (CVNNs), modular neural networks (MNNs), and decision trees. We also present a clustering algorithm based on an MEE-like concept, LEGClust, which is used in building MNNs. Besides implementation issues, an extensive set of experimental results and comparisons with non-EE approaches is presented. Since the respective learning algorithms use the empirical versions of the risks, the corresponding acronyms (MSE, CE, SEE, and so forth) labeling tables and graphs of results refer from now on to the empirical versions of the risks.
Keywords: Recurrent Neural Network · Spectral Clustering · Dissimilarity Matrix · Proximity Matrix · Spectral Clustering Algorithm