Switching Neural Networks: A New Connectionist Model for Classification

  • Marco Muselli
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3931)

Abstract

A new connectionist model for the solution of classification problems, called Switching Neural Network (SNN), is presented. The first layer of an SNN contains a particular kind of A/D converters, called latticizers, that suitably transform input vectors into binary strings. The subsequent two layers then realize a positive Boolean function that solves the original classification problem in a lattice domain.

Every function realized by an SNN can be written in terms of intelligible rules. Training can be performed by adopting a proper method for positive Boolean function reconstruction, called Shadow Clustering (SC). Simulation results obtained on the StatLog benchmark show the good quality of the SNNs trained with SC.
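The architecture described above can be illustrated with a minimal sketch: a latticizer discretizes each continuous input into a monotone (thermometer) binary code, and a positive Boolean function, written here as a disjunction of logical products over the resulting binary string, yields the class decision. The thresholds, rule terms, and function names below are illustrative assumptions, not the actual method or values from the paper.

```python
def latticize(x, thresholds):
    """Thermometer-code a scalar: one bit per threshold, monotone in x."""
    return tuple(1 if x >= t else 0 for t in thresholds)

def encode(vec, all_thresholds):
    """Concatenate the latticized codes of all input components."""
    bits = []
    for x, ts in zip(vec, all_thresholds):
        bits.extend(latticize(x, ts))
    return tuple(bits)

def positive_dnf(bits, terms):
    """Positive Boolean function as a monotone DNF: an OR of AND terms,
    each term being a set of (uncomplemented) bit indices."""
    return any(all(bits[i] for i in term) for term in terms)

# Illustrative example: two features with arbitrarily chosen thresholds,
# and two hand-written AND terms, i.e. (b0 AND b2) OR b1.
thresholds = [(0.3, 0.6), (0.5,)]
terms = [{0, 2}, {1}]

bits = encode((0.7, 0.4), thresholds)   # -> (1, 1, 0)
print(positive_dnf(bits, terms))        # -> True, via the term {1}
```

Because each AND term references only uncomplemented bits of a monotone code, every term translates directly into an intelligible threshold rule on the original inputs, which is the property the paper exploits.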

Keywords

Binary String · Connectionist Model · Logical Product · Boolean Lattice · Multiclass Problem

References

  1. Vapnik, V.N.: Statistical Learning Theory. John Wiley & Sons, New York (1998)
  2. Muselli, M., Quarati, A.: Reconstructing positive Boolean functions with Shadow Clustering. In: Proceedings of the 17th European Conference on Circuit Theory and Design (ECCTD 2005), Cork, Ireland (August 2005)
  3. Muselli, M.: Approximation Properties of Positive Boolean Functions. In: Apolloni, B., Marinaro, M., Nicosia, G., Tagliaferri, R. (eds.) WIRN 2005 and NAIS 2005. LNCS, vol. 3931, pp. 18–22. Springer, Heidelberg (2006)
  4. Kohavi, R., Sahami, M.: Error-based and entropy-based discretization of continuous features. In: Proceedings of the Second International Conference on Knowledge Discovery and Data Mining, pp. 114–119 (1996)
  5. Liu, H., Setiono, R.: Feature selection via discretization. IEEE Transactions on Knowledge and Data Engineering 9, 642–645 (1997)
  6. Boros, E., Hammer, P.L., Ibaraki, T., Kogan, A., Mayoraz, E., Muchnik, I.: An Implementation of Logical Analysis of Data. IEEE Transactions on Knowledge and Data Engineering 12, 292–306 (2000)
  7. Muselli, M., Liberati, D.: Binary rule generation via Hamming Clustering. IEEE Transactions on Knowledge and Data Engineering 14, 1258–1268 (2002)
  8. Michie, D., Spiegelhalter, D., Taylor, C. (eds.): Machine Learning, Neural, and Statistical Classification. Ellis-Horwood, London (1994)
  9. Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann, San Francisco (1994)
  10. Hong, S.J.: R-MINI: An Iterative Approach for Generating Minimal Rules from Examples. IEEE Transactions on Knowledge and Data Engineering 9, 709–717 (1997)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Marco Muselli
  1. Istituto di Elettronica e di Ingegneria dell’Informazione e delle Telecomunicazioni, Consiglio Nazionale delle Ricerche, Genova, Italy
