Supervised Adaptive Resonance Theory and Rules

Chapter in the book Innovations in ART Neural Networks

Part of the book series: Studies in Fuzziness and Soft Computing (STUDFUZZ, volume 43)

Abstract

Supervised Adaptive Resonance Theory (ART) is a family of neural networks that performs incremental supervised learning of recognition categories (pattern classes) and multidimensional maps of both binary and analog patterns. This chapter highlights that the supervised ART architecture is compatible with IF-THEN rule-based symbolic representation. Specifically, the knowledge learned by a supervised ART system can be readily translated into rules for interpretation. Conversely, a priori domain knowledge in the form of IF-THEN rules can be converted into a supervised ART architecture. Not only does initializing networks with prior knowledge improve predictive accuracy and learning efficiency, but the inserted symbolic knowledge can also be refined and enhanced by the supervised ART learning algorithm. Because the symbolic rule form is preserved during learning, the rules extracted from a supervised ART system can be compared directly with the originally inserted rules.
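
As a rough illustration of the rule translation described in the abstract, the Python sketch below converts a single learned category weight vector of a fuzzy ARTMAP-style network into an interval-based IF-THEN rule. This is not code from the chapter: it assumes complement-coded inputs, the usual hyperbox reading of category weights, and hypothetical names such as category_to_rule and the example feature labels.

```python
import numpy as np

def category_to_rule(weight, class_label, feature_names=None):
    """Translate one learned category weight vector into an IF-THEN rule.

    Assumes complement coding: weight = (u, 1 - v), so the category covers
    the hyperbox [u_i, v_i] along each original feature i.
    """
    weight = np.asarray(weight, dtype=float)
    m = weight.size // 2                 # number of original (uncoded) features
    lower = weight[:m]                   # u: lower corners of the hyperbox
    upper = 1.0 - weight[m:]             # v: upper corners, recovered from 1 - v
    if feature_names is None:
        feature_names = [f"x{i}" for i in range(m)]
    conditions = [
        f"{name} in [{lo:.2f}, {hi:.2f}]"
        for name, lo, hi in zip(feature_names, lower, upper)
    ]
    return "IF " + " AND ".join(conditions) + f" THEN class = {class_label}"

# Hypothetical category learned on two analog features, mapped to class "A".
w = np.array([0.20, 0.50, 0.70, 0.40])   # (u1, u2, 1 - v1, 1 - v2)
print(category_to_rule(w, "A", ["temperature", "pressure"]))
# IF temperature in [0.20, 0.30] AND pressure in [0.50, 0.60] THEN class = A
```

In the reverse direction, a rule with an interval condition on each feature could be encoded as an initial category weight by the same correspondence (lower bounds followed by one minus the upper bounds), which is one way a priori rules might be inserted into the network before training.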

Copyright information

© 2000 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Tan, A.-H. (2000). Supervised Adaptive Resonance Theory and Rules. In: Jain, L.C., Lazzerini, B., Halici, U. (eds) Innovations in ART Neural Networks. Studies in Fuzziness and Soft Computing, vol 43. Physica, Heidelberg. https://doi.org/10.1007/978-3-7908-1857-4_4

  • DOI: https://doi.org/10.1007/978-3-7908-1857-4_4

  • Publisher Name: Physica, Heidelberg

  • Print ISBN: 978-3-7908-2469-8

  • Online ISBN: 978-3-7908-1857-4
