Teacher-Directed Learning with Gaussian and Sigmoid Activation Functions

  • Ryotaro Kamimura
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3316)

Abstract

In this paper, we propose a new computational method for information-theoretic competitive learning that maximizes information about target patterns as well as input patterns. We call the method teacher-directed learning, because target information directs networks to produce appropriate outputs. In our previous method, we used sigmoid functions to activate competitive units, and we found that the method could not increase information for some problems. To remedy this shortcoming, we use Gaussian activation functions to simulate competition in the intermediate layer, because changing the width of the functions accelerates the information maximization process. In the output layer, we use ordinary sigmoid functions to produce outputs. We applied our method to two problems: an artificial data problem and a chemical data problem. In both cases, we show that information can be significantly increased with the Gaussian functions, and that better generalization performance can be obtained.
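The core quantities described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the unit centers `W`, the normalization of Gaussian outputs into firing probabilities, and the function names are assumptions; the information measure is the standard mutual information between input patterns and competitive units, with the Gaussian width `sigma` controlling the sharpness of competition.

```python
import numpy as np

def gaussian_competition(X, W, sigma):
    """Normalized Gaussian activations of competitive units.

    X: (S, D) input patterns; W: (M, D) unit centers (illustrative names).
    Returns p(j | s), the firing probability of unit j for pattern s.
    """
    # Squared Euclidean distances between each pattern and each unit center, shape (S, M).
    d2 = ((X[:, None, :] - W[None, :, :]) ** 2).sum(axis=-1)
    a = np.exp(-d2 / (2.0 * sigma ** 2))          # Gaussian unit outputs
    return a / a.sum(axis=1, keepdims=True)       # normalize over units

def unit_information(p_j_given_s):
    """Information captured by the competitive units:
    I = H(p(j)) - (1/S) * sum_s H(p(j|s)), i.e. marginal minus mean conditional entropy.
    """
    eps = 1e-12                                   # guard against log(0)
    p_j = p_j_given_s.mean(axis=0)                # marginal firing rates p(j)
    h_marginal = -(p_j * np.log(p_j + eps)).sum()
    h_cond = -(p_j_given_s * np.log(p_j_given_s + eps)).sum(axis=1).mean()
    return h_marginal - h_cond
```

Narrowing `sigma` sharpens each pattern's firing distribution toward a single winning unit, which lowers the conditional entropy and drives the information toward its maximum of log(number of units); widening `sigma` flattens competition and drives it toward zero.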



Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Ryotaro Kamimura
  1. Information Science Laboratory, Tokai University, Hiratsuka, Kanagawa, Japan
