
Evenet 2000: Designing and Training Arbitrary Neural Networks in Java

  • Evelio J. González
  • Alberto F. Hamilton
  • Lorenzo Moreno
  • José F. Sigut
  • Roberto L. Marichal
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2085)

Abstract

In this paper, Evenet-2000, a Java-based neural network toolkit, is presented. It is based on the representation of an arbitrary neural network as a block diagram (the blocks being, for example, summing junctions or branch points) together with a set of simple manipulation rules. With this toolkit, users can easily design and train any arbitrary neural network, including time-dependent ones, avoiding the complicated calculations otherwise needed to derive the gradient algorithm whenever a new network architecture is designed. Evenet-2000 consists of three parts: a calculation library, a user-friendly interface and a graphic network editor, all of which benefit from the advantages of Java: encapsulation, inheritance and powerful libraries.
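
The block-diagram view described in the abstract can be illustrated with a short sketch: each block exposes a local forward rule and a local gradient rule, and gradients for an arbitrary topology follow from composing these local rules. The interface and class names below (Block, SummingJunction, BranchPoint) are illustrative assumptions made for this sketch, not the actual Evenet-2000 API.

// BlockSketch.java -- minimal sketch of the block-diagram idea (assumed names,
// not the Evenet-2000 API): a network is built from blocks, each with a local
// forward rule and a local backward (gradient) rule.
public final class BlockSketch {

    /** A network element with a local forward rule and a local gradient rule. */
    interface Block {
        double[] forward(double[] inputs);
        double[] backward(double[] upstreamGrad);
    }

    /** Summing junction: y = x1 + x2 + ...; every input receives the upstream gradient unchanged. */
    static final class SummingJunction implements Block {
        private int arity;
        public double[] forward(double[] inputs) {
            arity = inputs.length;
            double sum = 0.0;
            for (double x : inputs) sum += x;
            return new double[] { sum };
        }
        public double[] backward(double[] g) {
            double[] grads = new double[arity];
            java.util.Arrays.fill(grads, g[0]);   // dy/dx_i = 1 for a sum
            return grads;
        }
    }

    /** Branch point: one input copied to several outputs; gradients from the branches are summed. */
    static final class BranchPoint implements Block {
        private final int fanOut;
        BranchPoint(int fanOut) { this.fanOut = fanOut; }
        public double[] forward(double[] input) {
            double[] out = new double[fanOut];
            java.util.Arrays.fill(out, input[0]);
            return out;
        }
        public double[] backward(double[] g) {
            double sum = 0.0;
            for (double gi : g) sum += gi;     // gradients accumulate at a branch point
            return new double[] { sum };
        }
    }

    public static void main(String[] args) {
        // Tiny example: branch x into two paths, then sum them back, i.e. y = 2x.
        Block branch = new BranchPoint(2);
        Block sum = new SummingJunction();
        double x = 3.0;
        double[] branches = branch.forward(new double[] { x });
        double y = sum.forward(branches)[0];                      // y = 6.0
        double[] gBranches = sum.backward(new double[] { 1.0 });  // dy/d(branch_i) = 1
        double dydx = branch.backward(gBranches)[0];              // dy/dx = 2.0
        System.out.printf("y = %.1f, dy/dx = %.1f%n", y, dydx);
    }
}

Running this sketch recovers dy/dx = 2 for the trivial graph y = 2x purely from the two local backward rules, which is the spirit of the per-block gradient derivation the toolkit automates for arbitrary architectures.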

Keywords

Neural Network, Gradient Algorithm, Graphic Editor, Signal Flow Graph, Calculation Library

Copyright information

© Springer-Verlag Berlin Heidelberg 2001

Authors and Affiliations

  • Evelio J. González (1)
  • Alberto F. Hamilton (1)
  • Lorenzo Moreno (1)
  • José F. Sigut (1)
  • Roberto L. Marichal (1)
  1. Department of Applied Physics, University of La Laguna, Tenerife, Spain
