ConSTrainer: A Generic Toolkit for Connectionist Dataset Selection
Abstract
ConSTrainer is a window-based toolkit dedicated to the task of collecting and validating datasets for training connectionist networks. Unlike other connectionist development tools, ConSTrainer is an application- and network-independent tool which can be configured to suit the requirements of a variety of applications through a simple-to-use configuration facility. The facility allows the user to create and modify both domains/ranges and domain/range parameters. For each parameter in a training exemplar, ConSTrainer supports the definition of mutually supportive and mutually exclusive parameter sets. A powerful set of consistency and validation checks is also supported, including vector orthogonality, weight-sum checking, and re-ordering of the training dataset. This paper introduces the ConSTrainer toolkit and discusses its use in a non-trivial application for diagnostic decision support in histopathology.
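Two of the validation checks named above, vector orthogonality and weight-sum checking, can be sketched as follows. This is an illustrative sketch only, not ConSTrainer's actual implementation; the function names, the tolerance values, and the target sum of 1.0 are assumptions made for the example.

```python
# Illustrative sketch (not ConSTrainer's code): two dataset validation
# checks on training exemplars represented as lists of floats.

def are_orthogonal(u, v, tol=1e-9):
    """Vector orthogonality: the dot product of two exemplars is ~0."""
    return abs(sum(a * b for a, b in zip(u, v))) < tol

def weightsum_ok(vector, expected_sum=1.0, tol=1e-9):
    """Weight-sum check: an exemplar's parameter values sum to a target
    (here assumed to be 1.0)."""
    return abs(sum(vector) - expected_sum) < tol

exemplars = [[0.5, 0.5, 0.0], [0.0, 0.0, 1.0]]
print(are_orthogonal(exemplars[0], exemplars[1]))  # True: dot product is 0
print(weightsum_ok(exemplars[0]))                  # True: 0.5 + 0.5 + 0.0 = 1.0
```

A toolkit would typically run such predicates over every pair (or every exemplar) in the dataset and report the offending entries rather than a single boolean.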