Abstract
Formal grammars have been successfully simulated by Artificial Neural Networks, which has established a new approach to the problem of Grammatical Inference. First, [Pollack,91], [Giles,92] and [Watrous,92] trained network architectures on positive samples, or on positive and negative samples, generated by regular grammars, so that the networks could accept or reject new strings. On the other hand, [Servan,88] and [Smith,89] used nets in which strings were fed character by character, so that the possible successors of each character were predicted. Later, [Servan,91] suggested that these networks could also predict the generation probability of each character in the strings generated by Stochastic Regular Grammars. Our present work shows empirical evidence supporting this suggestion.
Work supported in part by the Spanish CICYT, under grant TIC-1026/92-CO2.
Supported by a Spanish MEC postgraduate grant.
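The character-by-character prediction setup described in the abstract can be sketched as follows. This is a hypothetical minimal illustration, not the authors' implementation: the toy grammar S → aS (p=0.7) | b (p=0.3), the network sizes, and the one-step truncated gradient are all assumptions in the spirit of the simple recurrent networks of [Servan,88] and [Smith,89].

```python
import numpy as np

# Minimal Elman-style simple recurrent network trained to predict the
# next-symbol probabilities of strings drawn from the stochastic
# regular grammar  S -> a S (p=0.7) | b (p=0.3).  All sizes and
# hyperparameters are illustrative assumptions.
rng = np.random.default_rng(0)
SYMS = ['#', 'a', 'b']          # '#' marks start/end of a string
V, H = len(SYMS), 8             # alphabet size, hidden units
Wxh = rng.normal(0, 0.3, (H, V))
Whh = rng.normal(0, 0.3, (H, H))
Why = rng.normal(0, 0.3, (V, H))

def one_hot(i):
    v = np.zeros(V); v[i] = 1.0; return v

def softmax(z):
    e = np.exp(z - z.max()); return e / e.sum()

def sample_string():
    # Generate a^n b with P(continue) = 0.7 at each step.
    s = []
    while rng.random() < 0.7:
        s.append('a')
    s.append('b')
    return s

lr = 0.1
for _ in range(3000):
    seq = ['#'] + sample_string() + ['#']
    h = np.zeros(H)
    for t in range(len(seq) - 1):
        x = one_hot(SYMS.index(seq[t]))
        h_prev = h
        h = np.tanh(Wxh @ x + Whh @ h_prev)
        y = softmax(Why @ h)
        target = one_hot(SYMS.index(seq[t + 1]))
        # Softmax cross-entropy gradient; the recurrent weights get a
        # one-step truncated gradient, as in the original SRN scheme.
        dy = y - target
        dh = (Why.T @ dy) * (1 - h ** 2)
        Why -= lr * np.outer(dy, h)
        Wxh -= lr * np.outer(dh, x)
        Whh -= lr * np.outer(dh, h_prev)

# After consuming '#' and one 'a', the output should approximate the
# grammar's generation probabilities P(a) = 0.7, P(b) = 0.3.
h = np.tanh(Wxh @ one_hot(SYMS.index('#')))
h = np.tanh(Wxh @ one_hot(SYMS.index('a')) + Whh @ h)
probs = softmax(Why @ h)
print({s: round(float(p), 2) for s, p in zip(SYMS, probs)})
```

Because the cross-entropy objective is minimized when the output matches the empirical next-symbol frequencies, the trained softmax outputs approach the grammar's character generation probabilities, which is the behaviour the abstract sets out to verify.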
References
Continuous Speech Recognition: From Hidden Markov Models to Neural Networks. Bourlard,H. EUSIPCO'92, vol. 1, pp. 63–70. 1992.
Finding Structure in Time. Elman,J.L. Technical Report 8801. Center for Research in Language. University of California. La Jolla. 1988.
Learning and Extracting Finite State Automata with Second-Order Recurrent Neural Networks. Giles,C.L. et al. Neural Computation, vol. 4, pp. 393–405. 1992.
Serial order: A parallel distributed processing approach. Jordan,M.I. Technical Report No. 8604. Institute of Cognitive Science. University of California. San Diego. 1988.
Computation: Finite and Infinite Machines, Chap. 3.5. Minsky,M.L. Prentice-Hall, Englewood Cliffs, New Jersey. 1967.
Generalization of BackPropagation to Recurrent and Higher Order Neural Networks. Pineda,F.J. Neural Information Processing Systems. Ed. D.Z. Anderson. American Institute of Physics. New York. 1988.
The Induction of Dynamical Recognizers. Pollack,J.B. Machine Learning, vol. 7, pp. 227–252. 1991.
Learning sequential structure in simple recurrent networks. Rumelhart,D.E. Hinton,G. and Williams,R. Parallel distributed processing: Explorations in the microstructure of cognition, vol. 1. Ed. Rumelhart,D.E. McClelland,J.L. and the PDP Research Group. MIT Press. Cambridge. 1986.
Encoding sequential structure in simple recurrent networks. Servan-Schreiber,D. Cleeremans,A. and McClelland,J.L. Technical Report CMU-CS-183. School of Computer Science. Carnegie Mellon University. Pittsburgh, PA. 1988.
Graded State Machines: The Representation of Temporal Contingencies in Simple Recurrent Networks. Servan-Schreiber,D. Cleeremans,A. and McClelland,J.L. Machine Learning, vol. 7, pp. 161–193. 1991.
Learning Sequential Structure with the Real-Time Recurrent Learning Algorithm. Smith,A.W. and Zipser,D. International Journal of Neural Systems, vol. 1, no. 2, pp. 125–131. 1989.
Induction of Finite-State Languages Using Second-Order Recurrent Networks. Watrous,R.L. and Kuhn,G.M. Neural Computation, vol. 4, pp. 406–414. 1992.
Experimental Analysis of the Real-time Recurrent Learning Algorithm. Williams,R.J. and Zipser,D. Connection Science, vol. 1, no.1, pp. 87–111. 1989.
Copyright information
© 1993 Springer-Verlag Berlin Heidelberg
Cite this paper
Castaño, M.A., Casacuberta, F., Vidal, E. (1993). Simulation of stochastic regular grammars through simple recurrent networks. In: Mira, J., Cabestany, J., Prieto, A. (eds) New Trends in Neural Computation. IWANN 1993. Lecture Notes in Computer Science, vol 686. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-56798-4_149
Print ISBN: 978-3-540-56798-1
Online ISBN: 978-3-540-47741-9