Abstract
We survey results concerning a new stochastic network we have developed [1–7], initially motivated by neural network modelling [1] and, equivalently, by queueing networks with positive and negative customers [2, 3]. Indeed, it is well known that signals in neural networks are carried by impulses or action potentials, which travel much like customers in a queueing network. We call this model a G-network because it serves as a unifying basis for diverse areas of stochastic modelling: queueing networks, computer networks, computer system performance and neural networks. In its simplest version, “negative” and “positive” signals or customers circulate among a finite set of units, modelling the inhibitory and excitatory signals of a neural network, or the negative and positive customers of a queueing network. Signals can arrive either from other units or from the outside world. Positive signals accumulate at the input of each unit and constitute its signal potential. The state of each unit or neuron is its signal potential (equivalent to a queue length), while the network state is the vector of signal potentials at each neuron. If its potential is positive, a unit or neuron fires and sends signals to the other neurons or to the outside world; as it does so, its signal potential is depleted. In the Markovian case, this model has product form, i.e. the steady-state probability distribution of its potential vector is the product of the marginal probabilities of the potential at each neuron. The signal flow equations of the network, which describe the rate at which positive or negative signals arrive at each neuron, are non-linear. We discuss the relationship between this model and the usual connectionist (formal) model of neural networks, and present applications to combinatorial optimization and to image texture processing.
Extensions of the model to the case of “multiple signal classes”, and to “networks with triggered customer motion” are presented. We also examine the general stability conditions which guarantee that the network has a well-defined steady-state behaviour.
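The nonlinear signal flow equations and the product-form solution mentioned in the abstract can be illustrated numerically. The sketch below, with made-up rates and routing probabilities for a three-neuron network, iterates the standard fixed-point equations of the random neural network (q_i = λ⁺_i / (r_i + λ⁻_i), with λ⁺ and λ⁻ the total positive and negative signal arrival rates); the specific parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Fixed-point iteration for the signal flow equations of a small
# G-network (random neural network). For each neuron i:
#   lambda_plus[i]  = Lambda[i] + sum_j q[j] * r[j] * P_plus[j, i]
#   lambda_minus[i] = lam[i]    + sum_j q[j] * r[j] * P_minus[j, i]
#   q[i]            = lambda_plus[i] / (r[i] + lambda_minus[i])
# When every q[i] < 1, the steady-state distribution has product form:
#   P(k) = prod_i (1 - q[i]) * q[i] ** k[i]

Lambda = np.array([0.5, 0.3, 0.2])   # exogenous positive-signal rates (illustrative)
lam    = np.array([0.1, 0.1, 0.1])   # exogenous negative-signal rates (illustrative)
r      = np.array([1.0, 1.0, 1.0])   # firing (service) rates
# Routing probabilities: P_plus[j, i] excitatory, P_minus[j, i] inhibitory.
P_plus  = np.array([[0.0, 0.3, 0.2],
                    [0.2, 0.0, 0.3],
                    [0.1, 0.2, 0.0]])
P_minus = np.array([[0.0, 0.1, 0.1],
                    [0.1, 0.0, 0.1],
                    [0.1, 0.1, 0.0]])

q = np.zeros(3)
for _ in range(200):                 # plain fixed-point iteration
    lp = Lambda + (q * r) @ P_plus   # total positive arrival rates
    lm = lam    + (q * r) @ P_minus  # total negative arrival rates
    q_new = lp / (r + lm)
    if np.max(np.abs(q_new - q)) < 1e-12:
        q = q_new
        break
    q = q_new

print("fixed point q =", q)
# The network is stable iff every q[i] < 1; the product form then gives,
# for example, the probability that all signal potentials are zero:
print("P(all potentials zero) =", np.prod(1 - q))
```

Note that, because the equations are nonlinear, the fixed point is computed iteratively rather than by solving a linear traffic-equation system as in classical Jackson networks; stability reduces to checking q_i < 1 at the fixed point.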
References
E. Gelenbe, Random neural networks with negative and positive signals and product form solution, Neural Comp. 1 (1989) 502–510.
E. Gelenbe, Queueing networks with negative and positive customers and product form solution, J. App. Prob. 28 (1991) 656–663.
E. Gelenbe, P. Glynn and K. Sigman, Queues with negative arrivals, J. App. Prob. 28 (1991) 245–250.
J.M. Fourneau and E. Gelenbe, G-networks with multiple classes of signals, Proc. ORSA Computer Science Technical Committee Conf., Williamsburg, VA (Pergamon Press, 1992).
E. Gelenbe and R. Schassberger, Stability of G-networks, to appear in: Probability and its Applications in the Engineering and Informational Sciences (1992).
E. Gelenbe, G-networks: Negative customers with batch removal, submitted to Perf. Eval. (1992).
E. Gelenbe, G-networks with triggered customer movement, to appear in J. App. Prob. (1993).
E. Gelenbe and F. Batty, Minimum graph covering with the random neural network model, Proc. ORSA Computer Science Technical Committee Conf., Williamsburg, VA (Pergamon Press, 1992).
D.E. Rumelhart, J.L. McClelland and the PDP Research Group, Parallel Distributed Processing, Vols. 1 and 2 (Bradford Books and MIT Press, Cambridge, MA, 1986).
E.R. Kandel and J.H. Schwartz, Principles of Neural Science (Elsevier, Amsterdam, 1985).
C.B. Garcia and W.I. Zangwill, Pathways to Solutions, Fixed Points, and Equilibria (Prentice-Hall, Englewood Cliffs, NJ, 1981).
T.J. Sejnowski, Skeleton filters in the brain, in: Parallel Models of Associative Memory, eds. G.E. Hinton and J.A. Anderson (Lawrence Erlbaum Associates, Hillsdale, NJ, 1981).
J.J. Hopfield and D.W. Tank, Neural computation of decisions in optimization problems, Biol. Cyber. 52 (1985) 141–152.
W. Henderson, B.S. Northcote and P.G. Taylor, Geometric equilibrium for queues with interactive batch departures, private communication (August 1992).
J.G. Kemeny and J.L. Snell, Finite Markov Chains (Van Nostrand, Princeton, NJ, 1965).
L. Hérault and J.J. Niez, Neural networks and combinatorial optimization: a study of NP-complete graph problems, in: Neural Networks: Advances and Applications, ed. E. Gelenbe (Elsevier, North-Holland, Amsterdam, 1991).
V. Atalay, E. Gelenbe and N. Yalabik, Image texture generation with the random neural network model, Int. Conf. on Artificial Neural Networks (ICANN-91), Helsinki (June 1991).
E. Gelenbe, A. Stafylopatis and A. Likas, Associative memory operation of the random neural network model, Int. Conf. on Artificial Neural Networks (ICANN-91), Helsinki (June 1991).
M. Mokhtari, Application of the random neural network model to the recognition of typed images, to appear in Int. J. Pattern Rec. Art. Int. (IJPRAI).
G. Cross and A. Jain, Markov random field texture models, IEEE Trans. PAMI-5 (Jan. 1983).
S. Lakshmanan and H. Derin, Simultaneous parameter estimation and segmentation of Gibbs random fields using simulated annealing, IEEE Trans. PAMI-11 (August 1989).
L. Onural and M.I. Gürelli, Generation and parameter estimation of Markov random field textures by highly parallel networks, in: From Pixels to Features 2, ESPRIT BRA Workshop on Parallelism in Image Processing, Bonas, France (August 1990).
Gelenbe, E. G-networks: a unifying model for neural and queueing networks. Ann Oper Res 48, 433–461 (1994). https://doi.org/10.1007/BF02033314