Boltzmann machines are usually defined as neural networks in which the input–output relationship is stochastic rather than deterministic, as it is in the original Hopfield network; that is, they are normally defined as stochastic versions of the Hopfield model or of other attractor neural networks. A main difference between the two is that, whereas in a Hopfield network the deterministic dynamics drives the state of the system downhill, toward the stable minima of an energy function associated with some stored information content, in a Boltzmann machine such prescribed states cannot be reached exactly because of stochastic fluctuations. The steady state of a Boltzmann machine is instead characterized by an equilibrium probability distribution over states, given as a function of the energy of each state by the Boltzmann distribution. The inherent...
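The relationship between state energies and equilibrium probabilities can be made concrete for a network small enough to enumerate exhaustively. The following is a minimal sketch, not taken from the source: the weights, biases, and temperature are illustrative assumptions, and the energy is the standard Hopfield/Boltzmann form E(s) = -Σ_{i<j} w_ij s_i s_j - Σ_i b_i s_i, with p(s) ∝ exp(-E(s)/T).

```python
import itertools
import math

# Assumed example parameters (not from the source): a 3-unit network
# with symmetric couplings, small biases, and unit temperature.
W = {(0, 1): 1.0, (0, 2): -0.5, (1, 2): 0.5}  # symmetric weights w_ij, i < j
b = [0.1, -0.2, 0.0]                          # biases b_i
T = 1.0                                       # temperature

def energy(s):
    """Hopfield/Boltzmann energy: E(s) = -sum_{i<j} w_ij s_i s_j - sum_i b_i s_i."""
    e = -sum(w * s[i] * s[j] for (i, j), w in W.items())
    return e - sum(bi * si for bi, si in zip(b, s))

# Enumerate all 2^3 spin configurations s_i in {-1, +1}.
states = list(itertools.product([-1, +1], repeat=3))

# Boltzmann distribution: p(s) = exp(-E(s)/T) / Z, with Z the partition function.
boltzmann_factors = [math.exp(-energy(s) / T) for s in states]
Z = sum(boltzmann_factors)
p = [f / Z for f in boltzmann_factors]

# Low-energy states (the Hopfield minima) are the most probable, but every
# state retains nonzero probability: fluctuations keep the system from
# settling deterministically into a minimum.
for s, prob in sorted(zip(states, p), key=lambda pair: -pair[1]):
    print(s, round(prob, 3))
```

At low temperature T the distribution concentrates sharply on the energy minima, recovering Hopfield-like behavior; at high T it flattens toward uniform, which is the usual way to interpret the stochastic dynamics as a tunable relaxation of the deterministic model.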