Reference Work Entry

Encyclopedia of Neuroscience

pp 577-577

Catastrophic Interference


One of the problems that arises during the training of an artificial neural network is catastrophic interference, in which learning a new task overwrites previous learning. As the network's weights are adjusted to improve performance on the new task, performance on an earlier task that relied on the old set of weights degrades, often catastrophically. This has posed a challenge to the use of connectionist simulations as models of biological or psychological data.
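The effect can be reproduced with even the simplest of networks. The following sketch (an illustrative assumption, not drawn from the entry itself) trains a single-layer logistic unit on one task, then sequentially on a second task whose target is deliberately the opposite rule, so that the weight updates for the new task maximally disturb the old ones. Accuracy on the first task is measured before and after.

```python
import numpy as np

rng = np.random.default_rng(0)

def train(w, X, y, lr=0.5, epochs=200):
    # Plain gradient descent on the logistic loss; the same weight
    # vector is reused for whichever task is currently being learned.
    for _ in range(epochs):
        p = 1 / (1 + np.exp(-X @ w))
        w = w - lr * X.T @ (p - y) / len(y)
    return w

def accuracy(w, X, y):
    return float(np.mean((X @ w > 0) == (y > 0.5)))

X = rng.normal(size=(200, 2))
y_a = (X[:, 0] > 0).astype(float)   # task A: sign of the first feature
y_b = (X[:, 0] < 0).astype(float)   # task B: the opposite rule

w = np.zeros(2)
w = train(w, X, y_a)
acc_a_before = accuracy(w, X, y_a)  # high after learning task A

w = train(w, X, y_b)                # sequential training on task B
acc_a_after = accuracy(w, X, y_a)   # collapses: old weights overwritten
acc_b = accuracy(w, X, y_b)

print(acc_a_before, acc_a_after, acc_b)
```

Because the two tasks here are exactly anti-correlated, the interference is total; with partially overlapping tasks the degradation is milder but the same mechanism operates, since all tasks share one set of weights.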

