A Mirror Neuron System for Syntax Acquisition

Abstract

We investigate the use of a connectionist model of a mirror neuron cortical network for a context-free syntax acquisition task. A finite-state representation of the context-free grammar is learned by an implicit knowledge system (IKS), modelled by a connectionist network. A mirror neuron system (MNS), whose evolutionary pedigree suggests adaptation for goal-directed sequential processing, is used to track embedded recursions in the learned finite-state model of the grammar. The mirror system modifies the output of the IKS according to the depth of embedding; reciprocally, the IKS updates the MNS as natural ‘goals’ occur within a sequence during sentence production. This solves the computationally hard problem of inferring contexts from sequential input.
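The division of labour described above can be illustrated with a minimal sketch, under assumptions not taken from the paper: a finite-state transition table stands in for the IKS, and an explicit embedding-depth counter stands in for the MNS. The example grammar (the center-embedded language a^n b^n, generated by S → a S b | a b) and all names are hypothetical; the paper's connectionist implementation is not shown.

```python
# Hypothetical sketch: FSM (IKS stand-in) plus a depth counter (MNS stand-in).
# The grammar a^n b^n and all identifiers here are illustrative assumptions.

# Finite-state approximation of S -> a S b | a b.
# The FSM alone accepts the regular superset a+ b+.
TRANSITIONS = {
    ("A", "a"): "A",   # keep reading 'a's
    ("A", "b"): "B",   # switch to reading 'b's
    ("B", "b"): "B",   # keep reading 'b's
}

def recognise(sentence):
    """Return True iff sentence is in a^n b^n (n >= 1).

    The depth counter plays the MNS role: it tracks how deeply
    the recursion is embedded and overrides the FSM's acceptance
    when the opening and closing depths fail to match.
    """
    state, depth = "A", 0
    for symbol in sentence:
        nxt = TRANSITIONS.get((state, symbol))
        if nxt is None:          # FSM rejects the transition outright
            return False
        if symbol == "a":
            depth += 1           # entering one level of embedding
        else:
            depth -= 1           # a 'goal' closes one level
            if depth < 0:
                return False
        state = nxt
    return state == "B" and depth == 0
```

The FSM by itself would wrongly accept strings such as "aab"; only the counter's veto restricts acceptance to matched embeddings, mirroring how the MNS modifies the IKS output by depth of embedding.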