Vision of the hand during reaching provides dynamic feedback that can be used to control movement. We investigated the relative contributions of feedback about the direction and distance of the hand relative to a target. Subjects made pointing movements in a 3-D virtual environment, in which a small sphere provided dynamic visual feedback about the position of their unseen fingertip. On a subset of trials, the position of the virtual fingertip was smoothly shifted by 2 cm during movement, either (1) in the direction of movement, which would require adjustments to the distance moved, or (2) orthogonal to the direction of movement, which would require adjustments to the direction moved. Despite not noticing the perturbations, subjects adjusted their movements to compensate for both types of visual shifts. Corrective responses to direction perturbations were observed within 117 ms, and response latencies were invariant to movement speed and perturbation onset time. Initial corrections to distance perturbations were smaller and appeared after longer delays of 130–200 ms, and both the speed and magnitude of responses were reduced for early onset perturbations. Simulations of a feedback control model that optimally integrates visual information over time show that the results can be explained by differences in the sensory noise levels in the visual dimensions relevant for direction and distance control.
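The abstract's final claim, that direction and distance corrections differ because of different sensory noise levels, can be illustrated with a minimal sketch. The snippet below is not the paper's model; it is a scalar Kalman filter tracking a constant 2 cm visual shift, with all noise values chosen for illustration. Higher observation noise (a stand-in for the distance dimension) yields a lower Kalman gain, and hence slower, smaller early corrections than low noise (the direction dimension):

```python
import numpy as np

def kalman_estimate(perturbation=2.0, obs_noise_sd=0.5, n_steps=50,
                    process_var=1e-4, seed=0):
    """Track a constant visual shift with a scalar Kalman filter.

    Higher obs_noise_sd lowers the Kalman gain, so early estimates
    (and any corrections driven by them) are slower and smaller.
    Parameter values are illustrative, not fitted to data.
    """
    rng = np.random.default_rng(seed)
    x_hat, p = 0.0, 1.0          # initial estimate and its variance
    estimates = []
    for _ in range(n_steps):
        p += process_var                                   # predict
        z = perturbation + rng.normal(0.0, obs_noise_sd)   # noisy observation
        k = p / (p + obs_noise_sd ** 2)                    # Kalman gain
        x_hat += k * (z - x_hat)                           # update estimate
        p *= (1.0 - k)                                     # update variance
        estimates.append(x_hat)
    return np.array(estimates)

# Low-noise ("direction") feedback is tracked faster than
# high-noise ("distance") feedback.
direction = kalman_estimate(obs_noise_sd=0.2)
distance = kalman_estimate(obs_noise_sd=1.0)
```

Comparing the two traces, the mean error over the first few samples is substantially larger for the high-noise condition, mirroring the delayed, attenuated distance corrections reported above.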
Keywords: Hand movements · Motor control · Sensorimotor · On-line control · Vision