Six subjects (ages 22–55 years) participated in the experiment. None had a history or clinical signs of vestibular, neurological, oculomotor, or cardiovascular abnormalities. All subjects gave their informed consent. The experimental procedure was approved by the Medical Ethics Committee of Erasmus University Medical Centre and adhered to the Declaration of Helsinki for research involving human subjects.
Stimuli were delivered with a motion platform (FCS-MOOG, Nieuw-Vennep, The Netherlands; see Fig. 1A, B) capable of generating angular and translational stimuli with six degrees of freedom. The platform is moved by six electromechanical actuators connected to a personal computer with dedicated control software, and sensors placed in the actuators continuously monitor the platform motion profile. Measured by these sensors, the device has <0.5-mm precision for linear and <0.05° precision for angular movements. Owing to the high resonance frequency of the device (>75 Hz), vibrations during stimulation were very small (<0.02°). A comparison between the stimulus signal sent to the platform and the output measured with a search coil fixed in space while the platform oscillated confirmed that the platform output closely matched the commanded sinusoid. During the experiments, the platform motion profile was monitored by the sensors in the actuators, reconstructed using inverse dynamics, and sent to the data collection computer at a rate of 100 Hz. To synchronize platform and eye movement data precisely, a laser beam was mounted at the back of the platform and projected onto a small photocell placed behind a 0.8-mm pinhole (response time, 10 µs). The output voltage of the photocell was sampled at 1 kHz, simultaneously with the eye movement data. In this way, the photocell signal provided a real-time indicator of platform motion onset with 1-ms accuracy. During offline analysis in Matlab (Mathworks, Natick, MA), the reconstructed motion profile of the platform, based on the sensor information from the actuators, was aligned with the onset of platform motion as indicated by the drop in voltage of the photocell.
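The alignment of the 100-Hz platform trace with the 1-kHz photocell signal can be sketched as follows. This is a minimal Python sketch of the offline step (the original analysis was done in Matlab); the function name, the voltage threshold, and the assumption that the reconstructed platform trace starts at motion onset are ours, not from the paper.

```python
import numpy as np

def align_platform_to_eye(platform_pos, photocell_v, fs_platform=100,
                          fs_eye=1000, v_drop=2.5):
    # Index of the first eye-data sample after motion onset, detected as
    # the first photocell sample whose voltage has dropped below v_drop
    # (assumed threshold; the laser beam leaves the pinhole at onset).
    onset = int(np.argmax(photocell_v < v_drop))

    # Time axes: the reconstructed platform trace is assumed to start at
    # motion onset; the eye timebase is shifted so that onset is t = 0.
    t_plat = np.arange(len(platform_pos)) / fs_platform
    t_eye = (np.arange(len(photocell_v)) - onset) / fs_eye

    # Linearly interpolate the 100-Hz platform profile onto the 1-kHz
    # eye-movement timebase; samples before onset are set to zero.
    aligned = np.interp(t_eye, t_plat, platform_pos, left=0.0)
    return onset, aligned
```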
Subjects were seated on a chair mounted at the center of the platform (Fig. 1A). The subject’s body was restrained with a four-point seatbelt as used in racing cars; the seatbelts were anchored to the base of the motion platform. A PVC cubic frame that supported the field coils surrounded the chair. The field coil system was adjustable in height so that the subject’s eyes were in the center of the magnetic field. The head was immobilized using an individually molded dental-impression bite board that was attached to the cubic frame via a rigid bar. A vacuum pillow folded around the neck and an annulus attached to the chair further secured the subject. In addition, two 3D sensors (Analog Devices), one measuring angular and one linear acceleration, were attached directly to the bite board to monitor spurious head movements during stimulation.
Eye movement recordings
Eye movements of both eyes were recorded with 3D scleral search coils (Skalar Medical, Delft, The Netherlands) using a standard 25-kHz two-field coil system based on the amplitude detection method of Robinson (model EMP3020, Skalar Medical). The coil signals were passed through an analogue low-pass filter with a cutoff frequency of 500 Hz, sampled online at 1,000 Hz with 16-bit precision, and stored to hard disk (CED system running Spike2 v6, Cambridge Electronic Design, Cambridge). Noise levels of the coil signals during fixation were 0.1° s−1. Coil signals were inspected offline for slippage by comparing the signals of the left and right eyes; no significant differences were found (p = 0.907). Eye rotations were defined in a head-fixed right-handed coordinate system (see Fig. 1C). In this system, from the subject’s point of view, a leftward rotation about the z-axis (yaw), a downward rotation about the y-axis (pitch), and a rightward rotation about the x-axis (roll) are defined as positive. The planes orthogonal to the x, y, and z rotation axes are, respectively, the roll, pitch, and yaw planes (Fig. 1D). Data were also analyzed by projecting them onto these three coordinate planes.
Prior to the experiments, torsion eye position measurement error due to non-orthogonality between the direction and torsion coils was corrected using the algorithm of Bruno and Van den Berg (1997). At the beginning of the experiment, the horizontal and vertical signals of both coils were individually calibrated by instructing the subject to successively fixate a series of five targets (a central target and targets at 10° left, right, up, and down) for 5 s each. Calibration targets were projected onto a translucent screen at 186-cm distance.
We determined the orientation of the head with respect to gravity and the head’s center of rotation. Head orientation was as close as possible to the position in which subjects felt upright. In this position, we measured Reid’s line (an imaginary line connecting the external meatus with the lower orbital canthus; Fig. 1C, left panel). In all subjects, Reid’s line made an angle of between 6° and 10° with the earth-horizontal. The center of rotation was defined as the intersection of the imaginary line through the external meatus and the horizontal line running from the nose to the back of the head. The x, y, and z offsets of this rotation center with respect to the default rotation center of the platform were determined and fed into the platform control computer, which then adjusted the center of rotation. Thus, all stimuli were delivered about the defined center of rotation of the head.
Whole body sinusoidal rotations were delivered about the three cardinal axes, i.e., the rostral–caudal or vertical axis (yaw), the interaural axis (pitch), and the nasal–occipital axis (roll), and about intermediate horizontal axes between roll and pitch. The orientation of the stimulus axis was incremented in steps of 22.5° azimuth. The stimulus frequency was 1 Hz, with a total duration of 14 s including 2 s of fade-in and 2 s of fade-out. The peak-to-peak amplitude of the sinusoidal rotation was 4° (peak acceleration 80° s−2). Sinusoidal stimuli were delivered in light and in darkness. In the light, subjects fixated a continuously lit visual target (a red LED, 2-mm diameter) located 177 cm in front of the subject at eye level, close to the primary position of the eyes. In the dark condition, the visual target was presented briefly (2 s) while the platform was stationary between two stimulations; subjects were instructed to keep fixating the imaginary location of this space-fixed target during sinusoidal stimulation after it had been switched off just prior to motion onset. In a control experiment in which we attached one search coil to the bite board and one coil to the forehead, we found that decoupling of the head relative to the platform was <0.03° (see Electronic supplementary material (ESM) Fig. S2).
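The sinusoidal stimulus profile described above can be written out directly. The following is a Python sketch assuming a linear fade-in/fade-out envelope (the paper does not state the fade shape); a 2° amplitude yields the stated 4° peak-to-peak and, at 1 Hz, a peak acceleration of (2π)² · 2 ≈ 79° s−2, consistent with the reported 80° s−2.

```python
import numpy as np

def sinusoidal_profile(f=1.0, amp_deg=2.0, dur=14.0, fade=2.0, fs=100):
    # Angular position (deg): 1-Hz sinusoid, 4 deg peak-to-peak
    # (amplitude 2 deg), 14 s total with 2-s fade-in and fade-out.
    # The linear-ramp envelope is an assumption, not from the paper.
    t = np.arange(int(dur * fs)) / fs
    envelope = np.minimum(1.0, np.minimum(t / fade, (dur - t) / fade))
    return t, amp_deg * envelope * np.sin(2 * np.pi * f * t)
```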
All subjects also underwent short-duration whole body transients in darkness, with the only visible stimulus a visual target located 177 cm in front of the subject at eye level. Each transient was repeated six times, delivered in random order and with random timing of motion onset (intervals varied between 2.5 and 3.5 s). The transient profile consisted of a constant acceleration of 100° s−2 during the first 100 ms, followed by a gradual linear decrease in acceleration. This stimulus produced a linear increase in velocity up to 10° s−1 after 100 ms and was precisely reproducible in amplitude and direction. The transients were well tolerated by our subjects. Decoupling of the head from the bite board was <0.03° during the first 100 ms of the transient. Peak velocity of the eye movements in response to these transients was 100 times above the noise level of the coil signals (Houben et al. 2006).
Coil signals were converted into Fick angles and then expressed as rotation vectors (Haustein 1989; Haslwanter and Moore 1995). From the fixation data of the target straight ahead, we determined the misalignment of the coil in the eye relative to the orthogonal primary magnetic field coils; signals were corrected for this offset by 3D counter-rotation. To express 3D eye movements in the velocity domain, we converted the rotation vector data back into angular velocity (ω). Before this conversion, we smoothed the data with a zero-phase forward and reverse digital filter (20-point Gaussian window, length 20 ms). The gain of each component and the 3D eye velocity gain were calculated by fitting sinusoids with a frequency equal to the platform frequency to the horizontal, vertical, and torsional angular velocity components. The gain of each component, defined as the ratio of peak eye component velocity to peak platform velocity, was calculated separately for each eye. Because left and right eye values were not significantly different (p = 0.907), we pooled the data of the two eyes.
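The fitting step reduces to a linear least-squares fit of a sinusoid at the known platform frequency. The following is a Python sketch (the original analysis was done in Matlab); the function name is ours.

```python
import numpy as np

def component_gain(omega_eye, t, f_stim, platform_peak_vel):
    # Fit A*sin(2*pi*f*t) + B*cos(2*pi*f*t) + C to one angular velocity
    # component; the fitted peak velocity is sqrt(A^2 + B^2), so phase
    # need not be estimated separately.
    X = np.column_stack([np.sin(2 * np.pi * f_stim * t),
                         np.cos(2 * np.pi * f_stim * t),
                         np.ones_like(t)])
    coef, *_ = np.linalg.lstsq(X, omega_eye, rcond=None)
    eye_peak_vel = np.hypot(coef[0], coef[1])
    # Gain: ratio of eye component peak velocity to platform peak velocity.
    return eye_peak_vel / platform_peak_vel
```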
The misalignment between the 3D eye velocity axis and the head velocity axis was calculated using the approach of Aw et al. (1996b). Using the scalar product of two vectors, the misalignment was calculated as the instantaneous angle in three dimensions between the inverse of the eye velocity axis and the head velocity axis. Because the calculated values only indicate the misalignment of the eye rotation axis as a cone around the head rotation axis, we also used gaze plane plots to determine the deviation of the eye rotation axis in the yaw, roll, and pitch planes (see Fig. 1D).
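The misalignment computation is the angle between two 3-D vectors. A Python sketch under the convention above (a perfectly compensatory eye velocity axis is opposite to the head axis, so the eye axis is inverted before taking the angle):

```python
import numpy as np

def misalignment_deg(omega_eye, omega_head):
    # Instantaneous 3-D angle between the inverse of the eye angular
    # velocity axis and the head angular velocity axis (after the scalar
    # product approach of Aw et al. 1996b). Inputs: 3-element vectors.
    e = -np.asarray(omega_eye, float)   # invert the eye rotation axis
    h = np.asarray(omega_head, float)
    cosang = np.dot(e, h) / (np.linalg.norm(e) * np.linalg.norm(h))
    # Clip guards against round-off just outside [-1, 1].
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
```

A perfectly compensatory response gives 0°, and an eye axis orthogonal to the head axis gives 90°.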
Because misalignments could be due to changes in horizontal eye position, we calculated the standard deviation around the mean eye position during each 14-s stimulation period. The variability of eye position around the imaginary fixation point during the dark period was too small to have an effect on misalignment.
All transients were inspected individually on the computer screen. When the subject made a blink or saccade during a transient, that trace was discarded manually; this happened on average in one out of six trials. Angular velocity components during the first 100 ms after motion onset were averaged in time bins of 20 ms and plotted as a function of platform velocity (Tabak et al. 1997b). Because the transients had constant acceleration during the first 100 ms, the slope of the linear regression line fitted through the time bins is a direct measure of eye velocity gain (Tabak et al. 1997a, b). Left and right eye gains were not significantly different (p = 0.907) and were averaged.
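The binned-regression gain estimate can be sketched as below; a Python sketch assuming the 1-kHz sampling rate stated for the eye data, with the function name ours.

```python
import numpy as np

def transient_gain(eye_vel, platform_vel, fs=1000, bin_ms=20, window_ms=100):
    # Average eye and platform velocity in 20-ms bins over the first
    # 100 ms of the transient, then take the slope of the regression of
    # eye velocity on platform velocity (after Tabak et al. 1997a, b).
    n_bin = int(bin_ms * fs / 1000)
    n = int(window_ms * fs / 1000)
    bins_eye = eye_vel[:n].reshape(-1, n_bin).mean(axis=1)
    bins_plat = platform_vel[:n].reshape(-1, n_bin).mean(axis=1)
    slope, _ = np.polyfit(bins_plat, bins_eye, 1)
    return slope
```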
The 3D angular velocity gain and misalignment for each azimuth orientation were compared to the gain and misalignment predicted from vector summation of the torsional and vertical components measured during roll and pitch (Crawford and Vilis 1991). From this prediction, it follows that the eye rotation axis aligns with the head rotation axis when the velocity gains for roll and pitch are equal; when the two gains differ, the eye rotation axis deviates from the stimulus axis, with a maximum deviation at 45° azimuth.
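The vector-summation prediction can be written out directly. A Python sketch in the horizontal plane (roll axis at 0° azimuth, pitch axis at 90°), with the function name ours: with equal roll and pitch gains the predicted misalignment is zero at every azimuth, and with unequal gains it peaks around 45°.

```python
import numpy as np

def predicted_misalignment(azimuth_deg, g_roll, g_pitch):
    # Vector summation: for a stimulus axis at azimuth a between roll
    # (0 deg) and pitch (90 deg), the predicted eye rotation axis is
    # (g_roll*cos a, g_pitch*sin a). Return its angle (deg) with the
    # stimulus axis (cos a, sin a).
    a = np.radians(azimuth_deg)
    eye = np.array([g_roll * np.cos(a), g_pitch * np.sin(a)])
    stim = np.array([np.cos(a), np.sin(a)])
    cosang = eye @ stim / (np.linalg.norm(eye) * np.linalg.norm(stim))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
```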
Repeated measures analysis of variance was used to test for significant differences in misalignment between conditions: sinusoidal stimulation in the light, sinusoidal stimulation in darkness, and transient stimulation.