1 Introduction

Mechanoreceptors in the skin are heterogeneously distributed, and the sampling of neural signals in the brain may differ depending on the body part. Given this heterogeneity, it may be challenging for the brain to robustly represent stimuli presented to different body sites. In this study, the perception of the orientation and direction of stimuli presented on the hand and on the arm was investigated. An example of perceptual distortion of tactile space at the periphery is introduced, in which the orientation (trajectory) of stimuli on the forearm is systematically misperceived.

Interesting discrepancies between perceived spatial representations and physical space have been reported. The perceived extent of stimuli (i.e., the distance between two points of contact) shrinks along the proximodistal axis [1,2,3], and the perceived location (i.e., the exact location of contact) shifts toward anchor points such as the wrist and elbow [2, 4]. Still, haptic-space representation, and how it changes across the body, remains poorly understood. In particular, whether it can be ascribed to somatotopic mapping (i.e., receptor distribution and receptive-field size) remains under discussion.

Heterogeneous receptor density is not a problem unique to the skin. Another 2D sensor array, the retina, also has a heterogeneous sensor distribution, and the computation of most basic visual features differs between central and peripheral vision. For example, the signal-detection threshold degrades from central towards peripheral vision; however, this degradation is relatively weak for detecting flickering and moving signals [5]. In the case of touch, by contrast, differences in the perception of motion and orientation across stimulation sites have been only sparingly studied [6].

In this study, the following two questions are addressed: (i) whether the direction of a simple moving stimulus that can be easily captured and tracked by eye (and presumably by hand) can be discriminated by the arm and, if not, (ii) how the arm differs from the hand; that is, whether the difference can be ascribed to a difference in receptor distributions, and in which reference frame the difference occurs.

2 Method

As shown in Fig. 1A, tactile stimuli were presented to subjects by a piezoelectric braille display (hereafter, stimulator) (Dot-view2, KGS, Japan) with an array of pins 1.3 mm in diameter and an inter-pin distance of 2.4 mm. Each pin can be switched independently to either the “on” position (maximum 0.7-mm normal displacement, or less when damped by the contacting hand) or the “off” position (no displacement), and the status of the pins (“on” or “off”) was updated every 100 ms. The subjects touched the stimuli with the volar surface of their left hand or forearm (Fig. 1B). Their view of the display was occluded by a black cardboard plate, and the subjects wore earplugs to mask the noise made by the stimulator.
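
As an illustration of how such stimuli can be driven, the sketch below represents each pin frame as a boolean array and pushes frames at the 100-ms update rate. It is a minimal sketch: `send_frame` is a hypothetical placeholder, since the actual Dot-view2 driver API is not described in the text.

```python
import time

import numpy as np

PIN_PITCH_MM = 2.4   # inter-pin distance of the display
FRAME_MS = 100       # pin states were updated every 100 ms


def make_frame(rows=32, cols=32):
    """Return an all-'off' pin frame (True = pin raised)."""
    return np.zeros((rows, cols), dtype=bool)


def send_frame(frame):
    """Hypothetical placeholder for the vendor-specific call that raises/lowers pins."""
    pass  # the actual Dot-view2 driver API is not described in the text


def play(frames):
    """Present a sequence of pin frames at the 100-ms update rate."""
    for frame in frames:
        send_frame(frame)
        time.sleep(FRAME_MS / 1000.0)
```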

Fig. 1. (A) Braille-type stimulator. (B) View of the setup used in the experiments. The blue box represents the braille stimulator. (C) Subjects reported perceived direction of the stimulus trajectory in the direction-judgment experiment by pressing two keys (e.g., ‘up’ and ‘left’ for the illustrated stimuli at the bottom of the diagram). (D) A response map for the orientation-description experiment was presented on a screen in front of the subjects. The subjects reported perceived orientation (both edges) of the stimulus trajectory with respect to the stimulator surface by pressing two keys.

In the direction-judgment experiment (Fig. 1C), the stimulus was a single dot moved in one direction at 50 mm/s for two seconds. The dot advanced every 0.1 to 0.2 s; this variation was unavoidable owing to a characteristic of the display. The dot moved diagonally, upward or downward and to the right or left (LU, LD, RU, and RD in Fig. 1C). Note that a new starting point of the dot was chosen in every trial, and when the dot reached the edge of the stimulus area, it reappeared from the opposite edge. The length of the trajectory varied across trials, but in all trials the dot moved on the same trajectory for more than 5 cm (i.e., half the diagonal of the stimulus area). The stimuli were presented within 32 × 32 pins for the “hand” and “arm” conditions, within 8 × 8 pins for the “s_hand” condition, and within 45 × 32 pins for the “l_arm” condition. The dot stimuli moved parallel to the diagonals of these stimulation areas. The subjects were asked to answer two two-alternative forced choices (2-AFCs): whether the stimulus moved upward or downward (Q1) and leftward or rightward (Q2). After each response, feedback was given to the subjects by a beep. Ten subjects participated.
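
The moving-dot stimulus can be summarised, in simplified form, as follows. The sketch below assumes one raised pin per frame and unit diagonal steps; the 0.1–0.2-s step timing used to approximate 50 mm/s, the randomised starting point, and the condition-specific area sizes are omitted, and the mapping of (row, column) signs to LU/LD/RU/RD is an assumption.

```python
import numpy as np


def diagonal_dot_frames(rows=32, cols=32, direction=(1, 1), start=(0, 0), n_steps=20):
    """Single-dot frames that step along a diagonal and wrap at the edges.

    direction is (d_row, d_col) with entries in {-1, +1}; which pair corresponds
    to LU/LD/RU/RD depends on the display's row/column convention (an assumption here).
    """
    frames = []
    r, c = start
    for _ in range(n_steps):
        frame = np.zeros((rows, cols), dtype=bool)
        frame[r % rows, c % cols] = True  # wrap-around: re-enter from the opposite edge
        frames.append(frame)
        r += direction[0]
        c += direction[1]
    return frames
```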

In the orientation-description experiment, the stimuli were 12 aligned dots, presented within a circular area 32 pins (74.4 mm) in diameter, and their orientation was one of eight possibilities (Fig. 1D). The dots were presented sequentially in one direction (“move+” condition), sequentially in the opposite direction (“move−”), in random order (“shuffle”), or all at once (“static”), within 2 s. Each dot consisted of four to six “on” pins, partially overlapping spatially with adjacent dots, and appeared for 0.1 to 0.2 s; these variations were unavoidable owing to characteristics of the display. The subjects were asked to report the perceived orientation of the stimuli with respect to the stimulator surface by pressing two keys on the keyboard according to the response mapping (Fig. 1D) presented on a screen in front of them (e.g., the subject pressed ‘4’ and ‘R’ when they perceived a vertical orientation). Since it was unclear whether the line stimuli were perceived symmetrically, both edges of the perceived shape (rather than a single orientation) were recorded. The subjects knew that the presented stimuli passed through the centre of the stimulus area, but they did not know that the stimuli were straight lines. No feedback signal was provided. Three different posture conditions were tested with different groups of ten subjects: hands/arms oriented straight ahead (“normal”), rotated outward (“divergent”), or rotated inward (“convergent”), as shown in Fig. 4.
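
A simplified construction of these stimuli is sketched below: the centres of 12 dots are placed on a line through the middle of the circular area at the requested orientation, and the four sequence conditions only change the presentation order. Dot size (four to six pins), spatial overlap, and timing are omitted, and the row/column convention for the angle is an assumption.

```python
import numpy as np


def aligned_dot_centres(angle_deg, n_dots=12, diameter_pins=32):
    """Centre pins of n_dots equally spaced on a line through the centre of a
    circular area (spacing/overlap simplified; row/column convention assumed)."""
    radius = (diameter_pins - 1) / 2.0
    centre = np.array([radius, radius])
    theta = np.deg2rad(angle_deg)
    direction = np.array([np.sin(theta), np.cos(theta)])
    offsets = np.linspace(-radius, radius, n_dots)
    return np.rint(centre + np.outer(offsets, direction)).astype(int)


def presentation_order(dots, condition, rng=None):
    """Split the dots into per-frame groups for the four sequence conditions."""
    rng = np.random.default_rng() if rng is None else rng
    if condition == "move+":
        order = range(len(dots))                      # one by one, forward
    elif condition == "move-":
        order = range(len(dots) - 1, -1, -1)          # one by one, reverse
    elif condition == "shuffle":
        order = rng.permutation(len(dots))            # one by one, random order
    elif condition == "static":
        return [dots]                                 # all dots in a single frame
    else:
        raise ValueError(condition)
    return [dots[i:i + 1] for i in order]
```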

3 Results

3.1 Direction-Judgment Experiment

To investigate perceived direction through the skin, the moving-dot stimuli were presented obliquely on the volar surface of the hand (palm) and on the arm (forearm) of each subject by the braille stimulator. When a dot reached the outer boundary of the presentation area, it reappeared at the opposite end of its line of motion and continued to move in the same direction along that line (Fig. 1C). This repetitive motion could induce ambiguity about the direction of motion, although the direction could easily be reported visually (preliminary observation). Note that a dot moved on the same trajectory for a distance longer than the two-point discrimination threshold of the forearm [7].

As shown in Fig. 2A, the averaged performance of ten subjects was above chance level under the “hand” condition, although it was slightly lower when the dot moved to the upper right (RU). Meanwhile, under the “arm” condition, the observed pattern of responses differed dramatically from that under the “hand” condition. Contrary to intuition, performance not only dropped but was biased in a particular direction. The subjects tended to report the direction from upper left to lower right instead of that from upper right to lower left, regardless of the physically presented stimuli (see the schematic illustrations in Fig. 2A). Note that the observed patterns do not simply reflect a response (“key-pressing”) bias, since the subjects pressed “L” or “R” and “U” or “D” with roughly equal probability.

Fig. 2. Results of the direction-judgment experiment. (A, B) Averaged responses of 10 subjects. In the confusion matrices, columns represent the presented direction of the stimuli and rows represent the perceived direction. Diagonal lines represent correct responses. In the schematic pattern of the observed trend, blue arrows represent correct responses and red ones represent incorrect responses. (C) Averaged correct rates of 10 subjects for each 2-AFC. Error bars represent 95% confidence intervals. The table lists the main-effect group comparisons (Ryan’s method, α = .05) of the ANOVA. (Color figure online)

Two remaining conditions were designed to test whether mechanoreceptor distribution or receptive-field size could explain this apparently odd anisotropic representation of perceived orientation on the arm. In the “s_hand” condition, the stimulus area was reduced to one-quarter of that under the “arm” condition to compensate for the difference in two-point discrimination thresholds between the palm and the forearm [7]. Although performance dropped, it did not show a directional bias similar to that under the “arm” condition (e.g., the RU stimuli were perceived roughly equally often as moving in each direction). Since previous literature reported that distances feel longer along the mediolateral axis than along the proximodistal axis of the arm [1,2,3,4], the same experiment was conducted under the “l_arm” condition, in which the stimulus area was elongated 1.4 times in the vertical direction only. Although direction-discrimination performance slightly improved, it showed a directional bias similar to that under the “arm” condition. Subjects’ reports (Fig. 2C) were entered into a two-way repeated-measures ANOVA with stimulus location (four levels) and question (two 2-AFCs) as factors. The ANOVA indicated that both main effects and the interaction were significant (F(3, 27) = 19.9, p < 0.0001 for stimulus location; F(1, 9) = 67.8, p < 0.0001 for 2-AFC; F(3, 27) = 32.2, p < 0.0001 for the interaction). Main-effect group comparisons (Ryan’s method, α = .05) revealed significant differences between all stimulus-location pairs except the “arm” and “l_arm” conditions (Fig. 2C). These statistical tests suggest that neither the difference in two-point discrimination thresholds nor the difference in tactile space can explain the difference between perceived orientation on the hand and that on the arm.
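
A minimal sketch of this analysis in Python (statsmodels) is shown below, assuming a long-format table with one mean correct rate per subject, stimulus location, and question; the column names are illustrative, and Ryan’s post-hoc procedure is not included because statsmodels does not provide it.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM


def run_rm_anova(df: pd.DataFrame):
    """Two-way repeated-measures ANOVA on per-subject mean correct rates.

    Expects long-format columns (names are illustrative): 'subject',
    'location' (hand / s_hand / arm / l_arm), 'question' (up-down / left-right),
    and 'correct'.
    """
    return AnovaRM(df, depvar="correct", subject="subject",
                   within=["location", "question"]).fit()

# Pairwise comparisons with Ryan's method are not provided by statsmodels and
# would need to be implemented separately.
```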

3.2 Orientation-Description Experiment

Since an anisotropic distortion of perceived orientation on the arm was unexpectedly observed, the experimental task was changed and this phenomenon was investigated in more detail. Subjects were asked to directly indicate the orientation of the stimuli with respect to the stimulator surface. The stimuli were dots aligned in one of eight possible orientations (Fig. 1D), presented with four different dot sequences: dots appearing one by one in one direction (“move+” condition), one by one in the opposite direction (“move−”), in random order (“shuffle”), or all at once (“static”).

As shown in Fig. 3, ten subjects achieved good orientation-description performance when the aligned-dot stimuli were presented on the hand: the coloured radar charts have well-defined peaks along the black dashed line. The calculated bias and variance are small regardless of stimulus orientation or dot sequence (“move±,” “shuffle,” or “static”). Meanwhile, performance degraded when the stimuli were presented on the arm: both bias and variance became large for stimuli presented around 135° (enclosed by the yellow dotted line in the figure). The magnitude of the bias was similar across dot sequences, but variance was smaller under the static condition (purple line in the graph). Anisotropic distortion of perceived orientation on the arm was also observed in this experiment: perceived orientation on the arm was biased inward (i.e., clockwise for the left hand/arm) when the stimulus angle was 135°, whereas it was biased neither outward nor inward when the stimulus angle was 45°. Note that the 135° stimuli in this experiment and the RU and LD stimuli in the direction-judgment experiment were not identical, but they had the same orientation, and in both cases the subjects could not properly report the orientation of the stimuli. In addition, no particular distortion pattern depending on the area of stimulus presentation (e.g., stimuli on the lower half of the forearm being more biased than those on the upper half) was observed; rather, the response patterns roughly retained a linearly symmetric shape. According to our pilot test with a smaller number of subjects, this phenomenon may show “body-central symmetry”: perceived orientation on the right arm was biased inward when the stimulus angle was 45°, but it was not biased when the stimulus angle was 135°.
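
The bias and variance plotted in Fig. 3 can be computed, for instance, with circular statistics for axial data; the sketch below uses the common angle-doubling convention (so that 0° and 180° coincide). The exact formulas behind the figure are not given in the text, so this is an illustrative assumption.

```python
import numpy as np


def orientation_bias_variance(reported_deg, presented_deg):
    """Bias and variance for axial (180-deg periodic) orientation reports.

    Uses the common angle-doubling convention for axial data; the exact
    definitions behind Fig. 3 are not given in the text.
    """
    err = np.deg2rad(2.0 * (np.asarray(reported_deg, dtype=float) - presented_deg))
    c, s = np.cos(err).mean(), np.sin(err).mean()
    bias_deg = np.rad2deg(np.arctan2(s, c)) / 2.0  # mean signed error in orientation units
    variance = 1.0 - np.hypot(c, s)                # circular variance in [0, 1]
    return bias_deg, variance
```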

Fig. 3. Results of the orientation-description experiment. Each column represents the results obtained with a stimulus presented at 0 to 157.5°. A stimulus at 0° was defined with respect to the front edge of the braille display placed parallel to the table. Black dashed lines represent the orientation of the presented stimuli, and each coloured line represents the proportion of reported orientations under each condition. Bias represents the mean discrepancy between reported and veridical orientations of the stimuli, while variance represents variability for each stimulus. Error bars represent 95% CI. (Color figure online)

One unique characteristic of the haptic modality is its multiple reference frames. People can easily change their hand/arm posture, so the brain has to remap tactile input signals from skin (somatotopic) coordinates into environmental (spatiotopic or allocentric) coordinates. To determine in which reference frame the observed orientation distortion on the arm occurred, the same orientation-description task was repeated with different hand/arm postures. The subjects were asked to report perceived orientation in the environmental (not skin) reference frame. The stimulator stayed at a constant location with respect to the external world. If the distortion occurred in the environmental (i.e., eye/body-centred) reference frame, the reported orientation pattern would be similar regardless of hand/arm posture. If, on the other hand, the distortion occurred in the skin reference frame, the pattern would shift with posture. In a pilot test, it was observed that reported orientations became obscure (i.e., the radar charts did not show sharp peaks) under the divergent and convergent posture conditions. Thus, this experiment was conducted only with the static stimuli, for which the lowest variance of reported orientation was observed under the normal posture.
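
The skin-frame prediction can be stated compactly: if the distortion is anchored to skin coordinates, rotating the arm by some angle should shift the distorted orientations by the same angle in the environmental frame. The helper below is only illustrative; the rotation angles and sign conventions for the divergent and convergent postures are assumptions, not values reported in the text.

```python
def skin_to_environment(orientation_skin_deg, arm_rotation_deg):
    """Environmental-frame orientation predicted for a skin-anchored distortion.

    Orientations are axial, so the result wraps at 180 deg. The sign of
    arm_rotation_deg for the divergent/convergent postures is not specified
    in the text and is left to the caller.
    """
    return (orientation_skin_deg + arm_rotation_deg) % 180.0


# Example: a skin-anchored peak at 135 deg would appear near 90 deg after a
# -45 deg arm rotation and near 0 deg (180 mod 180) after a +45 deg rotation.
print(skin_to_environment(135.0, -45.0))  # 90.0
print(skin_to_environment(135.0, 45.0))   # 0.0
```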

Reported orientation was distorted under the “divergent” and “convergent” conditions even when the stimuli were presented on the hand (Fig. 4), and this finding is in line with that of a previous study that conducted an orientation-matching task with aluminium bars [8]. The observed variances in these conditions were higher than that under the “normal” condition, suggesting a higher level of task difficulty under the former conditions. The bias indicates an influence of the skin reference frame on perceived orientation. Under the divergent condition, an orientation in the environmental reference frame is shifted clockwise compared with that in the skin reference frame, and indeed the reported orientation shifted clockwise. In contrast, the reported orientation shifted counter-clockwise under the convergent condition. The bias for each posture condition was almost constant regardless of stimulus orientation. On the other hand, when the stimuli were presented on the arm, the bias of the reported orientations varied with stimulus orientation under all posture conditions. The baselines (i.e., performances averaged across all orientations) differed according to posture condition, and this difference seems consistent with the biases observed under the hand condition. The observed patterns of bias seem roughly consistent with the hypothesis that the distortion on the arm occurred in skin coordinates, since it peaked around 90° under the divergent condition and around 0° under the convergent condition. Note that it remains unclear whether the observed discrepancy across postures reflects the discrepancy between the skin and environmental reference frames or a difference induced by remapping difficulty and/or the tightness of the posture.

Fig. 4. Results of the orientation-description experiment with varied posture. The subjects’ hands/arms were oriented straight ahead (“normal,” represented in red), rotated outward (“divergent,” blue), or rotated inward (“convergent,” green). The subjects were asked to report the perceived orientation of the static stimuli with respect to the stimulator surface (environmental coordinate, represented as arrows in the figure). Note that the result shown for the normal condition is replotted from Fig. 3. (Color figure online)

4 Discussion

Direction-discrimination performance and orientation-description performance at different body sites were measured. Reported orientations when the stimuli were presented on the arm were distorted relative to those observed when the stimuli were presented on the hand (palm). In particular, an inwardly inclined trajectory/shape was perceived as even more inwardly inclined. This distortion cannot be explained simply by the difference in receptor distribution, and it shifted with the skin reference frame. This study showed a clear example in which the representation of simple stimuli differs distinctly when the stimuli are presented on different body sites. There might be a difference between central touch and peripheral touch in terms of computational processing.

Perceptual asymmetry between mediolateral and proximodistal lines/motions on the arm has been reported. Jones et al. [4] presented moving stimuli with a three-by-three array on the volar surface of the forearm and reported that across-arm movement appears to be more easily recognized than along-arm movement. That result suggests that the edges of the arm may serve as landmarks providing localization cues. Stimuli closer to reality (in terms of resolution) were used in this study, and a similar trend was observed: subjects made more mistakes in the two-alternative forced choice between proximal and distal directions on the arm than in that between medial and lateral directions (Fig. 2C). Note that the reported proximal–distal direction was even “flipped” in the present direction-discrimination experiment, and performance varied under both the movement and static conditions in the present orientation-description experiment. These results seem difficult to fully explain in the context of previously proposed hypotheses concerning, for example, the gravitation of anchor points, the stretching of tactile space, receptive-field shape, and the pixel model [1,2,3,4].

The perceptual asymmetry between inwardly and outwardly inclined lines/motions on the arm, on the other hand, has not previously been reported. Studies on distortion in the haptic perception of parallelity on the hand might be relevant to the current findings, though the stimuli and task were not identical. Kappers et al. [8] repeatedly and systematically investigated the distortion of perceived orientation by using a matching task with two oriented bars (a fixed one for reference and a rotatable one for orientation matching). Their findings are consistent with the results presented here in the sense that perceived orientation is not represented isotropically across orientations and the distortion pattern changes according to hand posture. It may also be worthwhile to consider the influence of dermatome differences. Mechanoreceptors on the hand/arm are distributed across multiple dermatomes, and the responses from each dermatome are projected to the brain through an individual spinal segment. Although a previous study reported a minor effect of dermatomes in an intensity-discrimination task on the arm [9], orientation (a spatial relationship) might be computed differently depending on whether the stimuli are presented within or across dermatomes. It seems that our inwardly and outwardly inclined stimuli were presented to a similar extent over two dermatomes (C6 and T1) [10]; however, roll rotation of the arm actually stretches and rotates the skin non-uniformly, so this correspondence remains unclear. In addition, differences in the cortical representation of each skin area (i.e., cortical magnification) may be related to the observed distortion on the arm, since a correlation with the acuity of shape perception on the finger has been reported [11]. These issues await further investigation. It would be useful if the simple tasks used in this study could serve as a probe to reveal the underlying remapping process of spatiotemporal perception by touch.