Introduction

Humans perceive their own body and the surrounding environment through several sources of information that reach the brain simultaneously. Nevertheless, the nervous system (NS) is remarkably capable of organizing and processing all this information. The processed sensory cues may concern complementary aspects of the environment; in this case, the brain gathers different pieces of information that are combined to disambiguate the perceptual event [1,2,3]. Alternatively, the signals may carry redundant information, e.g. visual and proprioceptive cues that both provide knowledge about the position of the hand in space. In this second case, the NS uses the redundant information to obtain percepts of the environment and the body that are more robust than those achievable with unisensory cues alone. This process is known as sensory integration (Fig. 1), and an optimal model of this integration mechanism was provided in the seminal work by Ernst & Banks [4•].

After an amputation, however, the somatosensory modality related to the missing limb is lost, with dramatic consequences. As demonstrated by seminal experiments [5], vision is not suited to extracting tactile information from the environment. Accordingly, the manual dexterity and motor ability of individuals with reduced somatosensory sensation are severely impoverished [6•], even when the limb is replaced by a robotic prosthesis. A first important objective in designing new and advanced prostheses is therefore to restore the lost physical abilities to users. Recently, implantable neural interfaces have yielded promising results in providing somatosensory feedback to amputees. Knowing whether and how this artificial sensory feedback is combined with other sensory information has thus become a fundamental step toward re-establishing the natural flow of multisensory information that existed prior to the amputation. The intact human NS is able to recognize when multisensory information relates to the same object and should therefore be merged and integrated into a unique percept. But how does it decide that it is reasonable to integrate the sensory signals? How does the brain know when the signals come from a unique object or body part, and when they arise from separate objects of perception? A straightforward answer to this question is still missing; yet more accurate knowledge about this topic is needed to design fully body-integrated artificial limbs.

Furthermore, artificial sensory information can modify the perception of the prosthesis itself: it favors a more natural perception, affecting the central representation of the patient's body so that the device is processed more like a real limb [7,8,9]. Yet, to date, many limb amputees reject current artificial limbs or use them only sporadically, both because of their limited controllability and because they are not yet perceived as real limbs [10], forcing users to rely only on visual, rather than tactile and proprioceptive, feedback [11, 12]. In addition, the loss of somatosensory sensation (after an amputation, a stroke, or in deafferented individuals) has been shown to be linked with drastic changes in the perception of the body. The incorporation of prosthetic devices is further hindered by persisting distorted sensations of the missing limb (phantom limb syndrome) [11, 13], which strongly impact the quality of life of amputees [14] and limit prosthesis acceptance [13].

The exact etiology of these phenomena is not clear, and recent literature suggests that they might be linked with an impairment of multisensory signal processing [15••, 16, 17]. Reframing the problem in the probabilistic terms of perceptual integration with which we started, we might ask whether the abnormal sensation of not perceiving the arm as one's own (alien limb/somatoparaphrenia, or lack of embodiment) can be interpreted as a failure of the nervous system to put together sensory information that should actually be combined. Likewise, disturbances in body awareness, such as the phantom perception of a limb with an abnormal size, shape or position, might result from an imbalance in the weighting with which information is integrated by the brain.

Fig. 1

Multisensory experience in humans. Mechanisms and perceptual consequences of multisensory processing are illustrated. On the right, the unisensory processing of visual (top) and somatosensory (bottom) information is shown: sensory receptors collect sensory information and transmit it to the brain, where it is first transformed into common coordinates and units (i.e. combined) and then integrated to obtain more robust percepts. An optimal integration model quantitatively describing integration between and within sensory modalities was provided by Ernst & Banks (2002) (top left). Finally, multisensory information contributes to building robust percepts of the external environment and of our own body (bottom left). Adapted with permission from [19] and [6•]
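
In quantitative terms, the Ernst & Banks model combines two redundant estimates by weighting each cue according to its reliability (inverse variance). Using generic notation (ours, not the figure's), with $\hat{S}_V$ and $\hat{S}_H$ the visual and haptic estimates of a property $S$, and $\sigma_V^2$, $\sigma_H^2$ their variances:

\[
\hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \qquad
w_H = 1 - w_V,
\]
\[
\sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2} \leq \min\left(\sigma_V^2, \sigma_H^2\right).
\]

The combined variance is never larger than the smaller of the two unimodal variances, which is why integrated percepts are more robust than unisensory ones.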

The multisensory integration of artificial somatosensory feedback in hand prostheses holds great potential for tailored therapies and treatments that promote prosthesis embodiment while reducing perceived phantom limb distortions [18]. The next sections offer an overview of the features that future neuroprostheses must have in order to be fully integrated and to completely restore perceptual abilities. They also discuss how research on advanced, body-integrated technologies has proven to be a promising new way of answering basic research questions about the mechanisms of multisensory integration and bodily self-consciousness.

Multisensory Integration in Neuroprosthetics

A major goal of somatosensory neuroprosthetics is to design artificial limbs that are experienced ("embodied") as real limbs [10]. In the case of an upper limb amputation, the sensory system of the subject is severely impaired, compromising his/her motor repertoire. As sensory feedback is pivotal for achieving seamless and effective motor control, it stands to reason that a prosthesis would function better if users could rely on feedback from it [20]. While commercial prostheses restore an acceptable level of grasping function, they do not provide any sensory feedback to the user (i.e. there is no sensory-motor integration).

Different technological and surgical approaches have been developed to restore missing sensory information in people with limb amputation. Invasive solutions (where a surgical intervention is required to exploit the technology) and non-invasive solutions (technology in contact only with the skin) have been proposed as promising tools to improve the functional, health-related and cognitive abilities of people suffering from limb loss. Non-invasive strategies exploit vibrators [21], electro-cutaneous stimulators [22] or mechanical probes [23] to deliver meaningful sensory feedback to prosthetic users in real time. Previous studies have shown interesting results with non-invasive sensory feedback systems, but their application outside the laboratory environment, together with metrics detailing health and functional benefits, has not yet been demonstrated. More recently, robotic devices directly connected to the human peripheral nervous system through so-called neural interfaces have repeatedly been shown to restore a meaningful tactile experience in patients affected by sensory-motor deficits, such as limb amputees [6•] (Fig. 2). These neuroprosthetic devices use electrical neural stimulation to activate sensory afferents in the nerves, eliciting sensations referred directly to the phantom limb (i.e. somatotopic sensations) [24,25,26,27,28,29,30].
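
To make the feedback chain concrete, the minimal sketch below maps a prosthetic fingertip force reading onto the amplitude of a fixed-rate stimulation train. The numerical ranges, the linear mapping and the function name are illustrative placeholders and are not taken from any of the cited implants, where such parameters are calibrated per subject during clinical mapping sessions.

```python
import numpy as np

# Placeholder limits: real systems derive these from per-subject calibration.
MIN_AMP_UA, MAX_AMP_UA = 40.0, 200.0   # stimulation amplitude range (microamperes)
FORCE_RANGE_N = (0.0, 10.0)            # expected fingertip force range (newtons)
PULSE_RATE_HZ = 50.0                   # fixed stimulation frequency

def force_to_amplitude(force_n: float) -> float:
    """Linearly map a force reading to a stimulation amplitude, clipping
    out-of-range readings so the stimulator never exceeds its safe bounds."""
    lo, hi = FORCE_RANGE_N
    norm = np.clip((force_n - lo) / (hi - lo), 0.0, 1.0)
    return MIN_AMP_UA + norm * (MAX_AMP_UA - MIN_AMP_UA)

# Example: a 1-s simulated grasp-and-release force profile sampled at 100 Hz
# becomes an amplitude envelope, which a stimulator would discretize into
# charge-balanced biphasic pulses delivered at PULSE_RATE_HZ.
t = np.linspace(0.0, 1.0, 100)
force = 5.0 * np.sin(np.pi * t) ** 2
amplitude = np.array([force_to_amplitude(f) for f in force])
print(amplitude.round(1)[:5])
```

Real encoding strategies are considerably richer (e.g. modulating pulse width or frequency, or using biomimetic models of afferent firing), but the principle of translating sensor readouts into stimulation parameters in real time is the same.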

Fig. 2

Somatosensory neuroprosthesis. A peripheral neural interface is inserted into the proximal part of the ulnar nerve, through the exposed nerve fascicles, and delivers the elicited sensory stream to the spinal cord (top left). Sensor data are acquired in real time and translated by a specific encoding strategy into stimulation pulses that elicit sensations (somatosensory feedback; bottom). The photo on the right shows a peripheral neural link between the prosthesis and the sensory nerves that allows for high prosthesis embodiment (a more body-integrated perception of the artificial limb)

The implantation of neural interfaces into the nerves provides a direct link between external robotic devices (e.g. prostheses) and the brain. The ideal artificial sensory signal would fulfill the same functions that our sensory system has in natural motor control: providing sufficient information to allow competent performance in the absence of other sensory inputs, and permitting multisensory integration with vision to reduce information variability when both signals are available [31••]. In other words, the artificial sensory feedback provided to the prosthesis user should be well integrated into sensory-motor control and with the other sensory modalities [32]. In the recent past, multisensory integration of restored sensory feedback in amputees has been studied with psychophysical methods using both invasive and non-invasive technologies [15••, 33, 34••, 35••, 36••, 37••, 38]. These studies adopted a rigorous, standard methodology to investigate quantitatively whether the artificial feedback was integrated with the residual sensory information (e.g. vision) following the same general principle as in the intact NS [4•].

Dadarlat and colleagues (2015) [31••] examined optimal multisensory integration in non-human primates. The primates underwent a long training regime (approximately 20,000 trials for the first monkey and 40,000 for the second, at which point the animals showed clear evidence of integrating the visual and artificial tactile signals) to learn an unnatural mapping between a multichannel intracortical microstimulation (ICMS) signal and the direction of hand movement. Their results show that the ICMS information about hand position is integrated with vision to form an optimal estimate of hand movement direction. Furthermore, Risso et al. [34••] adapted the standard methodology [4•] to investigate whether an individual with a neural implant for sensory feedback restoration was able to integrate the artificial information as if it were the natural sense of touch. In a short training session (less than 10 min), the sensations elicited by the neural stimulation were mapped onto the phantom hand, preserving somatotopic information. In a preliminary sensation characterization test, however, the participant reported a graded sensation of vibration during the neurostimulation, i.e. a modality-matched (somatosensory) but not completely natural sensation. Interestingly, the patient was nevertheless able to optimally integrate the artificial somatosensory information to discriminate objects of different sizes, showing that fully natural sensations are not a necessary condition for optimal integration to occur. The same psychophysical method with short training was adopted to investigate visuo-tactile integration of vibration information in lower limb amputees provided with non-invasive tactile feedback. In contrast to the previous study, here the somatosensory information was homologous (the vibration sensation was related to a vibrating object, so vision and touch matched perceptually). However, the multisensory signals were not spatially matched, since the visual information was provided through virtual reality on the (virtual) foot, while the tactile sensation was delivered through electro-cutaneous stimulation on the stump (remapped feedback). Nonetheless, even with these specific features, optimal integration was observed in all participants [39••].
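
The psychophysical logic shared by these studies can be sketched as follows: unimodal discrimination thresholds are measured first, the MLE model then predicts the cue weights and the bimodal threshold, and that prediction is compared with the threshold actually measured in the bimodal (visuo-tactile) condition. The numbers below are invented purely for illustration and do not come from the cited experiments.

```python
import numpy as np

def mle_prediction(sigma_v: float, sigma_t: float):
    """Return the MLE-predicted visual weight, tactile weight and the
    predicted bimodal standard deviation, given the unimodal sigmas."""
    w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_t**2)
    sigma_vt = np.sqrt((sigma_v**2 * sigma_t**2) / (sigma_v**2 + sigma_t**2))
    return w_v, 1.0 - w_v, sigma_vt

# Hypothetical unimodal thresholds (in mm) for a size-discrimination task.
sigma_vision, sigma_touch = 4.0, 6.0
w_v, w_t, sigma_pred = mle_prediction(sigma_vision, sigma_touch)
print(f"visual weight = {w_v:.2f}, tactile weight = {w_t:.2f}")
print(f"predicted bimodal sigma = {sigma_pred:.2f} mm")

# Optimal integration is supported when the empirically measured bimodal
# threshold is statistically indistinguishable from sigma_pred and lower
# than both unimodal thresholds.
measured_bimodal_sigma = 3.4   # invented value, for illustration only
print(f"measured bimodal sigma = {measured_bimodal_sigma:.2f} mm")
```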

Taken together, these studies show that humans exploiting artificial sensory feedback behave very similarly to how they behave with an intact nervous system in multisensory tasks. More precisely, these findings show that the NS seems to combine multisensory information in a statistically optimal fashion, weighting the available sensory cues according to their reliability. For example, visual dominance occurs when the variance associated with the visual estimate is lower than that associated with the haptic estimate, while haptic dominance occurs in the opposite case. Besides providing evidence that optimal integration between artificial sensory feedback and the other, natural sensory signals is possible, recent literature also shows the cognitive benefits of such brain-machine interfaces. Indeed, Risso et al. have shown that artificial sensory feedback reduces sensory processing time in a discrimination task: the speed-up occurs when artificial and natural signals are provided together, but not when only vision is available (even though vision is the dominant sense) [39••].

Interestingly, by vibrating the muscles of amputees through a neural-machine interface, Marasco et al. elicited illusory movement percepts [35••]. This natural kinesthetic feedback was rapidly integrated by participants and enabled clear improvements in their movement control. Moreover, the subjective sense of agency (defined as the feeling of control over actions and their consequences) associated with actions generated via intracortical brain-machine interfaces has recently been investigated and shown to be relevant for clinical applications of brain-machine interfaces [17]. It is noteworthy that these last studies [35••, 39••] highlighted multisensory benefits in terms of speed of information processing and manual dexterity. However, integrating information across sensory systems is also critical to bodily self-consciousness and to building a coherent representation of one's own body [40, 41]. Accordingly, the benefits of a multisensory integration approach to prosthesis embodiment and to the treatment of phantom sensations are discussed in the next section.

Embodiment and Cognitive Integration

Current artificial limbs are often used only sporadically because of their limited functional utility and controllability in daily-life activities. Moreover, they are often not perceived as real limbs, since they are not experienced as part of one's own body (low prosthesis embodiment). In addition, prosthesis embodiment and acceptance, as well as amputees' quality of life, are further limited by persisting distorted sensations of the missing limb (i.e. phantom limb syndrome) [13]. These limitations, together with the difficulty of treating phantom limb syndrome, significantly reduce the potential clinical impact of current prostheses and limit their usability in real-life situations.

Recent research has linked bodily self-consciousness to the processing and integration of multisensory bodily signals [40,41,42]. Indeed, activation of multisensory brain areas and multimodal neurons has been found in participants reporting abnormal perception of their body [43,44,45]. Further experimental support for this claim comes from the so-called rubber hand illusion (RHI) [46]. These classical studies demonstrated that when healthy participants watch an artificial hand being stroked in synchrony with strokes applied to their own hidden hand (placed behind a barrier), they report feeling the touch as coming from the artificial hand and the artificial hand as being their own [43, 46, 47]. These changes in tactile perception and hand ownership are often accompanied by a drift in the perceived position of one's own hand toward the artificial hand (i.e. proprioceptive drift). Upper limb amputees also reportedly experience the RHI [48, 49], suggesting the potential of multisensory stimulation for boosting the embodiment of artificial limbs in amputees [32].

Interestingly, Rognini and colleagues [15••] investigated the effect of multisensory integration on prosthesis embodiment and phantom limb perception in two upper-limb amputees implanted with neural interfaces in the median and ulnar nerves. Visual information was provided in virtual reality, while tactile information was provided through direct nerve stimulation. Notably, the multisensory stimulation reduced the distorted phantom limb perceptions in both subjects, thereby reducing the "telescoping effect" [13]. This result was also confirmed in more natural scenarios, when the sensory feedback was implemented in a closed-loop prosthesis and exploited in real time during motor tasks [33]. Furthermore, prosthesis embodiment has also been studied longitudinally in an amputee receiving feedback through intraneural and perineural multichannel electrodes implanted in her stump. Results showed that intraneural stimulation produced an extension of peripersonal space (i.e. improved embodiment of the prosthesis) regardless of the level of anthropomorphism of the prosthesis. In this study, the authors also measured the crossed-hands effect in a Temporal Order Judgement (TOJ) task: two stimuli were presented at varying stimulus onset asynchronies, and the participant was asked to indicate their temporal order while wearing different types of prosthesis. Interestingly, the expected worsening of TOJ performance was observed only when the participant was wearing her most-used prosthesis, even though it was the less anthropomorphic one, highlighting the relevance of training and daily use for prosthesis embodiment [36••].
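
To make the TOJ measure concrete, the sketch below fits a two-parameter cumulative Gaussian to hypothetical response proportions; the data points are invented and the fit omits refinements (e.g. lapse rates) that such studies typically include.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Stimulus onset asynchronies (ms); negative = prosthesis-side stimulus first.
soa = np.array([-200, -90, -55, -30, 30, 55, 90, 200], dtype=float)
# Hypothetical proportions of "intact side first" responses (not real data).
p_resp = np.array([0.05, 0.15, 0.30, 0.45, 0.60, 0.75, 0.90, 0.97])

def cum_gauss(x, pss, sigma):
    """Cumulative Gaussian psychometric function: pss is the point of
    subjective simultaneity, sigma indexes temporal resolution."""
    return norm.cdf(x, loc=pss, scale=sigma)

(pss, sigma), _ = curve_fit(cum_gauss, soa, p_resp, p0=[0.0, 60.0])
print(f"PSS = {pss:.1f} ms, sigma = {sigma:.1f} ms")
# The crossed-hands deficit reported in the study would appear as a larger
# fitted sigma (poorer temporal-order sensitivity) in the crossed posture.
```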

The implications of such sensory feedback restoration were also recently explored in lower-limb amputees [25, 37••]. Adding neural feedback to prosthetic legs not only enhanced prosthesis embodiment but also had beneficial effects on cognitive integration. These studies reported increased embodiment of the lower-limb prosthesis, measured both as a decrease in perceived phantom-leg displacement and as higher scores on subjective questionnaires assessing embodiment [25, 39••]. In addition, Petrini et al. [25] used electroencephalographic (EEG) recordings to demonstrate a reduction of cognitive effort during a dual-task paradigm. In this task, the subjects had to walk while listening to tones and paying attention to the higher-pitched ones. Meanwhile, an event-related potential (ERP) component of the EEG was measured to determine whether the mental capacity left available by walking was higher or lower with and without sensory feedback. Results indicated that with feedback users had more spare mental capacity, comparable to counting the tones while sitting, whereas without feedback their available mental resources were reduced. With feedback, therefore, amputees can walk while attending to activities other than controlling the device. Brain activity measurements and psychophysical tests revealed that the neuroprosthesis is perceived as an extension of the body, like a real limb. This effect is also associated with the perceived weight of the prosthesis: amputees typically perceive prostheses as heavy, despite their being lighter than natural limbs. Preatoni et al. [37••] showed that intraneural sensory feedback decreases the subjectively perceived weight of the prosthesis, an effect attributed to cognitive integration of the sensory feedback (demonstrated in a dual task) and resulting in increased embodiment of the device.
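
As a rough illustration of the dual-task logic, the sketch below averages single-channel EEG epochs time-locked to the target tones and compares the amplitude of a late component between the feedback and no-feedback walking conditions. The sampling rate, latency window, array shapes and simulated amplitudes are placeholders, not parameters or results from the cited study.

```python
import numpy as np

FS = 250                    # sampling rate in Hz (placeholder)
T0, T1 = 0.30, 0.50         # latency window (s) of the late ERP component

def erp_amplitude(epochs: np.ndarray) -> float:
    """Mean amplitude of the averaged ERP in the chosen window.
    epochs: (n_trials, n_samples) baseline-corrected data from one parietal
    channel, time-locked to target-tone onset."""
    evoked = epochs.mean(axis=0)
    i0, i1 = int(T0 * FS), int(T1 * FS)
    return float(evoked[i0:i1].mean())

# Simulated data standing in for real recordings (not actual results).
rng = np.random.default_rng(0)
n_trials, n_samples = 80, int(0.8 * FS)
with_fb = rng.normal(0.0, 5.0, (n_trials, n_samples))
with_fb[:, int(T0 * FS):int(T1 * FS)] += 6.0      # larger target-locked component
without_fb = rng.normal(0.0, 5.0, (n_trials, n_samples))
without_fb[:, int(T0 * FS):int(T1 * FS)] += 3.0   # attenuated component

print(f"with feedback:    {erp_amplitude(with_fb):.1f} uV")
print(f"without feedback: {erp_amplitude(without_fb):.1f} uV")
# A larger target-locked component under dual-task load is interpreted as more
# spare attentional capacity, i.e. walking with feedback demands fewer resources.
```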

Overall, these studies show that the integration of artificial somatosensory feedback in upper and lower limb amputees is effective in increasing embodiment and in reducing abnormal perception of the body. Furthermore, the benefits of multisensory stimulation on bodily perception have recently been extended to anorexia nervosa [50] and to stroke patients [16], showing the potential of restoring multisensory integration for the treatment of a variety of pathologies involving loss of somatosensory sensation or abnormal body perception.

Discussion and Perspectives

The current literature shows that artificial and natural sensory information can be optimally integrated, following the same integration pattern as when both information sources are natural. Restoring the missing sensory modality and integrating it with the residual ones has shown benefits for multiple aspects of recovery, such as dexterity, speed of information processing, treatment of phantom distortions and increased embodiment.

Notably, despite dealing with artificial information, the nervous system is remarkably skilled at this integration mechanism and is able to combine information differing widely in (i) the type of sensation (natural vs. non-natural), (ii) the amount of training (short vs. long) and (iii) the spatial matching of the multisensory signals (congruent vs. non-congruent). This last point is quite surprising, given that traditional research on multisensory processing identified spatial and temporal congruence as fundamental prerequisites for perceptual integration [51]. In cognitive neuroscience, there has been longstanding interest in the mechanisms the human brain uses to decide when multiple sensory signals should be merged into a unique percept and when they should not (the binding problem) [52, 53]. The most recent proposals suggest that, for sensory signals to be integrated, they must be assumed to belong to the same object (the unity assumption); this belief subsumes the features previously investigated as potentially fundamental (e.g. spatial and temporal congruency, the experimenter's instructions, semantic congruency) [54]. Indeed, it is impossible to reconstruct the environment by relying on sensory information alone; prior knowledge (which may be unconscious) is needed to interpret the world, as formalized in Bayesian theory [55, 56].

Not surprisingly, the same Bayesian approach to stimulus processing has recently appeared in the literature on body consciousness. These studies investigate whether the processing of multisensory stimuli related to body parts can be formalized with the same rules that apply to the perception of external objects, as described by Bayesian models [57]. The studies presented above on the optimal integration of artificial feedback show that the same rules apply to the processing of sensory information related to external objects and to the body. However, the optimal multisensory integration described by Ernst & Banks is an entirely bottom-up process, captured by the maximum likelihood estimation model. We suggest that prior information, and in particular the prior of common cause (i.e. the prior belief about whether multisensory signals originate from the same source or not), might differ between healthy participants and amputees. The observation of optimal integration suggests that participants believed that the visual and tactile cues belonged to a single object, namely their own limb. Future studies should investigate whether the integration of spatially non-congruent stimuli is also possible in able-bodied subjects, who still have proprioceptive information that would conflict with the artificial information. This would, on the one hand, allow identifying differences in stimulus-processing mechanisms between healthy subjects and amputees; on the other hand, it could offer an experimental tool for measuring embodiment objectively and quantitatively, since information would be integrated according to the optimal integration model only if it were considered to come from the same limb.
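
The prior of common cause can be made explicit with the standard Bayesian causal-inference formulation (in the spirit of the Bayesian models discussed above [55,56,57]); the notation here is generic rather than taken from a specific cited study. Writing $x_V$ and $x_T$ for the visual and tactile measurements, $C = 1$ for a single common source (e.g. one's own limb) and $C = 2$ for independent sources:

\[
P(C = 1 \mid x_V, x_T) =
\frac{P(x_V, x_T \mid C = 1)\, P(C = 1)}
{P(x_V, x_T \mid C = 1)\, P(C = 1) + P(x_V, x_T \mid C = 2)\,\bigl(1 - P(C = 1)\bigr)},
\]

where $P(C = 1)$ is the prior of common cause. Full fusion according to the MLE rule is warranted only when this posterior is high; a systematic difference in $P(C = 1)$ between amputees using artificial feedback and able-bodied participants is precisely the kind of effect the experiments proposed here could reveal.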

Conclusions

Neurotechnologies utilizing electrical stimulation allow direct communication with the nervous system, providing an artificial touch experience to people with limb amputation. The artificial sense should be optimally integrated with the residual senses, as happens in the natural system. This would give users a more natural sensory experience, enhancing the beneficial effects of these technologies on patients' motor, sensory and cognitive abilities. Indeed, such sensory restoration fosters better prosthesis embodiment and greater overall cognitive integration, with direct positive consequences for the quality of life of people with limb amputation.