We believe that there will one day be a programmable ion layer surrounding the body, able to stimulate our individual senses and so enable a truly augmented reality without wearables or special hardware such as screens. This is not pure fantasy: a number of people in the nanotechnology community, such as Storrs Hall (http://discuss.foresight.org/~josh/UFog.html) and Drexler (1992), are exploring methods that would allow the creation of a "Utility Fog" (Storrs Hall) that would, among other things, allow our senses to be stimulated and our movement to be sensed.
Our human-centred work with people with special needs, whether disabled, elderly or in rehabilitation, allows us, in certain circumstances and with certain individuals, to reach a more nuanced understanding of the senses and of what the senses mean. By building on that research and working in cross-disciplinary teams that include neuropsychologists, computer scientists, human-computer interaction (HCI) researchers and others, we hope to work steadily towards that goal.
We believe that immersive "play" within an interactive environment has much potential to do good. Specific to the current work is the belief that each person is so individual that a system is required which can be tailored to each desire, facility and requirement: this entails adaptability with a capital A. Libraries of input HCI devices, together with libraries of mapping devices and libraries of output software, offer the optimal way forward. By making system components invisible (disappearing computers, sensors embedded in environments and so on), users need not fixate on the interactive space itself, and we can map subconscious body function in ways that help people. The subliminal, pervasive aspects of this are obvious; in relation to this work we prefer the terms "proactive computing" (which focuses on improving performance and user experience through speculative or anticipatory actions) and "autonomic computing" (which focuses on improving user experience through the system's self-regulation), since both relate to the user experience rather than to the artefacts often referred to in pervasive computing. We also believe that responsive audio/visual/haptic feedback has much more to offer in the future, over and above what we now utilise; indeed, we believe that correspondences between synchronised feedback, especially sonic and visual in the first instance, are only just scratching the surface and that many new discoveries are waiting to be made.