
1 Introduction

People often need to move toward an unseen goal, and to do so they have to plan their movements by gathering and organizing all the available knowledge about the environment [1]. In other words, they need to form a mental picture, or cognitive map, that in turn supports their movement within the environment. Forming such a mental picture is essential for everyday activities (e.g., going shopping or going to work) [2].

Individuals with visual impairment have to rely on all their senses other than sight for the perception of spatial information and of object attributes such as shape and dimensions [3]. Although visual input is indeed important for spatial coding, lack of it can be compensated for through the development of another sensory modality. For instance, part of this research underlines the effectiveness of touch in specific tasks, which individuals with visual impairments performed as well as or even better than their sighted counterparts [4–6]. Although spatial information is received at a slower rate through touch than through vision, this does not necessarily affect the quality of the representation produced [7].

Cognitive mapping is in effect a process of mental representation of spatial knowledge [8], during which an individual acquires, stores, recalls, and decodes information about the relative locations and attributes of the phenomena in his/her environment [9]. While cognitive mapping of spaces is a prerequisite for developing adequate orientation and mobility skills [10], most of the information required for cognitive mapping is gathered through the visual channel [11].

Individuals with blindness face significant difficulties in orientation and mobility. Most researchers who have examined the spatial performance of individuals with visual impairments and sighted individuals have concluded that visual experience decisively influences spatial behavior [12–14]. Moreover, blindness has a negative impact on the development of blind people's spatial skills [14–16]. However, the cognitive maps of individuals with visual impairments appear to contain basic environmental features such as streets, buildings, parks, fixed obstacles, bus stops, etc. [17], and show that they understand spatial relationships between places when these are presented on a tactile map [18]. Knowing how individuals with visual impairments understand space and which features their cognitive maps contain could help to plan the environment appropriately, make the right information available to them, and improve their wayfinding [18].

The study of spatial knowledge is a complex process, especially considering that cognitive maps are dynamic entities which continuously change and evolve [8]. Techniques for evaluating the spatial knowledge of individuals with blindness are divided into route-based and configurational techniques [19]. A widely used technique for examining configurational knowledge is the reconstruction task, in which individuals with blindness are asked to build a haptic model [18]. However, Kitchin and Jacobson [19] revealed a highly interesting dimension of evaluation techniques for spatial knowledge, indicating that the restrictions each evaluative process imposes, as well as the skills of individuals with blindness related to demonstrating that knowledge [18], may induce specific cognitive results [20].

By supporting the relative localization of objects, maps lead to the acquisition of survey knowledge, knowledge that can be obtained more quickly and with less effort than through direct experience, both by sighted individuals [21] and by individuals with visual impairments [22].

Researchers have pointed out that raised-line graphics of the spatial environment prepare individuals with visual impairment to travel an unfamiliar space more safely and efficiently than working with a verbal description or a sighted guide [23], while demanding a smaller cognitive load than direct experience [24]. Thinus-Blanc and Gaunet [24] also stated that an individual with blindness reading a haptic map is able to maintain a stable reference point. Using points of reference during spatial learning enables allocentric coding, which leads to better spatial performance and knowledge [12, 25].

In the case of individuals with visual impairments, maps contribute to the handling of daily living problems, promoting autonomy, independence, and a better quality of life [23, 26, 27]. Tactile maps are important for spatial awareness [28] of near or distant places, supporting wayfinding [29] and the orientation and mobility of individuals with visual impairments [30], as well as improving spatial cognition in the long term [31].

Research on the optimum support that could be offered to individuals with visual impairments proposes combining the haptic modality with the audio/vocal one [3]. It is indisputable that the amount of information perceived through touch is significantly restricted compared to the amount perceived through sight [3]. To increase the amount of accessible information, multimodal approaches for individuals with visual impairments are being investigated [3]. Hence, multimodal interactive maps could enable people with visual impairment to access spatial knowledge [32].

Zeng and Weber [33] give an overview of the different types of maps used by people with blindness. The authors also mention audio-tactile maps, which combine audio with tactile information. In audio-tactile maps, information can be represented by tactile graphics, audio symbols, tactile symbols, audio-tactile symbols (e.g., a tactile symbol that plays additional spoken information when the user touches it), and Braille labels. Audio-tactile maps become accessible through the use of a touchpad.

A touchpad offers simultaneous access to the benefits of tactile maps and verbal aids. The combination of auditory and tactile information may result in a more complete concept [34]. Landau and his colleagues [34] found that individuals with visual impairments gain a sense of control and independence from the ability to choose between tactile and auditory information delivered through a touchpad.

Moreover, touchpads make it possible to use environmental auditory cues, incorporating, in a way, the soundscape into the tactile map. Including auditory cues in a map may promote an individual's orientation, since individuals with visual impairments have been shown to use auditory cues to determine and maintain orientation within an environment [35, 36] and to associate the soundscape with the structural and spatial configuration of the landscape when creating cognitive maps [37].

2 Study

The aim of the present study is to examine the impact of audio-tactile aids on the improvement of the cognitive maps that individuals with blindness hold of a familiar area. Specifically, the study compares the cognitive maps of individuals with blindness before and after the use of an audio-tactile map.

2.1 Participants

The sample of the study consisted of 20 individuals with blindness (11 males and 9 females). Their ages ranged from 20 to 52 years (M = 37.05, SD = 10.35). Seventeen participants were blind or had severe visual impairments, and 3 were able to detect only very large objects. The visual impairment was congenital for 10 participants and acquired for the remaining 10.

The participants were asked to state how they move in outdoor places in their daily lives by choosing one of the following: (a) with the assistance of a sighted guide, (b) sometimes by myself and sometimes with the assistance of a sighted guide, or (c) by myself, without any assistance. Moreover, the participants were asked to indicate the frequency of their independent movement on a 5-point Likert scale: always, usually, sometimes, seldom, or never. In addition, these two questions were answered by orientation and mobility (O&M) specialists who were familiar with the participants and could assess their ability to move independently. Tables 1 and 2 present the answers of the participants and the O&M specialists.

Table 1. Ability of independent movement according to the answers of participants and O&M specialists; the scores represent the number of participants in each group.
Table 2. Frequency of independent movement according to the answers of participants and O&M specialists; the scores represent the number of participants in each group.

2.2 Instruments

The main research instruments were audio-tactile maps of a familiar city area. An audio-tactile map was created to represent each of three different routes. The routes had approximately the same length and the same number of turns.

The researchers visited each route, recorded the spatial information along it (its absolute location and kind), and selected 30 pieces of information to be mapped. The selection was made in such a way as to ensure that spatial information was present on every street of the route.
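A minimal sketch of such a coverage-constrained selection is shown below, assuming hypothetical landmark records with a "street" field; the actual selection in the study was made manually by the researchers.

```python
import random

def select_landmarks(landmarks, target=30, seed=0):
    """Pick `target` landmarks so that every street keeps at least one.

    `landmarks` is a list of dicts such as
    {"name": "cafe", "street": "Main Street", "position": 120}.
    """
    rng = random.Random(seed)

    # Group the recorded landmarks by the street they belong to.
    by_street = {}
    for lm in landmarks:
        by_street.setdefault(lm["street"], []).append(lm)

    # First pass: guarantee that every street is represented.
    selected = [rng.choice(items) for items in by_street.values()]

    # Second pass: fill the remaining slots from the leftover landmarks.
    remaining = [lm for lm in landmarks if lm not in selected]
    rng.shuffle(remaining)
    selected.extend(remaining[: max(0, target - len(selected))])
    return selected
```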

Moreover, sound was recorded along each route at a fixed time of day, during evening hours, for 20 s at each point. Sound was recorded at the beginning and end of each route, at all intersections, and at some places with distinctive auditory information, such as a school, a café, a car wash, etc. For the recording, a Telinga Stereo DAT-Microphone was used together with a Zoom H4n Handy Recorder.

Adobe Illustrator CS6 was used to create the digital tactile maps. These maps were then printed on microcapsule paper, producing the 3 tactile maps (one for each route). On each tactile map, dots were placed at the locations of spatial information, and short vertical lines were placed at the locations where sounds had been recorded. The street names and the spatial information were presented through synthetic speech; there were no Braille labels.

The software application Iveo Creator Pro 2.0, together with a touchpad device, was used to develop the audio-tactile maps. Both are products of the "ViewPlus® Technologies" company. The files produced by the software are saved in Scalable Vector Graphics (SVG) format. The touchpad is a pointing device with a specialized surface that translates the position of a user's fingers to the corresponding position on the computer screen. When used in combination with a tactile image, this device has the potential to offer tactile, kinaesthetic, and auditory information at the same time [38].
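To illustrate this interaction model, the following minimal sketch maps a touch position to the map element under the finger and triggers the corresponding audio. The element kinds, the `speak` and `play_sound` callbacks, and the bounding-box representation are hypothetical; this is not the Iveo software's actual API.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

@dataclass
class MapElement:
    kind: str                                 # "street", "info_dot", or "sound_line"
    label: str                                # street name, spoken information, or sound file
    bbox: Tuple[float, float, float, float]   # (x_min, y_min, x_max, y_max) in map units

    def contains(self, x: float, y: float) -> bool:
        x0, y0, x1, y1 = self.bbox
        return x0 <= x <= x1 and y0 <= y <= y1

def handle_tap(x_pad: float, y_pad: float,
               pad_size: Tuple[float, float],
               map_size: Tuple[float, float],
               elements: List[MapElement],
               speak: Callable[[str], None],
               play_sound: Callable[[str], None]) -> Optional[MapElement]:
    """Translate a touchpad position into map coordinates and trigger audio."""
    # Scale the finger position on the pad to the coordinate system of the map.
    x = x_pad / pad_size[0] * map_size[0]
    y = y_pad / pad_size[1] * map_size[1]

    for element in elements:
        if element.contains(x, y):
            if element.kind == "sound_line":
                play_sound(element.label)   # environmental recording
            else:
                speak(element.label)        # street name or spatial information
            return element
    return None
```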

To read the audio-tactile maps, participants used a laptop, a touchpad device, and headphones through which they listened to the audio information (street names, spatial information, and sounds).

In the phase in which their cognitive maps were depicted, the participants used a range of materials. These included a kappa fix carton onto which an A3 sheet was fastened, string placed in the position of the streets, thumbtacks used to fasten the string and bend it where there were turns, and thumbtacks of a different type placed at the positions of obstacles.

2.3 Procedures

The examination procedure was carried out individually in a quiet environment. Initially, participants were informed about the procedure of the experiment and the haptic model they should create at the end.

The subjects participated in two experiments. In the first experiment, the participants depicted their cognitive maps of a familiar city route on a haptic model using the materials provided. In the second experiment, the participants read the audio-tactile map of the same route and were then asked to depict their cognitive map of the route anew on a haptic model. Thus, each participant was examined on the same route before and after reading the audio-tactile map.

The experiments were not all conducted on one day, to prevent fatigue from affecting the results. A circular design was applied with respect to the routes: the first participant was examined on the cognitive map of the first route, the second participant on the second route, the third participant on the third route, the fourth participant on the first route, and so on. This design was applied in order to avoid errors resulting from differences in the difficulty of the areas.
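A minimal sketch of this circular (round-robin) assignment, assuming participants are simply numbered in the order in which they were tested:

```python
def assign_routes(num_participants: int, routes=("route_1", "route_2", "route_3")):
    """Assign each participant to a route in a repeating cycle."""
    return {p: routes[(p - 1) % len(routes)] for p in range(1, num_participants + 1)}

# Example: 20 participants cycle through the 3 routes.
assignment = assign_routes(20)
print(assignment[1], assignment[2], assignment[3], assignment[4])
# route_1 route_2 route_3 route_1
```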

In the second experiment, the examination involved the participants reading the audio-tactile map through the use of the touchpad device and then depicting their cognitive map. Initially, the tactile map was placed on the touchpad device and a familiarization process took place. The audio-tactile map reading phase then followed. Each participant read the audio-tactile map using touch: by tapping the streets he/she listened to their names, by tapping the dots he/she listened to the information they represented, and by tapping the small vertical lines he/she heard the sounds of the particular area.

The maximum time allowed for map reading was 15 min, during which participants had to learn the route, the street names, and the 30 pieces of spatial information. They could return to the map and listen to the information as many times as they wished within the 15 min, and they could stop reading before the 15 min had elapsed. A five-minute pause followed. The participants then used the materials provided by the researcher to depict their cognitive map.

At the end of each experiment, the participants created a haptic model representing their cognitive map of the route under examination. There was no time limit for the creation of the haptic model. Each time a participant touched an item on the haptic model, the researchers pointed out what the item stood for, so that he/she could review it.

After the completion of the haptic model, the researchers drew the maps by tracing the outline of the haptic model's materials onto the A3 sheet. The recording and analysis of the data on the cognitive maps followed.

During the processing of the cognitive maps (haptic models), the following variables were recorded by the researchers and scored for accuracy: (a) the number of streets, (b) the names of the streets, and (c) the amount of spatial information participants placed on the haptic model. Specifically, with respect to the streets, the variables examined were how many streets participants placed correctly and how many street names were identified correctly. Regarding the spatial information, the variable "correct information" was calculated; a piece of information was considered correct when both its kind and its location on the street were defined accurately.
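A minimal sketch of this scoring scheme, assuming hypothetical dictionaries describing the reference route and a participant's haptic model (the actual scoring in the study was carried out by the researchers by hand, and the exact criteria for "placed correctly" are paraphrased here):

```python
def score_model(reference, model):
    """Count correctly placed streets, correct street names, and correct information.

    `reference` and `model` are dicts of the form
    {"streets": {"Main Street": {"order": 1}},          # street name -> position in the route
     "info": {("cafe", "Main Street")}}                 # (kind, street) pairs
    """
    # A street counts as placed correctly if it appears at the right position in the route.
    streets_correct = sum(
        1 for name, props in model["streets"].items()
        if name in reference["streets"]
        and props["order"] == reference["streets"][name]["order"]
    )
    # A street name counts as correct if it exists on the reference route at all.
    names_correct = sum(1 for name in model["streets"] if name in reference["streets"])
    # A piece of information counts as correct if both its kind and its street match.
    info_correct = sum(1 for item in model["info"] if item in reference["info"])
    return {"streets": streets_correct, "names": names_correct, "info": info_correct}
```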

3 Results

The findings arise from the comparison of the participants' cognitive maps before and after reading the audio-tactile map, with respect to 3 variables: "number of streets-correct," "street names-correct," and "information-correct." The means and standard deviations (SD) of the scores are presented in Table 3. Moreover, repeated-measures ANOVAs were conducted for the 3 variables: number of streets-correct, street names-correct, and information-correct.

Table 3. Mean (M) and standard deviation (SD) of correct answers regarding the number of streets, street names, and spatial information.

The repeated-measures ANOVAs revealed significant differences for the variables number of streets-correct [F(1, 19) = 9.000, p < .01] and information-correct [F(1, 19) = 9.408, p < .01].
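A minimal sketch of how such a repeated-measures ANOVA (one within-subjects factor with two levels, giving F(1, 19) for 20 participants) could be computed is shown below. The before/after scores are hypothetical placeholders, not the study's data, and statsmodels' AnovaRM is used only as one possible tool; the paper does not state which statistical package was employed.

```python
import pandas as pd
from statsmodels.stats.anova import AnovaRM

# Hypothetical scores: one value per participant per condition (before/after).
before = [4, 5, 3, 6, 4, 5, 4, 3, 5, 6, 4, 5, 3, 4, 5, 6, 4, 3, 5, 4]
after  = [6, 6, 5, 7, 6, 6, 5, 5, 6, 7, 6, 6, 5, 6, 6, 7, 6, 5, 6, 6]

# Long-format table: one row per participant per condition.
data = pd.DataFrame({
    "subject": list(range(20)) * 2,
    "condition": ["before"] * 20 + ["after"] * 20,
    "score": before + after,
})

# One within-subjects factor ("condition") over 20 subjects yields F(1, 19).
result = AnovaRM(data, depvar="score", subject="subject", within=["condition"]).fit()
print(result)
```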

4 Conclusions

The results of the present study highlight the contribution of audio-tactile maps to the improvement of the existing spatial knowledge of individuals with visual impairments. It could be assumed that sound and/or the soundscape play a supportive role for memory, and that in combination with touch they lead an individual to better organize the storage and recall of spatial information. Theorists have already suggested that combining tactile and auditory information may lead to a more complete concept [34].

It should be noted that the assessment tool used in the present study might have influenced the results. Transferring a cognitive map formed from a map onto a haptic model requires no special scale adaptation, whereas transferring a cognitive map created through walking experience in the physical environment onto a haptic model may entail an inherent difficulty related to scale adaptation. Future research should examine and compare the cognitive maps of individuals with visual impairments using tools or methods at multiple scales, for instance by monitoring walking behavior in the physical environment.

Moreover, it is worth noting that the divergence between the cognitive map before and the cognitive map after the audio-tactile map could have been greater if the participants had been skilled users of the touchpad and the audio-tactile map. Repeated use of the aid could improve the ability of individuals with blindness to code information into cognitive maps.

The results of the present study add a new dimension to orientation and mobility training. Audio-tactile maps appear to be quite a supportive tool in the cognitive mapping of an area, and therefore a significant aid for orientation and mobility within familiar areas.

The research presented in this paper constitutes part of an extended study completed in the context of the Research Funding Project "THALIS - University of Macedonia - KAIKOS: Audio and Tactile Access to Knowledge for Individuals with Visual Impairments". A series of publications has arisen from this project, some of which may resemble each other. Several similarities exist between the present paper and the paper "The contribution of audio-tactile maps to spatial knowledge of individuals with visual impairments" [39]. However, both the research aim and the research area differed between the two studies, and the participants were not the same. Papadopoulos and Barouti [39] studied the ability of individuals with blindness to create cognitive maps of a city area through the use of audio-tactile maps; that area was unfamiliar to the participants, so there was no previous spatial knowledge of it. The aim of the present study, on the other hand, was to compare the cognitive maps of individuals with blindness before and after the use of an audio-tactile map; that is, we examined the impact of audio-tactile aids on the improvement of cognitive maps that the participants had already developed for a familiar area. Furthermore, the main research instrument of the present study was audio-tactile maps of a familiar city area, whereas in the research of Papadopoulos and Barouti [39] the main research instrument was audio-tactile maps of an unknown city area.