1 Introduction

The use of touch as a mode of communication was proposed by Geldard in the twentieth century [1, 2]. Touch can be an effective channel for interaction and perception. Sighted people often do not realize the extent to which the sense of touch helps them learn about their surroundings; they perceive their surroundings mostly with their eyes and ears. People who are visually impaired, on the other hand, must rely on their non-visual senses in their daily activities. These non-visual senses include hearing, taste, smell, and touch. In the absence of visual ability, the sense of hearing is used primarily [3]. However, speech feedback may not be very useful in public places and other noisy environments [4]. The use of the tactile senses [5] can be beneficial for people who are visually impaired and for people with both visual and hearing impairment. The sense of touch has long guided humans through the reception of non-visual stimuli such as pressure, pain, and changes in temperature, informing movement in the surrounding space as well as the positioning and movement of the limbs [6]. Haptic technology combines the use of the tactile senses with the fields of computer science and engineering [7].

Jones and Sarter [8] provide comprehensive guidance on the use of tactile displays for human–computer interaction. One of the seminal contributions on the use of haptics for people with visual impairment is the work of Bach-y-Rita et al. [9].

Vibrotactile devices have mainly been used for guiding direction while walking and for other complex navigational tasks for people who are visually impaired [10]. People who are visually impaired develop sharper working memory [11] and have better tactile acuity than sighted people [12, 13].

People who are visually impaired can use screen readers and text-to-speech converters to access the web. Such accessible technologies are helpful, but they occupy the user's sense of hearing.

In this work, we aim to make the digital world accessible to people who are visually impaired and to people with both visual and hearing impairment, by presenting a framework (HELF) based on vibration patterns. HELF enables people with visual impairment to read and understand digital text on their screens through different patterns of short and long vibration pulses. Simple swiping gestures can be used to enter digital text with HELF. Reading and writing using HELF may initially demand time and patience to learn, but can be highly beneficial once learned. HELF can potentially be an effective mode of communication for our intended users and can give them convenient access to the digital world. We developed a new haptic language for the framework (named the HELF script) instead of using braille or a similar existing language for people who are blind, because digital braille is difficult to recreate, both in development and in use, without extra hardware. Braille typing is also prone to errors when used on the flat touch screens of smartphones. The HELF script, on the other hand, can be more accessible, as it conveniently uses the haptic hardware already present in modern smartphones without requiring any external hardware or device apart from the user's smartphone.

We evaluated the performance of HELF using the HELF-based application by assessing its usability and ease of use with both visually impaired and sighted volunteers.

The remainder of the paper is organized as follows. Section 2 highlights the related work. In Sect. 3, we explain the working of HELF. Section 4 further elaborates on the working of HELF through an Android application and explains the setup for testing the application. The results are presented in Sect. 5. Section 6 discusses the threats to the validity of the presented results. The last section concludes the paper and outlines directions for future extension of the presented work.

2 Related work

Haptic devices are highly relevant for people who are visually impaired. Most of the assistive technologies designed for them use touch as a sensory substitution for vision. The book [14] emphasizes the touch skills of people who are visually impaired and advocates the use of haptic devices for perception.

To help deaf-blind individuals, a variety of tactual communication methods have been employed as substitutes for vision and/or hearing. These methods include the Tadoma method of speech reading [15], sign language [16], and the tactual reception of fingerspelling [17].

Geldard [2] developed a method of communication through touch named 'vibratese.' It comprises tactile equivalents of the 45 basic elements of the English language. However, vibratese is no longer in use, and little literature on it is available.

Fritz et al. [18] demonstrated a haptic visualization system. They rendered mathematical data in a form that can be understood by people who are visually impaired using a haptic environment for data visualization.

Sjöström et al. [19] gave an insight into the efforts of using haptics for providing new computer interaction techniques to design assistive technologies for the people with visual disabilities. They have elaborated on the PHANToM and other devices.

Zajicek and Powel [20] describe web chat as a text-to-speech service for reading out the contents of a webpage to enable people who are visually impaired to get access to the web content.

Avizzano et al. [21, 22] modified the GRAB system (a European project) by adding a haptic device and audio commands, enabling visually challenged users to explore the 3D world through touch with the help of audio commands.

Pascale et al. [23] proposed the use of haptic devices with Second Life (a virtual world developed by Linden Lab, California), which can potentially help the visually impaired access the virtual world.

Flores et al. [10] developed a vibrotactile belt that provides haptic directional information to blind walkers.

An idea to enable reading braille on mobile devices with touch screens was presented in [24]. The authors proposed three interaction methods using temporal tactile feedback to enable reading of six-dot braille characters on touchscreen devices without any additional display attached.

Pawluk et al. [25] conducted a survey of behavioral research which can directly impact the design of assistive technology using haptics for the visually impaired.

Chang et al. [26] developed a handheld vibrotactile device that, when used as a sleeve on the back of a mobile phone, converts hand pressure into vibrational intensity to enable real-time interpersonal communication over a tactile channel. The device is bi-directional, that is, both users can send and receive signals simultaneously.

Murphy and Darrah [27] created twenty computer applications (apps) for the visually impaired to enable them to better learn maths and science. These apps incorporated computer haptics, high contrast visuals, and auditory prompts. The analysis of testing six of these applications in a classroom environment shows that significant learning gains can be achieved by using such apps.

Reed et al. [28] developed a wearable tactile device that presents phoneme-based tactile codes at the forearm. A collection of distinct tactile symbols was formed to represent 39 phonemes of the English language. Their results show that ten participants could learn the haptic symbols of these English phonemes after one to four hours of training, with a phoneme recognition accuracy of 86%. The authors have recently extended this work [29] to test the acquisition of about 500 words with 51 participants, and the results show successful transmission of English words using the TActile Phonemic Sleeve (TAPS) with a reasonable amount of effort spent in learning the phonemic codes.

The HaptiComm project uses haptics to support linguistic communication within the community of people with both visual and hearing impairment [30]. It is a hardware and software platform that aims to serve this community by providing a way to use and explore their natural form(s) of language.

3 HELF overview

Since people with visual impairment and people with both visual and hearing impairment use their non-visual senses to interact with the world, HELF leverages this ability to help them interact with the digital world. The proposed and implemented framework works in a manner resembling Morse code, which uses dots and dashes to denote symbols and letters, and is conceptually similar to the vibratese language developed by Geldard [2]. Little literature on vibratese is available; hence, we developed our own script and use it to enable tactile communication with smart devices. The framework uses the HELF script, built from two lengths of vibrations and two types of gaps. The libraries for HELF have been designed for the English language. The HELF framework can also be used with laptops, smartphones, and smart kiosks. The subsequent subsections give details of the HELF script and libraries.

3.1 HELF script

The HELF script encodes the 26 English letters (A–Z) and the numerals (0–9). The script can be extended to include special characters and punctuation. Each HELF code is a dynamic vibration pattern consisting of a sequence of vibrations and gaps, named HELFabets (shown in Table 1). We use two variants of vibrations (short and long) and two types of gaps (letter gap and word gap) to form each HELF code of the script. For ease, we provide a nomenclature for these HELF characters (the variations of vibrations and gaps used), given in Table 1. The codes have been designed so that code length approximately follows the inverse of letter frequency in English. The letter frequencies are taken from the work of Jones and Mewhort [31]. The combined frequency (Com. Freq.) is obtained by summing the uppercase and lowercase frequencies of each letter (listed in Table 2). The most frequently occurring letters have the shortest and simplest HELF codes (the fewest vibrations and the least complex HELFabet combinations). These codes can be memorized by people who are visually impaired and people with both visual and hearing impairment. Once users are skilled, the codes are easy to interpret and can be used effortlessly for reading and writing on smartphones.

Table 1 Nomenclature of HELFabets
Table 2 HELF Codes

Table 2 lists the HELF codes for the alphabets and digits.
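
For illustration, a minimal sketch of how the HELF code table might be represented inside an application is given below (in Kotlin, since HELF is demonstrated on Android). Only the code for 'I' ("DoDa", mentioned in Sect. 3.2.2) is stated in the text; the entries for 'E' and 'T' are assumptions based on the frequency-driven design, and the authoritative mapping is Table 2.

```kotlin
// Minimal sketch of a HELF code table. "Do" = short vibration, "Da" = long
// vibration (nomenclature of Table 1). Only the code for 'I' is confirmed in
// the text; the entries for 'E' and 'T' are assumed single-HELFabet codes for
// the two most frequent letters. The authoritative mapping is Table 2.
enum class Helfabet { DO, DA }

val helfCodes: Map<Char, List<Helfabet>> = mapOf(
    'I' to listOf(Helfabet.DO, Helfabet.DA),  // "DoDa", as given in Sect. 3.2.2
    'E' to listOf(Helfabet.DO),               // assumption: shortest code for the most frequent letter
    'T' to listOf(Helfabet.DA)                // assumption
    // ... remaining letters and digits 0-9 as listed in Table 2
)
```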

3.2 HELF architecture

This subsection elaborates on the architecture of HELF. HELF typing explains how HELF is used to write digital text, and HELF reading explains how digital text is read using HELF. To use HELF on smartphones, we provide an input system for typing English text using the HELF script; for reading, text is converted into HELF-encoded vibration patterns.

3.2.1 HELF typing

To use HELF for typing, a typing system has been designed that recognizes swiping gestures, takes the desired input, and converts it into digital text. The gestures used to designate the HELFabets are shown in Table 3; an illustrative sketch of such gesture handling is given after the table. Users simply need to memorize the HELF script and use swiping gestures to provide input in the form of digital text.

Table 3 Gestures for HELF codes
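
The paper does not describe the implementation of the gesture recognizer; as an illustrative sketch, the swipe and tap gestures of Table 3 and Sect. 4.1 could be classified with Android's GestureDetector roughly as follows. The mapping of left/right swipes to Do/Da is an assumption; only the gestures for end of character (tap), space (swipe up), backspace (swipe down), and read-aloud (double tap) are described in the text.

```kotlin
import android.view.GestureDetector
import android.view.MotionEvent
import kotlin.math.abs

// Illustrative sketch: classify swipes and taps into HELF input events.
// The left/right-swipe-to-HELFabet mapping is an assumption; up = space,
// down = backspace, tap = end of character, double tap = read aloud (Sect. 4.1).
enum class HelfGesture { SWIPE_LEFT, SWIPE_RIGHT, SPACE, BACKSPACE, END_OF_CHARACTER, READ_ALOUD }

class HelfGestureListener(private val onGesture: (HelfGesture) -> Unit) :
    GestureDetector.SimpleOnGestureListener() {

    override fun onSingleTapConfirmed(e: MotionEvent): Boolean {
        onGesture(HelfGesture.END_OF_CHARACTER)
        return true
    }

    override fun onDoubleTap(e: MotionEvent): Boolean {
        onGesture(HelfGesture.READ_ALOUD)
        return true
    }

    override fun onFling(e1: MotionEvent?, e2: MotionEvent, vX: Float, vY: Float): Boolean {
        val start = e1 ?: return false
        val dx = e2.x - start.x
        val dy = e2.y - start.y
        val gesture = if (abs(dx) > abs(dy)) {
            if (dx < 0) HelfGesture.SWIPE_LEFT else HelfGesture.SWIPE_RIGHT
        } else {
            if (dy < 0) HelfGesture.SPACE else HelfGesture.BACKSPACE
        }
        onGesture(gesture)
        return true
    }
}
```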

3.2.2 HELF reading

We use vibrotactile feedback to convey digital text. HELFabets can be recognized by the duration of the vibrations felt and the duration of the gaps between vibrations. A HELF code can be recognized from its combination of HELFabets (Table 2). For instance, the HELF code for the letter "I" is "DoDa," that is, a short vibration followed by a long vibration. Users can thus recognize the text if they memorize the vibration patterns for the HELF codes thoroughly. To make HELF reading user-friendly, the duration of the HELFabets is variable: four options are defined, where option 4 is the slowest and easiest and option 1 is the fastest. Users can start with option 4 and steadily move toward option 1 as they gain proficiency. The values for these options were determined by trial and error to find durations that are recognizable and distinguishable from each other. Subjects were asked to try all the speed options to determine which was the most comfortable for distinguishing between Do and Da; most subjects chose option 2. Table 4 gives the durations of the vibrations and the letter gap in milliseconds for the different speed options available in the HELF Android application for reading.

Table 4 Variable speed options for reading (G = gap between Do and Da)
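
To illustrate how a HELF code could be rendered on the phone's vibration motor (the paper does not give implementation details), a sketch using Android's Vibrator and VibrationEffect APIs is shown below. It builds on the Helfabet enum from the earlier sketch; the durations in the usage example are hypothetical placeholders, since the actual values appear only in Table 4.

```kotlin
import android.os.VibrationEffect
import android.os.Vibrator

// Illustrative sketch: render one HELF code as a vibration waveform.
// Builds on the Helfabet enum from the earlier sketch. The timing values used
// in the example call are placeholders; the real durations are those of Table 4.
data class HelfTiming(val doMs: Long, val daMs: Long, val gapMs: Long)

fun playHelfCode(vibrator: Vibrator, code: List<Helfabet>, t: HelfTiming) {
    // createWaveform expects alternating off/on durations, starting with "off".
    val timings = mutableListOf(0L)
    code.forEachIndexed { index, symbol ->
        timings += if (symbol == Helfabet.DO) t.doMs else t.daMs  // vibration ("on") segment
        if (index < code.lastIndex) timings += t.gapMs            // gap ("off") between HELFabets
    }
    vibrator.vibrate(VibrationEffect.createWaveform(timings.toLongArray(), -1))  // -1 = no repeat
}

// Example (placeholder durations): play the code for 'I' ("DoDa").
// playHelfCode(vibrator, helfCodes.getValue('I'), HelfTiming(doMs = 100, daMs = 400, gapMs = 200))
```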

4 Set up for user testing

We developed an Android application to present HELF as a proof of concept. We tested the application with volunteering participants, and collected and analyzed the results to estimate the performance of HELF.

4.1 Android application

The working of the Android application is shown in Fig. 1.

Fig. 1 Using the HELF Android application

As shown, the screen of the HELF testing app (Fig. 1A) is divided into three major sections:

  • HELF Reading Section: This part of the HELF application screen allows the trainer to enter text manually and play the corresponding vibration patterns. The text is converted to HELF vibration patterns, which the user perceives to understand the text.

  • HELF Typing Section: The middle part of the HELF application screen allows the user to type text using HELF codes by swiping on the screen.

  • HELFabet Speed Control: The bottom part of the screen allows the trainer to select, from the available speed options (shown in Table 4), the speed at which each HELFabet is played.

To type the letter 'E,' the user swipes left and then taps (Fig. 1B). A tap indicates the end of the HELF code for the character being typed. To type the letter 'T,' the user swipes right and then taps (Fig. 1C). To type a space between words, the user swipes up (Fig. 1D). A swipe down is used for backspace (Fig. 1E). A double tap reads out the typed text using text-to-speech (intended for visually impaired users) and readies the screen for the next input (Fig. 1F).
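
Putting the typing pieces together, the sketch below shows one way the gestures described above could be accumulated into characters: swipes append (assumed) HELFabets to a buffer, a tap closes the character via an inverse lookup in the code table, swipe up/down handle space and backspace, and a double tap hands the text to text-to-speech. It reuses the types from the earlier sketches and is an illustrative reconstruction, not the authors' implementation.

```kotlin
// Illustrative sketch: accumulate HELFabets from swipes and commit a character
// on a tap, reusing Helfabet, helfCodes, and HelfGesture from the earlier
// sketches. The swipe-to-HELFabet mapping remains an assumption.
class HelfTypingBuffer(codes: Map<Char, List<Helfabet>>) {
    private val decode: Map<List<Helfabet>, Char> = codes.entries.associate { (ch, code) -> code to ch }
    private val buffer = mutableListOf<Helfabet>()
    val text = StringBuilder()

    fun onGesture(gesture: HelfGesture) {
        when (gesture) {
            HelfGesture.SWIPE_LEFT -> buffer += Helfabet.DO       // assumed mapping
            HelfGesture.SWIPE_RIGHT -> buffer += Helfabet.DA      // assumed mapping
            HelfGesture.END_OF_CHARACTER -> {                     // tap closes the current character
                decode[buffer.toList()]?.let { text.append(it) }
                buffer.clear()
            }
            HelfGesture.SPACE -> text.append(' ')
            HelfGesture.BACKSPACE -> if (text.isNotEmpty()) text.deleteCharAt(text.length - 1)
            HelfGesture.READ_ALOUD -> { /* pass text to TextToSpeech and reset for the next input */ }
        }
    }
}
```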

4.2 Testing of the application

All authors tested the application thoroughly before the actual testing by the volunteering participants. Participants included eight sighted people and five visually impaired users. The sighted individuals were from a technical institute and voluntarily participated in testing the HELF application. The five visually impaired users who voluntarily participated in testing the application are members of the Blind Relief Association in New Delhi, India. Participants needed to be 15 years of age or older and to understand the basics of the English language.

Before the testing phase, participants were required to undergo basic training, wherein they learned the HELFabets and HELF codes. The pre-training was conducted in the presence of a researcher, along with a team of three researchers who helped with training and data collection. The visually impaired users were pre-trained and then underwent the testing phase in a room at the Blind Relief Association, New Delhi. The sighted users were graduate students of a technical institute; hence, they underwent the training and testing phases at their institute. During the pre-training phase, the HELF codes for each character were recited to the visually impaired subjects and, using the HELF mobile application, the subjects typed the HELFabets on their phones; they were then asked to practice and memorize the HELF codes. The team made sure that the subjects were sufficiently trained for the actual testing by giving them a few easy word phrases to type. This process was conducted over two days so that the participants understood the overall working of the application. After two days of pre-training, four out of the five visually impaired users themselves volunteered to move to the testing phase. The fifth visually impaired user was then persuaded that they, too, were ready to move to the testing phase.

In Level 1 training, participants were required to learn the codes for E to L (shown in Table 2), as they are shorter and easier to learn, and to understand and memorize the gestures shown in Table 3. Level 2 training required learning all the codes shown in Table 2. Level 3 required learning the codes for the digits 0–9.

For testing the application after the training phase, three testing modules were designed to evaluate user performance. The writing module was administered first, in which subjects were asked to type the phrases from each module. The time taken by each subject to type each phrase correctly was noted, from which the characters per minute (CPM) were later calculated. Subjects could tell whether a typed character was correct by listening to the character announced by the text-to-speech module of the test HELF Android application. The reading module was administered after a short break following the writing test. Only fully correct words were counted toward the CPM score; if even one letter in a word was wrong, we waited for the user to correct it, and the correction time was included in the CPM. Appendix I shows the three modules used to assess the usage of the proposed framework with the developed application. The testing data for the reading and writing modules were collected for content analysis. The reading part of the application would be best tested by people who have both visual and hearing impairment. This, however, required training such users, which in turn required skilled professionals. Also, due to the Covid-19 pandemic, we were unable to test it on people with both visual and hearing impairment.
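
The paper does not state the CPM formula explicitly; under our reading of the protocol (only fully correct words count, and correction time stays in the elapsed time), it could be computed as in the following sketch.

```kotlin
// Sketch of the CPM computation under our reading of the protocol: characters
// of fully correct words divided by the total elapsed minutes, where the
// elapsed time includes any time spent correcting mistakes. The exact formula
// is an assumption, not stated in the paper.
fun charactersPerMinute(correctWords: List<String>, elapsedMillis: Long): Double {
    val characters = correctWords.sumOf { it.length }
    val minutes = elapsedMillis / 60_000.0
    return characters / minutes
}
```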

5 Results and analysis

The participants' performance during the testing of the HELF application was thoroughly recorded and analyzed. The performance analysis covered both writing and reading of the HELF codes. The test sheets used to evaluate the usage of HELF via the Android application are shown in Appendix 1.

Tables 5 and 6 show the recorded results of using the HELF application by the volunteers (Table 5 for sighted volunteers and Table 6 for visually impaired volunteers). Figure 2 presents the analysis of the recorded results graphically. The bar graphs present the performance of the participants. The horizontal axis represents the modules, i.e., the characters required to be read and written using HELF. The vertical axis represents the percentage reading accuracy and the average characters per minute (CPM) for writing for each module in the testing phase. Figures 2 and 3 show the graphs for the eight sighted volunteers, and Figs. 4 and 5 show the graphs for the five visually impaired users.

Table 5 Result sheet of sighted volunteers (CPM shows the characters typed in a minute for writing, and the score of each individual depicts the reading score)
Table 6 Result sheet of visually impaired volunteers (CPM shows the writing score, and the score of each individual depicts the reading score)
Fig. 2 Results depicting average reading accuracy of sighted volunteers

Fig. 3 Results depicting average writing accuracy of sighted volunteers

Fig. 4 Results depicting average reading accuracy of visually impaired volunteers

Fig. 5 Results depicting writing accuracy of visually impaired volunteers

As is evident from Fig. 2, the average reading accuracy of sighted users decreases slightly across the modules. A plausible reason is that Module 1 consists of characters with the easiest and shortest HELF codes, whereas Module 2 consists of characters with more complex HELF codes. Similarly, Module 3 consists of digits along with the least used letters of the English language, which have the most complex HELF codes compared to the other two modules. Hence, there is a slight, near-linear drop in the reading accuracy graph. This is due to the increasing complexity of the HELF codes with each module, which take time to learn and memorize, especially for sighted people who are not accustomed to using such a system.

The writing performance (CPM) of sighted users increases from Module 1 to Module 2 and then falls slightly (as depicted in Fig. 3). With Module 1, users gradually become accustomed to the HELF gesture typing system, and the practice from Module 1 helps them type faster in Module 2. In Module 3, the HELF codes are longer and require more swipes; hence, the graph in Fig. 3 shows a slight drop in CPM for Module 3.

As can be seen from Fig. 4, the average reading accuracy of the volunteers with visual impairment differs slightly from that of the sighted participants. People with visual impairment usually develop sharper non-visual senses with prolonged use, which helps them understand HELF codes and differentiate easily between short and long haptic pulses; hence, their reading accuracy is better than that of the sighted participants. A closer look at the graph shows a slight drop in accuracy for Module 2: Module 1 has shorter HELF codes and smaller words, whereas in Module 2 the word length and HELF code length increase considerably. As depicted in Fig. 5, the writing performance of the visually impaired users initially increases from Module 1 to Module 2, as the users gradually get used to the HELF gesture typing system and the practice from Module 1 helps them type faster; in Module 3, the HELF codes are longer and require more swipes, and hence the graph shows a slight drop in CPM for Module 3.

Overall, the reading accuracy is approximately 91%, and the average CPM is 25.7. These results are comparable to, or even higher than, those of the braille-based frameworks found in the literature [32, 33]. If we consider only the average CPM achieved by the visually impaired users (which is considerably higher than that of the sighted individuals), HELF shows a higher typing speed (CPM ~ 34) than Flight [32], a braille-based system (CPM ~ 33), and a much higher speed than other similar works (CPM ~ 12) [33]. The empirical analysis of HELF shows encouraging results, as the reading accuracy is quite high and the writing CPM is also satisfactory.

6 Threats to validity

The application was tested in a controlled environment with the participants. This section presents the threats to validity of the presented results.

6.1 Threats to internal validity

The following factors related to the experimental conditions may have affected the results of the research:

  1. Participants' selection: The participants of the study volunteered to test the application and were limited in number. The application could be better tested with random selection of participants from various organizations and with a larger number of participants.

  2. Testing and training: The training phase was constrained by a time limit, and the testing was performed using three modules. Extended training and further tests would strengthen the validity of the presented results. Testing the application on participants with both hearing and visual impairment could further demonstrate its effectiveness and usability.

  3. Different testing hardware: Participants were provided with different smartphones from different brands. Not all phones have the same haptic engine, which can lead to a slight lag in vibrations or a noticeable difference in vibration intensity.

6.2 Threats to external validity

The following factors may affect the generalizability of the results.

  1. Participants' age group: The participants of the study were between 20 and 35 years of age and were already using smartphones, so it was easy for them to understand the application during the training phase. People in older age groups may not be as comfortable with smartphones.

  2. Participants' experience with computer-based applications: The participants were already comfortable with assistive-technology-based smart applications. Not all visually impaired users may be well-versed with such applications.

7 Conclusion and future work

In this paper, we presented HELF as an application for people with visual impairment and people with both visual and hearing impairment, to enable them to connect with the digital world. The use of this framework can impact the lives of almost 30 million individuals around the world who are currently deprived of the expanse of resources that the World Wide Web can provide through smartphones and other such devices. The use of HELF can empower the visually impaired to stay socially connected to their loved ones without impairing their capacity to listen and remain vigilant in public. The application can also be of great support for people with both visual and hearing impairment (referred to as deaf-blind people).

The preliminary analysis of HELF as an Android application gave encouraging results. The users (both sighted and visually impaired volunteers) found the presented framework usable as a mode of communication.

Future work includes the following:

  • Conducting a longitudinal study of the proposed work, involving more visually impaired users as well as users who have both visual and hearing impairment, so as to consolidate evidence of the capacity and effectiveness of HELF for such users.

  • Developing hardware modules for deploying HELF on devices that do not have haptic engines and/or touchscreen interfaces.

  • Developing a library or an application programming interface (API) module of text to HELF code to enable developers to use HELF with any mobile application.

  • Developing a third-party keyboard for Android and iOS users with HELF input gestures.

  • Developing a smart watch application for Android Wear OS and Apple Watch OS to allow typing using gestures from a smart watch.