1 Introduction

According to the World Health Organization (WHO), 285 million people in the world have some form of visual impairment, ranging from low vision to complete blindness [1]. Between 2009 and 2014, mobile device usage among visually impaired people increased from 12 to 82% [2]. Touchscreens provide the primary means of interaction with most mobile electronic products, including smartphones, tablets and smartwatches. Huang [3] explored the accessibility features of touchscreen interfaces in a study with 32 visually impaired participants, summarized design guidelines for accessible touchscreen interfaces for visually impaired people and suggested that the operational style of these interfaces should be redesigned. Touchscreens allow direct manipulation and are an intuitive and effective input device [4,5,6], thus providing a usable and convenient interface for most users. Modern smartphones provide various accessibility functions but still pose significant problems for visually impaired people. Rodriguez-Sanchez [7] notes that, despite considerable research on accessible technology and assessment, visually impaired people find smartphones challenging to use because the features currently available do not meet users’ expectations and requirements. Since visually impaired people are increasingly using smartphones and similar devices [8,9,10], usable and responsive interfaces must be developed to fulfill their needs. Peischl et al. [11] advocate integrating user-centered design into the early stages of mobile application development.

Many studies have been conducted on accessibility. Oliveira et al. [12] evaluated four types of keyboard for blind users (NavTouch, MultiTap, QWERTY and Braille-Type) to observe the difficulties they encountered. Guerreiro et al. [13] investigated NavTouch and NavTap for touchscreens and found them to be significantly faster than a button-based system. Yfantidis and Evreinov [14] developed interactive buttons for touchscreens, allowing users to input data with a single-finger swipe in eight basic directions from any position. Several smartphone-based applications are also very useful for visually impaired students. Kuribayashi et al. [15] developed a smartphone-based system that enables blind people to identify and avoid obstacles and thus supports navigation; it uses an RGB camera and a built-in infrared depth sensor to identify surrounding pedestrians and their positions, and provides audio and vibration feedback to notify the user. EyeMath [16] is a cloud-based smartphone application that captures images and describes their content. It processes each image by dividing it into smaller segments, then separates mathematical symbols from plain text and presents them as an abstract syntax tree (AST).

There are also several web-based and other solutions that are helpful for visually impaired students. MyA + Math [17] is a web-based prototype that helps visually impaired students learn mathematics; it provides an interactive learning environment through speech control and an algorithm for solving mathematical problems. Dominik et al. [18] proposed a method for evaluating computer-aided mathematics teaching and learning for people with visual impairments, based on behavioral, emotional, cognitive, social, distractor, motivational and modeling factors. They compared the proposed technique with traditional, classroom-based teaching of the same material to visually impaired people and observed an improvement. Maćkowski et al. [19] developed a multimedia method that significantly enhances mathematical education for blind students. They assessed cognitive aspects based on knowledge implementation, self-correction ability and understanding of the correct way to solve a problem, and compared the classic method with their proposed alternative multimedia method. Their method significantly improved the results in terms of knowledge operationalization.

The current project set out to explore various issues concerning the design of touchscreen-based mobile devices that might affect people with a visual impairment, such as the arrangement of icons, color combinations, auditory feedback and vibration feedback. Based on this work, a mobile e-learning application was developed that helps visually impaired children learn mathematical skills easily. The application consists of two modules, “Learn to Draw Digits” and “Perform Basic Mathematical Operations,” and provides feedback in both modules.

The remainder of this paper is organized as follows. Section 2 reviews related work; Sect. 3 provides an overview of the experimental procedure; Sect. 4 describes the participants and methodology, including the procedure, the development of design guidelines and the second user study employing the revised application; Sect. 5 presents the results of the user studies; Sect. 6 discusses the findings and draws conclusions; Sect. 7 suggests possible future work.

2 Related work

Previous studies have shown that only a limited number of tools in the field of mathematics are accessible to visually impaired people. For example, haptic technology has been shown to play a vital role in improving the learning ability of visually impaired students in mathematics and science [20, 21]. The Lambda System [22, 23] provides a specialized form of mathematical representation using a Braille display, but it relies on an 8-dot Braille notation and the necessary hardware is very costly. MathMelodies [24] is an iPad application designed to help primary school children learn mathematics; it uses sophisticated graphics and interaction techniques that are accessible to visually impaired children. An intelligent tutoring platform called “The Project for Math” [25] offers an interactive system for supporting mathematical education. It is based on a collection of exercises for students who want to study particular mathematical skills independently (e.g., mathematical formulas and the structural information of mathematical expressions and equations) and can be used to enhance their knowledge. Microsoft has developed a plug-in for Internet Explorer called MathPlayer, which presents MathML [26] visually. It includes several features that make various mathematical expressions easily accessible to visually impaired people, such as fractions, radicals, subscript–superscript pairs, tables and matrices. MathPlayer can be used with screen readers such as Window-Eyes and JAWS, and it also integrates with various TextHELP! products aimed at students with learning disabilities [27]. Another approach involves decomposing mathematical exercises into a sequence of non-decomposable sub-exercises [28]; this makes it possible to solve mathematics exercises and assess the correctness of solutions interactively at each step. There is also an Android application, using vibrotactile feedback, designed to teach geometry, traditionally a very difficult subject to teach to visually impaired children [29].

Researchers have also been working on making e-learning websites accessible to visually impaired users. For example, AudioMath works with material that is already in MathML and renders MathML-coded mathematical expressions as text-to-speech (TTS). This research has the potential to improve the accessibility of e-learning websites and to bring substantial benefits to visually impaired people [30]. Other researchers [31] have developed techniques to make web pages containing scientific documentation more accessible using a mixture of Braille-based devices and speech synthesis. In [32], the authors proposed a web-based application that uses speech dialog to write mathematical expressions, so that visually impaired persons can easily access mathematical materials using voice controls.

3 Overview of experimental procedure

The work is divided into two parts. In the first part, we set out to identify the most appropriate design guidelines for an application for visually impaired people, concerning matters such as the arrangement of icons, color combinations, auditory feedback and vibration feedback. In the second part, we employed these guidelines in the development of a mobile e-learning application designed to help visually impaired children acquire mathematical skills easily. Figure 1 provides a complete overview of our proposed system.

Fig. 1 Complete overview of the e-learning platform

3.1 Initial application design

The application used in the initial user study was developed using standard Android development guidelines, along with some general guidelines dealing with issues such as the contrast between foreground and background colors. Beyond this, however, no attempt was made to make the application accessible to partially sighted users. The application consists of two main modules: “Learn to Draw Digits” and “Perform Basic Mathematical Operations.” The first module is designed to help visually impaired students learn the shapes of the digits and how to draw them. The user is presented with a partially drawn digit in black and white (black background and white text). Provided the student has sufficient vision to differentiate light and dark, they will be able to perceive the shape and draw the digit by moving a finger around the touchscreen. If the user deviates from the line that forms the digit, vibration feedback alerts them to the error. The second module allows students who have learned how to draw digits to perform basic mathematical operations, such as addition, subtraction, multiplication and division. The initial design of the app is shown in Fig. 2; a sketch of the tracing feedback follows the figure.

Fig. 2 Initial design
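The vibration-on-deviation behaviour described above can be sketched in Kotlin roughly as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the class name, the sampled digit outline and the deviation threshold are all hypothetical.

```kotlin
import android.content.Context
import android.graphics.PointF
import android.os.Vibrator
import android.view.MotionEvent
import android.view.View
import kotlin.math.hypot

// Hypothetical tracing view: the digit is represented as points sampled along
// its stroke, and a short vibration is triggered when the finger drifts away.
class DigitTracingView(
    context: Context,
    private val digitOutline: List<PointF>   // sampled points along the digit stroke
) : View(context) {

    private val tolerancePx = 48f            // assumed deviation threshold, in pixels
    private val vibrator =
        context.getSystemService(Context.VIBRATOR_SERVICE) as Vibrator

    override fun onTouchEvent(event: MotionEvent): Boolean {
        if (event.action == MotionEvent.ACTION_MOVE) {
            val deviation = distanceToOutline(event.x, event.y)
            if (deviation > tolerancePx) {
                // Finger has left the digit stroke: a short vibration alerts the user
                // (simple pre-API-26 call; newer devices would use VibrationEffect).
                vibrator.vibrate(50)
            }
        }
        return true
    }

    // Distance from the touch point to the nearest sampled point on the outline.
    private fun distanceToOutline(x: Float, y: Float): Float =
        digitOutline.minOf { hypot(x - it.x, y - it.y) }
}
```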

4 Participants and methodology

The experiment was conducted in a school for blind and visually impaired children. A total of 10 visually impaired pupils, from 6th grade to 9th grade, took part in the study. While the number of participants is quite small, this is usually unavoidable in research focusing on visually impaired users [33]. The age range of the participants was 11 to 16 years. Three participants were not familiar with smartphones, while the remainder were. We used the WHO categories of distance vision impairment: mild (visual acuity worse than 6/12 to 6/18), moderate (visual acuity worse than 6/18 to 6/60) and severe (visual acuity worse than 6/60 to 3/60) [34]. One participant (P1) could perceive only bright light and fell into the severe category, five (P2-P6) had a mild impairment and four (P7-P10) had a moderate impairment. The study protocol was assessed and ethically approved by the institute in which the experiments were conducted, and each participant completed an informed consent form before the study began. Table 1 provides detailed information about the participants.

Table 1 Demographic information of participants

4.1 Procedure

Before agreeing to take part, potential participants were given a detailed overview of the study, an explanation of its objectives and an account of how it might help them learn the basics of mathematics. After agreeing to take part, participants were given a brief introduction to the tasks and asked to attend all the training sessions. Two separate sessions, each lasting 25 min, were conducted to help participants become familiar with the application and understand how to use it, i.e., how to learn to draw digits and perform basic mathematical operations. During the training sessions, participants were also given information about interacting with smartphone applications. At first they were not very comfortable interacting with the application, especially those who were less familiar with smartphones, but all participants became comfortable once they understood the interaction mechanism.

The study was conducted in a classroom of the school. A task was designed and deployed on Samsung S4 smartphones. The task required each participant to perform a basic mathematical operation: “Draw the first digit,” “Select a mathematical operation,” “Draw the second digit,” “Double-tap on the ‘=’ button,” and “speak out the result.” Auditory feedback was provided for each activity performed by the participant. Speech feedback alerted users if they made errors or ran out of time; for example, if the time exceeded two minutes, the system responded with the message “Time is Over.” Table 2 contains all the basic instructions for completing the task, and a sketch of the timing and speech feedback follows the table.

Table 2 Instructions pool
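The two-minute limit and spoken feedback described above could be implemented on Android along the following lines. This Kotlin sketch is illustrative only; the class name, the callback and any message other than “Time is Over” are assumptions rather than the study's actual code.

```kotlin
import android.content.Context
import android.os.CountDownTimer
import android.speech.tts.TextToSpeech
import java.util.Locale

// Hypothetical helper that speaks step-by-step feedback and enforces the
// two-minute limit per task.
class TaskFeedback(context: Context) {

    private lateinit var tts: TextToSpeech

    init {
        tts = TextToSpeech(context) { status ->
            if (status == TextToSpeech.SUCCESS) tts.setLanguage(Locale.US)
        }
    }

    // Announce each completed step, e.g. "First digit drawn" (illustrative message).
    fun announce(message: String) {
        tts.speak(message, TextToSpeech.QUEUE_ADD, null, null)
    }

    // Two-minute limit per task; the app speaks "Time is Over" when it expires.
    fun startTaskTimer(onExpired: () -> Unit) = object : CountDownTimer(120_000, 1_000) {
        override fun onTick(millisUntilFinished: Long) { /* no per-second feedback */ }
        override fun onFinish() {
            announce("Time is Over")
            onExpired()
        }
    }.start()
}
```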

4.2 First user study, employing the initial design

The first user study was divided into two parts. In the first part of the experiment, participants used the app to perform arithmetic operations (addition, subtraction, multiplication and division) and related tasks, i.e., drawing digits on the smartphone screen using the shape-drawing application. Participants were given a fixed amount of time (two minutes) for each attempt, and each participant repeated the task until they had completed it 10 times within two minutes and without error. Only attempts completed within two minutes and without error were included in the calculation of the mean task completion time. Participants who were familiar with mobile devices found the task quite easy to perform, taking less time and making fewer mistakes than participants who were unfamiliar with smartphones, who reported more difficulty.
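As a minimal sketch of the inclusion rule used for the mean, assuming a hypothetical Attempt record (the type and field names are illustrative, not taken from the study's data):

```kotlin
// One logged attempt at the task.
data class Attempt(val durationSeconds: Double, val hadError: Boolean)

// Mean task completion time over valid attempts only: completed without error
// and within the two-minute limit.
fun meanCompletionTime(attempts: List<Attempt>): Double =
    attempts.filter { !it.hadError && it.durationSeconds <= 120.0 }
        .map { it.durationSeconds }
        .average()
```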

In the second part of the experiment, participants were observed while performing the task using different designs. They were asked to perform the tasks using different color combinations and different font styles and sizes. The intrinsic brightness of the colors was taken into account when choosing the combinations. Table 3 provides detailed information on the intrinsic brightness of each color combination. The color combinations used in the design were as follows (background and foreground, respectively; a sketch of one way to quantify their contrast follows Table 3):

  • Yellow and Blue

  • Black and White

  • Red and Green

  • Black and Blue

Table 3 Intrinsic brightness of the background and foreground colors
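One standard way to quantify the brightness relationship of such background/foreground pairs is the WCAG relative-luminance and contrast-ratio formulas; the Kotlin sketch below is offered only as an illustration, since Table 3 may report intrinsic brightness using a different measure.

```kotlin
import kotlin.math.pow

// Linearize one 8-bit sRGB channel (WCAG 2.x definition).
fun channel(c: Int): Double {
    val s = c / 255.0
    return if (s <= 0.03928) s / 12.92 else ((s + 0.055) / 1.055).pow(2.4)
}

// Relative luminance of an sRGB color.
fun relativeLuminance(r: Int, g: Int, b: Int): Double =
    0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

// Contrast ratio between foreground and background, in the range 1..21.
fun contrastRatio(fg: Triple<Int, Int, Int>, bg: Triple<Int, Int, Int>): Double {
    val l1 = relativeLuminance(fg.first, fg.second, fg.third)
    val l2 = relativeLuminance(bg.first, bg.second, bg.third)
    val (lighter, darker) = if (l1 >= l2) l1 to l2 else l2 to l1
    return (lighter + 0.05) / (darker + 0.05)
}

fun main() {
    val white = Triple(255, 255, 255)
    val black = Triple(0, 0, 0)
    println(contrastRatio(white, black))   // ~21.0
}
```

For example, white text on a black background yields the maximum ratio of about 21:1, which is consistent with the preference reported below.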

Participants used various applications (such as Be My Eyes and TapTapSee) as well as the initial version of the application proposed in this paper. Most of them were comfortable with the combination of black (background) and white (foreground) along with a large, bold font, which was easily visible to visually impaired participants regardless of whether they were familiar with smartphones. Participants reported difficulties in seeing text and buttons on the screen with other color combinations and smaller text sizes. Participants were observed during the performance of the task, and their task completion times were recorded. They were also asked to fill in a questionnaire after completing the task (see Sect. 5, Results).

4.3 Development of design guidelines

To improve the basic application, the relevant research literature was reviewed and a series of guidelines was derived from the findings of key studies and applications. Table 4 lists the applications considered, their important features, the platform on which they run, their target users, the research evidence and developer information.

Table 4 Features of the applications considered when deriving the design guidelines

In addition to the guidelines derived from earlier research, observations of the participants performing the tasks were used in the development of the guidelines. These observations can be categorized as follows:

  1. Issues affecting task performance on the initial version of the app and similar applications;
  2. Issues regarding adequacy/appropriateness of feedback;
  3. Issues concerning icons, buttons and text sizes;
  4. Issues concerning the perception of color combinations for design purposes.

In light of the literature review and the study findings, the following design guidelines were adopted (a code sketch applying some of them follows the list):

  1. Use the best possible color contrast between background and foreground (white and black);
  2. Ensure sufficient button size and surrounding padding, ranging from 32 × 32 dp to 48 × 48 dp;
  3. Provide immediate and consistent feedback;
  4. Provide a clear navigation mechanism;
  5. Arrange icons around the corners and in the center of the screen;
  6. Do not use color as the only visual means of conveying information.
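As a rough illustration of how guidelines 1, 2 and 6 might translate into Android view configuration, consider the Kotlin sketch below; the helper function and any value not stated in the guidelines are assumptions, not code from this study.

```kotlin
import android.content.Context
import android.graphics.Color
import android.util.TypedValue
import android.widget.Button

// Hypothetical factory for a button that follows guidelines 1, 2 and 6.
fun Context.accessibleButton(label: String): Button {
    val button = Button(this)

    // Guideline 1: maximum contrast between background and foreground.
    button.setBackgroundColor(Color.BLACK)
    button.setTextColor(Color.WHITE)

    // Guideline 2: touch target of at least 48 x 48 dp.
    val minPx = TypedValue.applyDimension(
        TypedValue.COMPLEX_UNIT_DIP, 48f, resources.displayMetrics
    ).toInt()
    button.minWidth = minPx
    button.minHeight = minPx

    // Guideline 6: do not rely on color alone; the visible label (also read
    // aloud by TalkBack via the content description) carries the meaning.
    button.text = label
    button.contentDescription = label

    return button
}
```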

The feedback from the first study indicated that most participants felt comfortable with the combination of black (background) and white (foreground). It was also observed during the experiments that bold text is more visible to visually impaired participants. These findings fed into the enhanced design of the application: the final design was drafted after the experiments, following the guidelines established above. The revised design is shown in Fig. 3.

Fig. 3 Evolutionary design

4.4 Second user study, employing the revised application

The user study was repeated, but this time using the revised application. The same participants took part, and they were asked to perform the same task as in the first part of the initial study, under the same conditions.

5 Results

The mean task completion time was measured for both the initial and the revised versions of the application. The mean task completion times for participants P1-P10 using the initial version were 80, 76, 72, 75, 79, 83, 73, 77, 71, and 70 s, respectively; with the revised design, the corresponding times were 56, 50, 49, 48, 55, 54, 48, 49, 45, and 46 s. These results are shown in Fig. 4.

Fig. 4 Mean task completion time

After both the first and second user studies, participants were asked to fill in a questionnaire that contained four questions designed to assess satisfaction/dissatisfaction with the app. Each question required a response on a 5-point Likert scale (1 = Disagree strongly, 5 = Agree strongly).

The questionnaire data were analyzed using a Wilcoxon signed-rank test, which showed that participants reported higher levels of satisfaction when using the modified version of the app (z = 2.395, p = 0.016, W = 4; the critical value for W at N = 10, p < 0.05, is 8). The difference is therefore significant at p < 0.05.
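For reference, the W statistic reported above is the smaller of the positive and negative signed-rank sums over the paired before/after scores. The generic Kotlin sketch below shows that computation; it is not the analysis script used in the study, and the questionnaire data are not reproduced here.

```kotlin
import kotlin.math.abs

// Wilcoxon signed-rank statistic W for paired samples: non-zero differences are
// ranked by absolute value (average ranks for ties), and W is the smaller of the
// positive and negative rank sums. Reject H0 if W <= critical value (8 for N = 10).
fun wilcoxonW(before: List<Double>, after: List<Double>): Double {
    val diffs = before.zip(after) { b, a -> a - b }.filter { it != 0.0 }
    val sorted = diffs.map { abs(it) }.withIndex().sortedBy { it.value }

    // Assign average ranks to tied absolute differences.
    val ranks = DoubleArray(diffs.size)
    var i = 0
    while (i < sorted.size) {
        var j = i
        while (j + 1 < sorted.size && sorted[j + 1].value == sorted[i].value) j++
        val avgRank = (i + 1 + j + 1) / 2.0
        for (k in i..j) ranks[sorted[k].index] = avgRank
        i = j + 1
    }

    val wPlus = diffs.indices.filter { diffs[it] > 0 }.sumOf { ranks[it] }
    val wMinus = diffs.indices.filter { diffs[it] < 0 }.sumOf { ranks[it] }
    return minOf(wPlus, wMinus)
}
```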

After each user study, participants were also asked to complete a questionnaire that contained five questions designed to assess how meaningful they found the feedback provided by the app. A one-tailed paired t test was used to compare the results obtained with the original version of the app with those obtained with the modified version. The results show that the modified app (developed using the design guidelines) provided more meaningful information to participants (calculated t = 4.8503, critical t = 1.671, df = 9, p < 0.05).

After completing the second user study, participants were shown five word pairs and asked to indicate their feelings about the modified app by selecting one word from each pair. The responses were: Satisfied (8)–Unsatisfied (2), Like (8)–Dislike (2), Fun (7)–Hatred (3), Joyful (8)–Painful (2) and Comfortable (8)–Uncomfortable (2).

Finally, the Mobile Application Rating Scale (MARS) was used to rate the quality of the modified application. The MARS has 23 items, grouped under five headings: engagement, functionality, aesthetics, information and subjective quality. Each item uses a 5-point scale (1 = Inadequate, 2 = Poor, 3 = Acceptable, 4 = Good, 5 = Excellent). Table 5 shows the MARS subcategories and the mean scores obtained for each.

Table 5 MARS items and mean scores for the application

Table 6 provides the independent ratings of the overall MARS total score of the application. Inter-rater reliability was assessed with a two-way mixed-effects ICC (ICC = 0.53, 95% CI 0.007–0.134). The internal consistency of the MARS total score was Cronbach’s alpha = 0.37, and the total score was correlated with the MARS star rating item (#23), r(1) = 0.53, p < 0.001.

Table 6 Results of the ICC calculation using an absolute-agreement, two-way mixed-effects model

Figure 5 shows the mean score for each subscale of the Mobile Application Rating Scale: Engagement 3.2, Functionality 4.2, Aesthetics 2.8, Information 3.4, Subjective quality 3.3 and Perceived impact 3.9. Perceived impact is a further section of the MARS questionnaire that evaluates the impact of the app on users’ awareness, knowledge, attitude, intention to change, help-seeking and behavior change.

Fig. 5 Mean values of MARS subscales

6 Conclusion and discussion

In the absence of vision, hearing becomes the primary channel of communication for people with visual impairments. Keeping this principle in mind, we set out to develop an accessible system for visually impaired people. The proposed application helps visually impaired children learn basic math skills using an auditory feedback mechanism, and is specifically designed for moderately visually impaired children from grade 6 to grade 9. Traditionally, visually impaired students have relied on Braille for learning, but this presents its own problems: only a small percentage of visually impaired people can read Braille [33]. Our application enables visually impaired students to learn to draw digits and to perform basic mathematical operations with the help of auditory feedback.

After conducting the experiments, we conclude that using smartphones for learning has advantages over other approaches because they are inexpensive and widely available. Auditory and tactile (vibration) feedback proved to be an effective medium for conveying meaningful information. During the experiments, most participants were comfortable with the combination of black (background) and white (foreground), and bold text was more accessible for moderately visually impaired participants.

The mean task completion time was lower when participants used the enhanced application than when they used the initial version. The results also show that the application provides meaningful information to participants, improving their awareness of the actions taken while using it. Participants also expressed higher levels of satisfaction when using the enhanced version. Furthermore, the MARS results indicate that the proposed application is quite reliable for visually impaired children and received a satisfactory rating from participants; the MARS itself can also serve as a checklist for the design and development of new high-quality apps. Finally, the MARS results show that the participants were comfortable with the feedback provided by the application.

7 Future work

The proposed application currently supports only basic mathematical operations and is designed for 6th to 9th-grade students with visual impairments. It could be enhanced by adding voice-based input to the calculator module for better accessibility. The intention is also to extend the application with further mathematical operations, such as percentages, decimals, square roots, pi and powers (e.g., squares and cubes), and to make it accessible to students in higher grades.

Incorrect attempts by users were not considered in this study. In future studies, it might be of value to investigate the nature and frequency of errors.

The current application focuses on mathematical skills only and does not cover other educational areas. In the future, the application could be extended with new and advanced features related to English reading for children. These features will be based on Artificial Intelligence, Machine Learning and Natural Language Processing. This will make the application smarter by allowing it to analyze the speech of a child/user reading text from a book, and provide feedback and assistance if the user reads incorrectly. These features could be used by anyone, whether visually impaired or not.