
1 Introduction

According to the most recent Brazilian census, 23.9 % of the population have at least one type of physical or sensory disability [1], and 18.6 % of the population are visually impaired. Visual disability is thus the most prevalent disability in the population, and these individuals face a number of barriers and challenges in their everyday lives. Access to information is one of them, particularly for those with severe impairment or who are blind.

Societies worldwide have become dependent on the information and communication provided by digital technologies. For people with disabilities, the internet represents an opportunity to access information that would otherwise be unavailable. However, countless websites are difficult to access and use, excluding most visually impaired people [2–4].

Similarly, in Brazil the internet is an acknowledged source of general information and services, including those provided by the government. Since 2004 it has been compulsory for Brazilian government websites to comply with the accessibility guidelines established in the e-MAG, the Brazilian web accessibility standard for e-government [5, 6]. The intention of this resolution is to enable people with disabilities, including blind individuals, to access any government information. However, some studies indicate that these websites have not met this standard [7–10]. More recently, the government has used the web to inform people with disabilities about their rights and to assist them in the purchase of assistive technology. In this case specifically, it is essential to ensure that a diverse range of people is able to access the information available on such websites. Thus, this study evaluates the accessibility and usability of a government website developed to inform disabled people about the assistive technologies available on the market: the national digital catalogue of assistive technology (CNPTA), http://www.assistiva.mct.gov.br.

The relationship between web accessibility and web usability is often understood in terms of two distinct concepts: the former refers to the problems encountered by users with disabilities which prevent them from using a website, and the latter to the issues found by non-disabled users [14]. Following this concept, in 1994 the World Wide Web Consortium started to develop web accessibility guidelines, which became a global reference: the WCAG, the Web Content Accessibility Guidelines [15].

Since it was first created, the WCAG has progressed towards promoting universal access. The guidelines have improved the way accessibility is evaluated by adopting different methods, including usability methods such as user testing [12]. This improvement may be a response to past studies that indicated the need to connect website usability with disabled users [14, 16–19]. As highlighted by Theofanos and Redish (2003):

"observing, listening to, and talking with representatives of the target audience in this case, users of screen readers are critical. To truly meet the needs of all users, it is not enough to have guidelines that are based on technology. It is also necessary to understand the users and how they work with their tools."

In the case of blind users, web accessibility depends on screen readers. This type of assistive technology translates the information provided on a website into audio output. Most screen readers use standardised protocols to identify icons, text, hyperlinks, menus, and other graphical interface elements [15]. Hence, the effectiveness of such assistive technologies, combined with the design of the website, is fundamental to enabling blind people to use the website.
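
As a simplified illustration of how a screen reader decides what to announce for an interface element, the TypeScript sketch below approximates how an accessible name is derived from common markup sources (ARIA attributes, alt text, associated labels, visible text). It is a rough sketch for illustration only; it is not the full WAI-ARIA accessible name computation implemented by NVDA or other screen readers.

```typescript
// Rough sketch: derive a speakable label for a DOM element, approximating
// the order in which accessibility APIs resolve names. Illustration only;
// real screen readers follow the full WAI-ARIA name computation.
function accessibleName(el: Element): string {
  // 1. An explicit ARIA label wins.
  const ariaLabel = el.getAttribute("aria-label");
  if (ariaLabel && ariaLabel.trim()) return ariaLabel.trim();

  // 2. aria-labelledby points at another element whose text is used
  //    (simplified here to a single referenced id).
  const labelledBy = el.getAttribute("aria-labelledby");
  if (labelledBy) {
    const ref = document.getElementById(labelledBy);
    if (ref && ref.textContent && ref.textContent.trim()) return ref.textContent.trim();
  }

  // 3. Images fall back to their alt text.
  if (el instanceof HTMLImageElement && el.alt.trim()) return el.alt.trim();

  // 4. Form controls fall back to an associated <label> element.
  if (el instanceof HTMLInputElement && el.labels && el.labels.length > 0) {
    const text = el.labels[0].textContent;
    if (text && text.trim()) return text.trim();
  }

  // 5. Otherwise, use the element's visible text, if any.
  return el.textContent?.trim() ?? "";
}

// Example: an unlabelled search input yields an empty name, which a screen
// reader can only announce as a generic, unnamed edit field.
const search = document.querySelector('input[type="text"]');
if (search) console.log(accessibleName(search) || "(no accessible name)");
```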

The present study analysed whether the catalogue complies with WCAG 2.0, through automated testing and user testing. Although the legislation refers to the Brazilian e-MAG guidelines, we understood that the e-MAG covers the same criteria established in WCAG 2.0, but is written in Portuguese, which makes the guidelines more accessible to web developers in Brazil [6, 11]. However, WCAG 2.0 is more detailed and up to date, which is important for the purposes of the research project of which this study is a part. Moreover, the WCAG 2.0 guidelines cover different types of web-accessibility evaluation, such as inspection methods, automated testing, screening techniques, subjective assessments and user testing [12, 13].

Considering the WCAG 2.0 conformance requirements [20], the study presented here combines usability testing and automated accessibility testing in order to answer the following question: for the website under investigation, is there any relationship between the score in the automated testing and the results of the user testing with blind users?

2 Methods

Recent studies on digital interface design consider accessibility in use, drawing on evaluation tools such as those used for usability [13, 14]. In this preliminary study we evaluated the national digital catalogue of assistive technology (CNPTA), shown in Fig. 1.

Fig. 1. National digital catalogue of assistive technology, http://www.assistiva.mct.gov.br

2.1 The Website

The research group belongs to the Ergonomics and Interface Laboratory (UNESP-Bauru, Brazil), where several research projects are related to assistive technology. The researchers are familiar with the chosen website; in fact, this familiarity was the reason for choosing it. The research group had found many usability problems every time they accessed the CNPTA in search of assistive technologies. Five of these issues are the following:

  1. lack of information regarding the assistive technologies available in the catalogue;

  2. products without any image, or with images in very low resolution;

  3. the information supplied does not follow a consistent structure: some products are described in detail, others are described briefly, and others have no description at all; in other cases, a login is required to access the information;

  4. some of the products in the CNPTA are out of date and, in some cases, no longer on the market;

  5. a faulty search system:

    • users cannot filter the options by the type of product they are searching for;

    • the suppliers' links offered on the website do not connect to the products found in the catalogue; they link to the supplier's home page, which means that users have to repeat the whole search on the supplier's website. In some cases, the product is no longer available there, so the user is merely redirected to another catalogue.

Many of the problems encountered by the research group were identified visually, which raised the question of whether the website would be accessible to its intended users: people with disabilities.

2.2 User Testing

Blind people are the users who encounter the most difficulties in using the web [14]; thus, they can potentially highlight accessibility and usability problems whenever a website is developed [17]. In the present study, the participants were volunteers receiving professional training at an institute for blind people, Lar Escola Santa Luzia para Cegos, based in the city of Bauru (SP, Brazil). Two blind individuals who are familiar with online resources and are frequent users of the internet took part in the study.

Procedure.

Before running the tests, a consent form was read to each participant, in compliance with the ERG BR 1002 ergonomist code of practice [21]. The form assured the following aspects of the study:

  1. the evaluation was intended to test the website, not their skills in using the internet;

  2. the data collected would be coded to protect their anonymity;

  3. their right to leave the study at any time without needing to explain their reasons.

We asked their permission to record all evaluation sessions with a digital camera for later viewing and analysis. Both participants gave permission to be recorded and signed the consent form.

The tests were conducted at the institute, using the computers the participants normally use, which means they were familiar with the equipment, including the keyboard and the screen reader (NVDA).

Two questionnaires were applied: one at the beginning of the session and another at the end of each task. The former was applied just before the test began and collected personal data such as age and computer experience, including familiarity with the internet (years of use), level of expertise and frequency of use; the latter was applied after each task and enquired about the participant's opinion of the accessibility and usability of the website.

We divided the usability test into four distinct activities to be performed by the participants. All activities were designed to be brief and to ensure that they would not cause the participants any embarrassment.

The Tasks.

We gave the four tasks one at a time:

  1. the first task was to find the website of the national digital catalogue of assistive technology, http://www.assistiva.mct.gov.br;

  2. the second task was to look for supporting products for communication and information on the initial page of the website; among the products found, the participant should locate products for reading and, among those, one of interest;

  3. the third task was to look for packs (game cards) or Braille printers on the website and then, based on the information available, to choose one of interest;

  4. the fourth task was to search, among the products related to visual impairment on page four of the catalogue, for the product most interesting to them.

The ISO 9241 principles were used to evaluate effectiveness, efficiency and satisfaction, measured by the following criteria (a sketch of how these measures could be tabulated follows the list):

  • effectiveness: whether the task objectives were successfully achieved;

  • efficiency: whether the tasks were completed correctly, considering the time taken and the degree of difficulty, measured by the number of errors made; and

  • satisfaction: the answers given in a questionnaire.
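
For clarity, the sketch below shows one way these three measures could be tabulated from the session records. The record structure, the 1-5 satisfaction scale and the example values are illustrative assumptions made for this sketch; they are not data from the study nor definitions taken from ISO 9241.

```typescript
// Illustrative only: a minimal tabulation of the three usability measures.
// The record fields and the satisfaction scale are assumptions made for
// this sketch, not definitions taken from ISO 9241 or data from the study.
interface TaskRecord {
  taskId: number;
  completed: boolean;   // effectiveness: was the task objective achieved?
  timeSeconds: number;  // efficiency: time spent on the task
  errors: number;       // efficiency: number of errors made
  satisfaction: number; // satisfaction: questionnaire rating, e.g. 1 to 5
}

function summarise(records: TaskRecord[]) {
  const n = records.length;
  const mean = (f: (r: TaskRecord) => number) =>
    records.reduce((sum, r) => sum + f(r), 0) / n;
  return {
    effectiveness: records.filter(r => r.completed).length / n, // share of objectives achieved
    meanTimeSeconds: mean(r => r.timeSeconds),
    meanErrors: mean(r => r.errors),
    meanSatisfaction: mean(r => r.satisfaction),
  };
}

// Hypothetical values for one participant's four tasks:
console.log(summarise([
  { taskId: 1, completed: true,  timeSeconds: 180, errors: 0, satisfaction: 5 },
  { taskId: 2, completed: true,  timeSeconds: 420, errors: 2, satisfaction: 4 },
  { taskId: 3, completed: false, timeSeconds: 900, errors: 4, satisfaction: 4 },
  { taskId: 4, completed: true,  timeSeconds: 300, errors: 1, satisfaction: 5 },
]));
```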

2.3 Automated Testing

We chose the WAVE accessibility checker to run the automated testing. The checker was selected based on the resources available online and its ease of use [22]. The type of output provided by the checker also counted in the selection, in particular whether it is oriented to issues encountered by blind people, such as the sequence followed by screen readers [23]. We ran the test on all the web pages accessed during the user testing.
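
To make the nature of such automated checks concrete, the sketch below (TypeScript, run against the browser DOM) detects two issue categories that checkers of this kind commonly report: form controls without an accessible label and links without any readable content. It is a simplified stand-in written for illustration only; it is not WAVE's implementation and does not reproduce WAVE's full rule set.

```typescript
// Simplified stand-in for the kind of markup checks an automated
// accessibility checker performs; not WAVE's implementation.
type FormControl = HTMLInputElement | HTMLSelectElement | HTMLTextAreaElement;

// Form controls that expose no label to assistive technology.
function findUnlabelledControls(root: Document): FormControl[] {
  const controls = Array.from(
    root.querySelectorAll<FormControl>("input, select, textarea"));
  return controls.filter(c =>
    c.type !== "hidden" &&
    !c.getAttribute("aria-label") &&
    !c.getAttribute("aria-labelledby") &&
    !(c.labels && c.labels.length > 0)); // no associated <label> element
}

// Links that a screen reader can only announce as "link" with no name.
function findEmptyLinks(root: Document): HTMLAnchorElement[] {
  const links = Array.from(root.querySelectorAll<HTMLAnchorElement>("a[href]"));
  return links.filter(a => {
    if ((a.textContent ?? "").trim()) return false;  // has visible text
    if (a.getAttribute("aria-label")) return false;  // has an ARIA label
    const imgs = Array.from(a.querySelectorAll("img"));
    return !imgs.some(img => (img.getAttribute("alt") ?? "").trim() !== "");
  });
}

console.log("Unlabelled form controls:", findUnlabelledControls(document).length);
console.log("Empty links:", findEmptyLinks(document).length);
```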

3 Results

The results of the automated testing were compared with the results of the user testing conducted with two blind people.

3.1 User Testing

Both participants are expert users who work with computers: they teach people who are blind or have severe visual impairments to use the computer and the internet. They have over 10 years of experience using the internet, which includes using it to socialise, search, shop and work. Although they had heard of the CNPTA website, they were not familiar with it and had never accessed it.

The first participant (P1) used the Mozilla Firefox web browser, while the second (P2) used Internet Explorer. Both used the open-source screen reader NonVisual Desktop Access (NVDA). Although they navigated the website with different keyboard commands, the outcomes of the user testing were very similar. For example, P1 used a search tool provided by the screen reader that helped him find products by listing them by their initial letters, whereas P2 navigated the website following the sequence established for the screen reader on the page.

According to the participants, they performed the tasks without much effort. The amount of time they spent on each task is shown in Fig. 2. In the third task, when the first participant searched for "packs in Braille", he could not find the packs available on the website. The time spent on this task, shown in Fig. 2, reflects his difficulty in finding the product. Although he was not able to finish the task, he was still satisfied with most of the features of the website, apart from the search system.

Fig. 2. Participants' performance (P1, dark line; P2, light line) accessing the CNPTA. (Color figure online)

Similarly, although the second participant a few times found himself navigating on another website, due to unexpected links in the header of the CNPTA, he was satisfied with all the features of the website (structure, navigability, search system and information).

Many times during the second and third tasks, both participants tried the search form and did not succeed. They typed the word correctly but, due to an error in the search system, the results were unrelated, incomplete or incorrect. However, both mentioned that the amount of time spent was the usual time for navigating and exploring new websites. Moreover, both explained that, although the website presents some issues, they were satisfied with it. The reason given, in their words, is:

“among the websites available, it [the type of problems in the CNPTA] is just normal. We could say [that this is a problem], but it is present in all websites, for example, every time we come back to the page it [the NVDA] reads all the headings text by text. This is something that cannot change, it would break the website structure, it cannot change. So, it [CNPTA] is just an average website”. P1 - 00:22:50.

“[generally we have problems if] the structure or links are not specified, or a function or information is in Flash, or other issues [i.e. as mentioned later, security codes do not have audio] that cause problems during the navigation. These type of problems I haven’t found in this website [CNPTA]”. P2 - 00:42:01.

Additionally, in the third task, the second participant chose a product of interest, but the product could not be accessed due to another error in the website. Some products have no information, no description and no link to suppliers; they are merely listed, or, in some cases, they are in the CNPTA but no longer on the market. However, when asked, the participant described himself as satisfied with the information provided.

For P2, the most important and positive aspect of the website is that he was able to use it without asking for any help. Moreover, it was important to him that all products listed on the website are ISO certified, which he regards as a guarantee of their quality. However, based on his experience teaching people with visual impairment, he pointed out some issues: the website does not offer accessibility features such as those to increase the text size or the contrast level between text and background. He also mentioned that he benefits from keyboard shortcuts, available on some websites, which allow the user to skip to the content, go to the search form, go to the initial page or skip between headings. He cited websites that make good use of such resources (e.g. http://www.fundacaodorina.org.br).

At the end of all the tasks, P1 mentioned that the product descriptions could be better if they provided more details about the products, suppliers and prices.

3.2 Automated Testing

The WAVE accessibility checker was run on every page the participants accessed, in order to understand the relationship between the issues indicated by the automated testing and those found in the user testing. Figure 3 shows an example of one of the pages assessed.

Fig. 3. Results of the WAVE test on the initial page of the CNPTA

Across all the pages WAVE evaluated, the majority of the results are alerts rather than errors. The errors are of two types: one is a missing form label on the main search bar, and twelve are empty links scattered throughout the initial and secondary pages. The latter are not directly related to the problems encountered in the user testing. The former, however, may be one of the reasons for the difficulties the participants experienced whenever they used the search bar, in addition to the fault in the search system itself.

The automated accessibility checker highlights issues present in the website's content; it does not offer a checklist for web developers to verify whether essential features that improve accessibility and usability are available. This may explain the low number of errors encountered.

4 Discussion

This study confirmed the need to conduct usability tests with disabled people to enhance website accessibility, as suggested in past studies and proposed by WCAG 2.0 [14, 16–19]. User testing not only evaluates what is in the website, but also elucidates user needs based on the problems encountered. Based on this preliminary study, we can underline that automated testing alone would not highlight accessibility features that are missing from the website. For example, the possibility of changing text size or contrast level is missing from the CNPTA, but WAVE evaluates the website's content, not what is omitted. Indeed, in this study, user testing indicated more needs and features to improve than the errors highlighted by the automated checker.

Based on the preliminary findings, we cannot affirm whether there is a relationship between the score in the automated testing and the results of the user testing with blind users. On the one hand, the twelve 'empty link' errors indicated by the automated checker did not seem to us to directly affect the use of the website; on the other hand, the single 'missing form label' error on the main search bar affected both participants. In fact, this error related to the search bar was the only issue highlighted in both tests, automated and user. The participants had difficulties whenever they tried to search the website. In the case of the first participant, the problem prevented him from finishing the task. The second participant was also challenged by this feature, although he tended to downplay the problem by saying that the time spent was normal.

In fact, in both user tests the responses given by the participants were highly positive (they were very satisfied with most aspects of the website), whereas the observed performance was rather more laboured (with low efficiency in most of the tasks and, in one case, an ineffective outcome). There may be many reasons for the disparity between the difficulties experienced by the users (recorded in the observation) and the positive responses (given in the questionnaire); here we discuss three of them:

  1. it possibly highlights the inadequate use of ranking questions at this stage. We understood that the ranking questions in the post-task questionnaires did not add much compared with the comments the participants made at the end of the study. The comments elucidated their needs and the problems they face when accessing websites in general. A more qualitative approach will be taken in the next steps of the research.

  2. it indicates that the participants have low expectations regarding the accessibility and usability of websites, based on their everyday experience. When the second participant explained his reasons for being satisfied with the website, he outlined many aspects that cause difficulty elsewhere, such as websites that use Flash®, do not specify their links, have security codes, or have plug-ins incompatible with the screen reader, among other issues. Compared with these websites, the CNPTA is good. They are habituated to finding usability problems when accessing the internet.

  3. it may reflect that they expect to “make mistakes”, which can take time to recover from before finding their way back to what they are seeking. Both participants highlighted that the time spent on the tasks is normal for a first access. Compared with the usability issues we (the researchers) found whenever we had accessed the CNPTA, we can say that, unlike the participants, we attribute most of our unsuccessful attempts to the website's errors (messages we see when accessing it) and to the lack of information on it.

In summary, although we cannot state that there is a direct relationship between the results of the automated testing and those of the user testing, we can affirm that the user testing with the two blind participants brought to light several aspects for improvement that would not be recognised, or even indicated, through automated testing. The next section describes some of them.

5 Recommendations

In this section we outline some recommendations to improve the accessibility and usability of the CNPTA website; a short client-side sketch illustrating some of these items is given after the list:

  • an efficient search system, with a label identifying the main search bar; currently the search does not work if the type of disability is not specified, and it is too difficult to search for a category of products;

  • features that allow adjustment of text size and contrast level;

  • keyboard shortcuts; in the case of government websites intended for people with disabilities, the shortcuts could be standardised, avoiding variations that could affect the usability of these websites;

  • the use of high resolution images;

  • the use of standardised information to be provided by the suppliers;

  • features that allow automatic updating of the catalogue when products are changed by suppliers or become unavailable on the market;

  • adjustment of the product links in the catalogue so that they connect to the same products on the suppliers' websites;

  • a change to the sequence of the main menu, so that visual disability is the first option.
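
To illustrate how some of these items (a labelled search bar, text-size adjustment and keyboard shortcuts) could be addressed on the client side, a minimal TypeScript sketch follows. The element ids ("busca", "conteudo") and the chosen shortcut keys are hypothetical; they were not taken from the CNPTA source code.

```typescript
// Illustrative sketch only; the ids ("busca", "conteudo") and shortcut
// keys are hypothetical, not taken from the CNPTA website.

// 1. Give the main search field an accessible label.
const searchField = document.getElementById("busca") as HTMLInputElement | null;
if (searchField && !searchField.getAttribute("aria-label")) {
  searchField.setAttribute("aria-label", "Buscar produtos de tecnologia assistiva");
}

// 2. Simple text-size adjustment, to be wired to "A+" / "A-" buttons.
let fontScale = 1.0;
function adjustTextSize(step: number): void {
  fontScale = Math.min(2.0, Math.max(0.8, fontScale + step));
  document.documentElement.style.fontSize = `${fontScale * 100}%`;
}

// 3. Keyboard shortcuts: Alt+1 skips to the main content, Alt+2 focuses the search.
document.addEventListener("keydown", (e: KeyboardEvent) => {
  if (!e.altKey) return;
  if (e.key === "1") {
    const main = document.getElementById("conteudo");
    main?.setAttribute("tabindex", "-1"); // make the region programmatically focusable
    main?.focus();
  } else if (e.key === "2") {
    searchField?.focus();
  }
});
```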

6 Conclusions

A careful choice of methods to assess the accessibility and usability of websites has to be made when preparing a study or developing a website, particularly in the case of websites intended for people with disabilities. Based on this preliminary study, we can underline that automated testing alone would not highlight accessibility features that are missing from a website. In this case, user testing is fundamental to understanding usability problems that lead to inaccessible information, in other words, to accessibility problems. Thus, it is difficult to distinguish where accessibility ends and usability begins when discussing website accessibility for disabled people. User observation and open questions that encourage users to voice their needs are resources that highlight the real problems and help web developers translate accessibility guidelines into needs and practical solutions. Consequently, such methods are complementary and should be used together in accessibility testing.