1 Introduction

We consider a system for mobile commerce (M-Commerce). M-Commerce systems make it possible to perform bank-style transactions through mobile phones. The M-Commerce market is growing rapidly in developing countries in Africa. The Swedish branch of the Ericsson telecommunication company is developing an M-Commerce system called Ericsson Wallet Platform (EWP).

Due to globalization and outsourcing, it is becoming increasingly common to have a considerable distance between the developers and the actual users. In this case, the long geographical distance between the site of development (Sweden) and the context of use (Africa) is a problem. The solution architects [14] (the SA-team) collect requirements for the EWP at customer sites. Such information gathering captures contextual information and provides a basis for later evaluation activities [1].

Various studies within the banking and payments sector have shown the importance of the context of use [2, 3, 16]. Due to the geographical distance, the SA-team from the integration department in Sweden replaces the actual users in this usability test. We will therefore refer to the SA-team as proxy users (proxy users are sometimes referred to as surrogate users [15]). Our usability test approach is thus based on the idea that the SA-team’s insight and customer understanding bring the actual users’ needs to the EWP development in Sweden.

When the usability test was completed, a group of actual users came to Sweden to attend a course. In order to get additional input, the usability test was repeated with the group of actual users. It turned out that the results from both usability tests (with the SA-team and actual users, respectively) were very similar.

2 Background

There are a number of definitions of usability [4–6]. Testing with real users is the most fundamental usability evaluation method. Our focus is to investigate whether proxy users can be used to overcome the long distance between the development site (Sweden) and the place where the system is used (Africa). Alternative approaches to proxy users include Personas [7, 11] and Focus groups [4]. Focus groups can also be conducted remotely, using video conferencing and other forms of electronic networks, which is an inexpensive way of conducting usability testing.

In remote usability testing, user research can be conducted with participants in their natural environment using modern technology such as screen sharing or online remote usability services. The problems with remote usability testing include a restricted or no view of the participants’ body language, and technical difficulties due to the distance and network capacity.

3 Usability Test of the EWP

There are a number of stakeholders in EWP. These stakeholders include consumers, agents, merchants, and customer care. EWP has a number of interfaces and the object of our usability testing is the graphical user interface that supports the work of customer care agents, financial controllers and compliance officers.

Our usability test combined a set of tasks with the application of the think aloud protocol [12, 13]. During the tasks we measured completion time and number of errors. Interviews were conducted with the test participants and a questionnaire was designed in order to steer the interviews. We used semi-structured interviews [8]. The usability test consisted of the following steps:

  • test preparation

  • test participant selection

  • task definition – the test cases performed

  • test execution.

The steps are described in detail in the subsections below.

3.1 Test Preparation

The test began with a brief description of the tasks, the techniques used to collect data, and the aim of the usability test, presented to each participant. After the participants had completed the tasks, with completion times measured and recorded, the interviews began.

3.2 Test Participants

According to Nielsen, the optimal number of testers for a usability test is five; adding more users tends to reproduce the same findings while discovering little that is new [17]. The five test participants in the usability test were selected from the SA-team, as one of their job duties is collecting requirements for system adjustments and configuration – requirements that expand the system to support specific tasks identified as potentially beneficial in the EWP solution. To do their job, the SA-team at Ericsson is required to visit customer sites and gather information. The main purpose of the solution architect role is to convert the requirements into an architecture and design that becomes a blueprint of the solution [14].
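The diminishing returns behind the five-user rule can be sketched with the problem-discovery model popularized by Nielsen, found(n) = N(1 − (1 − L)^n), where N is the total number of usability problems and L is the probability that a single tester uncovers a given problem. The values below (N = 100, L = 0.31, the figure often quoted from Nielsen’s data) are illustrative only:

```python
# Problem-discovery model behind the five-user rule:
#   found(n) = N * (1 - (1 - L)^n)
# N = total number of usability problems, L = chance that one tester
# uncovers a given problem. N and L below are illustrative values.

def problems_found(n_users: int, total_problems: float = 100.0, l: float = 0.31) -> float:
    """Expected number of problems uncovered by n_users testers."""
    return total_problems * (1 - (1 - l) ** n_users)

for n in (1, 3, 5, 10):
    # With N = 100 the result reads directly as a percentage.
    print(f"{n:2d} users -> {problems_found(n):5.1f}% of problems found")
```

With these values, five testers already uncover roughly 84% of the problems, which is the quantitative core of the recommendation.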

The interface used for this usability test is used by customer care personnel at a mobile network operator (MNO). When the usability test with the SA-team was completed, a group of four actual users from Africa came to Sweden in order to attend an educational course. This gave us a good opportunity to repeat the usability test with these actual users.

3.3 Task Definition

We selected four tasks for the usability test, and the SA-team reviewed the relevance of the selection:

  1. Transfer of funds. An amount of 10 EUR is to be transferred from account holder A to account holder B.

  2. Edit account holder information. The user is to search for an account holder and edit the account holder information – in this case, update the street number in the address field.

  3. View transaction history. The user is to search for an account holder and, for that account holder, select view historical data and view vouchers.

  4. Four-eye principle. The four-eye principle is a process that prevents users from performing certain operations in a single step. An approval is required by another user, for example a supervisor.
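The four-eye principle in the fourth task can be illustrated as a minimal approval workflow. The sketch below is our own simplification; the class name and rules are assumptions for illustration, not the actual EWP implementation:

```python
# Minimal sketch of a four-eye approval workflow (illustrative only;
# names and rules are assumptions, not the actual EWP implementation).

class FourEyeOperation:
    """An operation that must be approved by a second, distinct user."""

    def __init__(self, initiator: str, description: str):
        self.initiator = initiator
        self.description = description
        self.approved_by = None

    def approve(self, approver: str) -> None:
        # The core rule: the initiator cannot approve their own operation.
        if approver == self.initiator:
            raise PermissionError("initiator cannot approve their own operation")
        self.approved_by = approver

    @property
    def executable(self) -> bool:
        # The operation may only proceed once a second user has approved it.
        return self.approved_by is not None


op = FourEyeOperation("alice", "transfer 10 EUR from A to B")
assert not op.executable   # pending: still needs a second pair of eyes
op.approve("bob")          # e.g. a supervisor approves
assert op.executable
```

The point of the pattern is that no single user can complete the operation in one step, which is why task 4 exercises two distinct user roles.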

3.4 Test Execution

The usability test was conducted at Ericsson, Sweden. The participants received individual invitations to the usability test session via Outlook, and team rooms were booked for the sessions in order to prevent interruptions and distractions. Participants used the evaluator’s laptop to access the environment containing the latest EWP installation with the interface.

4 Test Results

All five participants from the SA-team completed the test in less than 20 min. The number of errors per participant did not exceed 3 and the time spent to support the participants was between 1 and 4 min.

The four actual users completed the usability test in 11–16 min, i.e., they needed approximately the same time as the SA-team proxy users. They all needed 1–2 min support; the minimum number of errors was 0 and the maximum number of errors was 4. Again, approximately the same values as for the SA-team.

The first task proved to be challenging, since four of the SA-team as well as all actual users needed support. The main problem was that the interface was not self-explanatory as to where to find the transfer option. The time to complete the task was approximately the same for the SA-team members and the actual users. The participants, SA-team as well as actual users, expressed the need for revising the terms used for actions and an inconsistency with input fields was noted as well.

The second test case was completed with minor support provided to the first two actual users. Some improvement suggestions were also noted from both groups (SA-team and actual users): a search button should be present, a back button should be added, and a confirmation message on successful edit should be displayed.

As a part of account holder management, a view of transaction and voucher history is presented, and this is the focus of the third test case. Usability issues were detected, since not all SA-team participants knew where to start, and the lack of self-explanatory interface elements was highlighted again. Both groups thought that the display of transaction history was hard to understand, and suggestions to explain the colours used in the table were recorded. One actual user also expressed the need to make the data in the transaction history exportable as a PDF file.

The enforcement of the four-eye principle is a very common part of tasks within the financial industry, due to the possibility of errors or misuse. It relies on special permissions that require certain actions to be approved by a supervisor. The fourth test case concerns the four-eye principle. This task proved to be the most confusing one for the SA-team as well as for the actual users, and many improvement possibilities were detected, such as an approval history, information on pending approval tasks being displayed to all users, and a more logical positioning of the approval action.

Our tests showed that we got (almost) the same comments and improvement suggestions from the SA-team and the actual users. Also, the number of errors as well as the time and support needed were similar for both groups.
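The comparison of the two groups can be tabulated as in the sketch below. The per-participant values are hypothetical placeholders chosen only to fall within the ranges reported above (times in minutes, errors as counts); they are not the actual measurements:

```python
# Hypothetical per-participant metrics, consistent with the ranges
# reported in the text but NOT the actual measurements.
proxy_users  = {"time": [14, 16, 18, 19, 17], "errors": [1, 3, 2, 0, 2], "support": [2, 4, 1, 3, 2]}
actual_users = {"time": [11, 13, 15, 16],     "errors": [0, 2, 4, 1],    "support": [1, 2, 1, 2]}

def summary(group):
    """Per metric: (min, max, mean) over the participants in a group."""
    return {metric: (min(vals), max(vals), sum(vals) / len(vals))
            for metric, vals in group.items()}

for name, group in (("proxy", proxy_users), ("actual", actual_users)):
    for metric, (lo, hi, mean) in summary(group).items():
        print(f"{name:6s} {metric:7s} min={lo} max={hi} mean={mean:.1f}")
```

Laying the metrics side by side like this is what supports the claim that the two groups behaved similarly.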

5 Discussion and Related Work

Two examples of successful usability engineering projects with no or limited access to actual users are the Information Map Studio (IMS) at the SAS institute, and the Nokia Communicator (N9000) mobile device.

IMS is an application that enables a technical user to create a business view of data relevant to the analytical needs of business users who use reporting tools. The project had no established customer base. Initial testing of the original design revealed a significant number of usability issues. After the identified issues were fixed, a second iteration with a follow-up usability test was conducted. The results showed that the IMS application had made usability gains on every aspect of the user interface. The first iteration of testing included SAS employees who had experience with IMS (see pages 112–134 in [9]).

At Nokia, usability tests are used to evaluate the flow of tasks that have been found critical for mobile devices. They faced a challenge with a new product for which there were no real users to conduct the testing. Due to the competitive market, the device was not to be shown to people outside of the development team. During the development, different methods were used to discover potential problems (usage scenarios, focus groups), and user testing was conducted after the product was released to the public. Users were given the device for some weeks and were asked to report positive and negative features. The results confirmed the developers’ concerns about the effects of consistency with other similar applications that run on desktop machines (see pages 464–474 in [10]).

In both cases discussed above, the unavailability of real users was an obstacle. While the approaches to usability testing were different, the usability of the products was eventually improved using proxy users.

While using proxies has its benefits, there are limitations to consider as well [15], e.g., the proxy users may know the product too well or have been too involved in the design.

6 Conclusions and Future Work

We have conducted a usability test that used solution architects as proxies for the actual users of the interface to the M-Commerce system. The results using the proxies and the actual users were very similar; the same usability issues were identified, and the proxy users and the actual users needed approximately the same support and the same time to complete the tasks. The conclusion from our study is therefore that proxy users can be an alternative when the actual users are not easily accessible due to geographical distance or other reasons. Previously reported projects at SAS and Nokia support this conclusion [9, 10].

This method can be further developed and applied in different parts of the EWP solution, since this study targeted only one interface and one specific group of users. We will extend this study in two ways: we will conduct usability tests for other interfaces of M-Commerce, and we will also investigate if it is possible to use other groups than solution architects as proxy users.