
1 Introduction

A literature overview in Human-Computer Interaction suggests two challenging issues to be faced: firstly, the views of individual users with particular limitations have traditionally not been sought to inform the design process, causing failures to satisfy the needs of a large proportion of the population and reluctance to engage with new Human-Machine Interfaces [1]; secondly, the high-fidelity prototypes currently used to assess sample end-user response are generally costly, leading companies to neglect participatory approaches [2].

The research goal is to study and develop a low-cost, high-fidelity prototype that exploits haptic feedback technologies to stimulate different sensory channels according to the individual user's abilities while interacting with the interface. In order to fulfill target users' needs and create a prototype that is applicable, appropriate and accessible to as many users as possible, the adoption of an inclusive design approach becomes imperative [3]. The intention of this research work is to provide a solution that works as effectively for less able users as it does for able-bodied ones.

Most user-centric design approaches, to which inclusive design belongs, involve sample end-users in product design through iterative cycles of requirement gathering, prototype development, implementation and evaluation [1]. The construction of interactive prototypes to conduct empirical testing at each design stage is difficult to achieve in a short time and at low cost [4]. In the last ten years, Virtual Reality (VR) technologies have been introduced to provide novel human-computer interfaces through which accurate prototypes can be created at lower expense than traditional high-fidelity ones. According to [5], VR technologies can be classified by the sensory modalities they stimulate (i.e. vision, hearing and touch) into visualization displays, sound displays and haptic devices, respectively. Haptic devices are of particular interest for this research as they stimulate the sense of touch, which is one of the substituting modalities adopted by blind people [6]. Haptic devices provide both tactile and kinesthetic feedback [5]. They allow the simulation of object features such as weight, shape, surface texture, etc. [7]. One class of haptic devices is the so-called tangible user interface (TUI). TUIs adopt physical objects to translate user actions into input events in the computer interface [8, 9]. The adoption of TUIs for creating a high-fidelity prototype for experiments with target end-users represents a good solution to provide an aesthetically satisfying as well as functional handle onto elements of the digital world. Although many studies have analyzed how different VR technologies can support blind people in exploring virtual environments, in rehabilitation, in assisting mobility and in accessing information on mobile devices and web-based applications [10–12], no research has yet been conducted to develop a tangible interface that facilitates their use of household appliances. Moreover, research on designing such interfaces does not describe if and how to build low-cost high-fidelity prototypes to improve the involvement of end-users in the overall design process and thus guarantee the success of the solution.

The above-mentioned issues also emerged in a real industrial design case, a long-term project called “MEET: Multi-Experience for wellness of life EnvironmenT”, funded by the Italian Marche Region and involving three Small and Medium-Sized Enterprises and two research centers. Its goal is the development of an innovative multisensory shower oriented to elderly and visually impaired users, able to save energy and water and to inform the user about current consumption. The product has to integrate an interactive human-computer interface to achieve this second objective.

In this context, the present paper describes the design method adopted for conceiving the user interface and building a high-fidelity prototype. The adopted inclusive design approach has led to the development of a dedicated user interface consisting of a graphic display, where digital contents are projected to create the visual feedback, and a tangible knob providing both kinesthetic and vibrotactile feedback to emboss Braille texts, making the information understandable to both visually impaired and sighted users. An Arduino-compatible Intel Galileo board is used to implement the whole prototype hardware and software architecture, which represents a novelty in both the applicative and technological domains.

2 Related Work

2.1 Applying User Centric Design to Product Development

Traditional design approaches have been accused of failing less able persons (e.g. older people, children, the visually impaired) due to the insufficient involvement of end-users in the design process, compromising commercial opportunities and making the users' interaction experience unattractive [13]. For this reason, in the last twenty years numerous user-centric design (UCD) approaches have been developed to focus the design on the thing being designed (e.g. artifact, communication media, space, interface, service), looking for ways to ensure that it meets the needs of individual users and, at the same time, of as large a proportion of the population as possible. The common element among the different UCD approaches is the involvement of sample end-users at all stages of product development, from information gathering and requirements definition to the evaluation of alternative solutions and experiments with final prototypes. For instance, inclusive design aims to create interfaces, artifacts and products that are applicable, appropriate and accessible to as many users as possible within the constraints of the design specification [3]. Similarly, participatory design aims to develop systems with the close involvement of all process stakeholders and end-users through cycles of requirement gathering, prototype development, implementation and evaluation [14]. Experience Design aims to design users' experience of things, events and places by studying the communication process between the user and the object/service/environment [15]. One of the main critical issues in adopting the above-mentioned approaches regards the construction of interactive prototypes to conduct empirical testing at each design stage in a short time and at low cost. This task becomes even more critical when involving people with some kind of impairment [12].

Two main prototyping techniques are known in the literature: low-fidelity prototyping (e.g. paper sketches, cardboard mock-ups) and high-fidelity prototyping (e.g. software-based and virtual prototypes, physical mock-ups). The second allows users to realistically appraise the aesthetic attributes and functionalities of the product [4], but it is expensive and cannot be implemented at the early stages of design. To overcome this limitation, in the last ten years VR technologies have been introduced to intuitively manipulate and explore virtual prototypes as well as to simulate product behavior in different working conditions. Some studies demonstrate how virtual prototype interaction via VR can be used to rapidly carry out usability testing while reducing evaluation costs and time [16]. However, VR shows several technological limitations, as it is still characterized by a low sense of immersion, poor physical interaction, high complexity, intrusiveness and non-intuitiveness. Mixed Reality (MR) represents a compromise solution in which real and virtual worlds are combined in various proportions and presented as a unified whole [17]. MR environments often exploit TUI technologies (e.g. haptic and tactile displays) to reproduce real contact with the object while manipulating the virtual prototype. Although many studies have analyzed how VR and MR can support elderly and disabled people in numerous contexts [18], no research has yet been conducted to create a tangible virtual prototype to involve blind people in user interface design. The research interest in developing a TUI to create a reliable high-fidelity prototype for usability testing is twofold: it can be used as a means of sensory substitution for the visually impaired and, unlike many other interfaces, it has the advantage of providing a natural affordance by coupling digital information to physical artifacts and environments [19].

2.2 Tangible User Interfaces as Means for Prototyping Design Solutions

A TUI is a type of human-computer interface that leverages physical representations to connect the physical and digital worlds [8]. The main potentialities of TUIs are as follows: they support learning thanks to their hands-on nature, which allows physical manipulation of objects [20]; they demonstrate superior performance in terms of memory and learning enhancement once a haptic signal is added to the visual and audio ones [21], as well as in terms of sense of presence in the virtual environment; and they are more inviting, engaging, enjoyable and intuitive than traditional graphic interfaces [19] thanks to the rich feedback and realism they provide.

Haptic displays are a set of TUIs that are generally used to physically manipulate digital information. They can be divided into force feedback devices (FFDs) and tactile devices. FFDs allow point or multi-point contact with virtual objects to provide kinesthetic cues [22], while tactile displays reproduce the shape and the contact effect of the object surface by stimulating the mechanoreceptors of the human fingertips with cutaneous cues [23]. The three most relevant types of FFDs are [22]: point-based devices (e.g. the PHANToM, the MOOG HapticMaster and the Haption Virtuose), multi-point based devices (e.g. the HAPTEX system), and physical rendering systems based on vertical pin displacement. On the other side, the field of research on tactile displays is rapidly growing. They range from miniature pin-array tactile modules based on elastic and electromagnetic forces to stimulate the human mechanoreceptors [24], to electrotactile displays, where the finger mechanoreceptors are stimulated by a current flow or an electric potential, to electro-vibration-based friction displays able to modulate electrotactile or friction forces between the touch surface and the sliding finger [24, 25]. There are also some examples of integration of force and cutaneous feedback: Kim et al. [26] proposed a lightweight tactile display integrated into a haptic glove, while Oshii et al. [27] developed a tactile display using airborne ultrasound. Despite the numerous studies, most systems lack usability and have limited capabilities [28].

Due to the lack of visual contact, people with reduced sight obtain information from their surroundings using the other senses, especially touch and hearing. Most of the interfaces developed for visually impaired people adopt two types of feedback: haptic/tactile and acoustic. Focusing on the first class of means for sensory substitution, several technologies have emerged that can present information to the sense of touch [6], as follows:

  • raised-paper displays, which reproduce Braille texts, pictorial information, maps, etc. via embossing or heat-raised paper;

  • vibrotactile displays that consist of a single-element stimulator encoding information in the temporal parameters of the vibration signal. They are used as non-audio indicators of an incoming call or a warning;

  • force feedback displays via single-point interaction probes that render the contact with virtual objects;

  • tactile displays that produce touch sensations through electrical, thermal or mechanical stimulations.

Applications of force and tactile displays can be found in user interfaces supporting blind and visually impaired people for medical or rehabilitation purposes [10]. They are also used for orientation and mobility purposes, where haptic cues translate visual information gathered from sensors or other input devices into touch information. For instance, Brown et al. [29] investigated the use of multiple vibration motors on the user's arm to access calendar information, while Kammoun et al. [30] proposed the usage of two vibrotactile bracelets to aid blind people during orientation and navigation. However, in all cases the adopted design approach appears more technology-oriented than user-oriented. Moreover, no example of their application for creating high-fidelity prototypes to support inclusive design has been found in the literature.

3 The Design of the User Interface

3.1 The Design Approach

The approach adopted to design the proposed user interface consists of the following steps:

Step 1: Information gathering about user needs to define a set of design requirements. Both surveys and semi-structured interviews are used for this purpose;

Step 2: Design of the user interface, including graphics, dynamics, and the physical components that are used to set the product items and receive proper feedback on the selected options. This stage includes the creation of a flow diagram of the interface elements and the conceptual design of the physical components (i.e. screen, control knobs and buttons);

Step 3: Prototyping of alternative solutions for user testing. It includes the creation of the proposed high-fidelity prototype exploiting haptic technologies;

Step 4: Usability testing via task analysis with sample end-users on the developed high-fidelity prototypes;

Step 5: Creation of a document containing all design specifications and guidelines derived from the elaboration of the usability tests' results to detail the best design solution;

Step 6: Implementation of the selected solution (embodiment design and detail design of the interface) and creation of a final working prototype for evaluation.

As mentioned before, the proposed approach derives from the state of the art in inclusive design and from the issues that emerged during the MEET project. The MEET project is based on the idea of researching and developing an innovative shower able to save energy (−30%) and water (−40%) and to offer a multi-sensorial experience to elderly and visually impaired persons. The user interface has to: (i) make the user aware of water and energy consumption and achieved savings; and (ii) enable the user to select a proper combination of product items to create a custom wellness treatment. The functions offered by the multisensory shower are listed as follows:

  • water functions:

    • rain effect;

    • cascade;

    • adjustable hand shower;

    • upper/central/lower massage;

    • vertical jets;

    • lower jets;

    • vaporized;

It is also possible to set the water flow rate and adjust the temperature.

  • color therapy functions:

    • RGB led and white lights;

  • music functions:

    • select genre, artist, album and playlist.

According to the selected sound track, the system combines the other functions to provide a coherent multisensory experience.

The first step concerns the analysis of user needs and the creation of a proper set of design requirements to start the conceptual stage. This study is carried out through a survey on traditional user interfaces adopted in the wellness sector and through interviews about their current usability. A semi-structured interview is submitted to 50 sample end-users aged between 50 and 70 years, of whom about 40% are visually impaired. The main identified drawbacks regard the limitations of the adopted touch displays, which do not provide accurate visual feedback when positioned in a wet environment, and of voice controls, which are not reliable due to the noise produced by the water and the selected sound tracks. Finally, no existing system provides solutions appropriate for impaired users' needs.

Thanks to the outcomes of the information gathering stage and to the proven affordance of TUIs, it was chosen to develop a user interface consisting of an LCD graphic display, where the digital contents can be visualized, and a haptic knob to select the shower items, with functionalities appropriate for both able-bodied and visually impaired users. In order to include both sighted and blind persons in the shower environment, the haptic knob is designed according to the following requirements:

  • Providing an ergonomic grip;

  • Enabling complete control of the interface without ever removing the hand;

  • Embedding “ok” and “back” buttons to allow the user to advance or regress into the hierarchical menu;

  • Providing tactile feedback to emboss Braille text.

The graphic interface is developed adopting a hierarchical tree structure to be compatible with the physical interface (i.e. the knob). The following requirements drive the design of its graphics and dynamics:

  • A clear layout made of two distinct areas, one for navigation and for information to be input or output, and one for status information;

  • Using text rather than symbols to express the information in an easy way;

  • Enlargement and careful placement of items in the interface structure;

  • Creating contrast between items and background to make the information easily readable;

  • Adopting visual and/or audio and/or touch cues to inform the user about the current event or the selected item (feedback);

  • Mapping between provided functions and icons.

The designed graphic interface consists of a static column on the left side, where the shower status is displayed (e.g. on/off, water temperature, set program, selected sound track), and a dynamic area, where information changes according to the options selected in the main menu. The menu is represented as a circle with the items listed around it. A study is conducted on the hierarchical menu in order to reduce the number of interactions necessary to perform a task; a flow diagram is created for that purpose. When the user selects one of the items in the menu, its text is enlarged and a new section appears to control the related sub-functions (Fig. 1).

Fig. 1. Two screenshots of the graphic user interface
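To make the hierarchical tree structure concrete, the following is a minimal C++ sketch of the kind of data model that could back such a circular menu; the type name, field names and the item subset are illustrative assumptions, not the project's actual data model.

```cpp
#include <iostream>
#include <string>
#include <vector>

// One node of the hierarchical menu; leaves have no children.
struct MenuNode {
  std::string label;
  std::vector<MenuNode> children;
};

int main() {
  // A fragment of the menu tree (illustrative item subset).
  MenuNode root{"Main", {
      {"Water functions", {{"Rain effect", {}}, {"Cascade", {}}, {"Hand shower", {}}}},
      {"Color therapy",   {{"RGB LED", {}}, {"White light", {}}}},
      {"Music",           {{"Genre", {}}, {"Artist", {}}, {"Playlist", {}}}}}};

  MenuNode* current = &root;  // node whose children ring the circle
  std::size_t selected = 0;   // index of the currently highlighted item

  selected = (selected + 1) % current->children.size();  // one clockwise knob detent
  current = &current->children[selected];                // "ok": descend into the item
  std::cout << current->label << '\n';                   // -> "Color therapy"
  return 0;
}
```

In such a model, knob rotation advances `selected` around the circle, while the “ok” and “back” buttons descend into or leave the current node, mirroring the flow diagram mentioned above.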

The circle displayed in the graphic interface is also physically reproduced through a circular haptic knob whose rotation is associated with the scrolling functions of the graphic menu. Two buttons, corresponding to the “ok” and “back” functions, are positioned at opposite sides of the knob. They allow the user to move forward and backward in the hierarchical menu. The knob is divided into two parts: a fixed one, where the buttons are positioned, and a mobile one, which rotates to allow the user to select the menu items. A Braille pad is placed on top of the rotating ring; the pad itself remains fixed while the ring rotates (Fig. 2).

Fig. 2. Concept of the knob

Both rings of the haptic knob have a diameter of 8 cm; some preliminary ergonomic studies were conducted to set the dimensions. The knob is realized in Duralight®, an innovative acrylic-based material patented by one of the industrial project partners, characterized by a pleasant finish, particularly when wet, and by a flexible manufacturing process that allows the creation of custom shapes. A soft-touch coating additionally covers the areas of the haptic knob that the user touches, to increase the pleasantness of contact and provide a better grip for people with impairments in hand movement control and finger grasping.

4 Prototyping the User Interface: The Haptic Knob

The combination of different user-interface graphics (e.g. dimensions and colors of icons), knob behaviors (e.g. hard or soft rotation) and shapes, sizes and finishes of the touched surfaces (e.g. coarse or smooth finishing) requires building low-cost prototypes to perform preliminary usability testing and select the most appropriate solution for developing the final one. Haptic technologies are therefore exploited to simulate the behavior of the knob while providing tangible interaction to the user and visualizing the digital contents on the graphic display.

The proposed high-fidelity (Hi-Fi) prototype is composed of three basic elements: (i) the graphical user interface (GUI) for the reproduction of visual information; (ii) the control system to manage user interactions with the haptic knob and the coherent configuration of the GUI; (iii) the haptic knob with capacitive buttons and an electro-tactile pad. The prototype consists of both Hardware/Firmware and Software components that are integrated to guarantee system reliability and I/O synchronization.

The physical prototypes of differently sized knobs are realized by Rapid Prototyping techniques. A commercial LCD display is used to visualize the digital contents generated by the software prototype. The “ok” and “back” buttons have not been physically prototyped because small 3D-printed shapes are too fragile with respect to finger pressure. For that reason, two capacitive buttons are introduced to operate as pressure sensors activating the same functions. The software prototype, which manages the graphics and dynamics of the user interface, is developed in Processing, an open-source object-oriented programming language. Arduino, an open-source electronics platform based on easy-to-use hardware and software, is used to implement the firmware (FW) that carries out the data elaboration.
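As an illustration of how such capacitive buttons can be read without dedicated hardware, the following is a minimal Arduino-style sketch based on RC charge-time measurement; the pin numbers, wiring, threshold and message name are illustrative assumptions rather than the project's actual configuration.

```cpp
// Assumed wiring: a high-value resistor (e.g. 1 MΩ) between SEND_PIN and
// RECEIVE_PIN, with the button electrode attached to RECEIVE_PIN.
const int SEND_PIN = 4;
const int RECEIVE_PIN = 5;

// Counts how long RECEIVE_PIN takes to charge; a touching finger adds
// capacitance, so larger counts mean "pressed".
long chargeTime() {
  pinMode(RECEIVE_PIN, OUTPUT);      // discharge the electrode
  digitalWrite(RECEIVE_PIN, LOW);
  delayMicroseconds(10);
  pinMode(RECEIVE_PIN, INPUT);       // release it
  digitalWrite(SEND_PIN, HIGH);      // start charging through the resistor
  long count = 0;
  while (digitalRead(RECEIVE_PIN) == LOW && count < 10000) count++;
  digitalWrite(SEND_PIN, LOW);       // reset for the next reading
  return count;
}

void setup() {
  pinMode(SEND_PIN, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  if (chargeTime() > 200) Serial.println("OK");  // threshold set by calibration
  delay(50);                                     // crude debounce
}
```

The actual prototype may instead rely on a dedicated capacitive-sensing library or breakout board; the sketch only illustrates the sensing principle.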

Figure 3 shows the overall system prototype and how the information flows across the HW and FW modules.

Fig. 3. Human-computer interaction flow in the interface prototype

The hardware architecture is composed of:

  • A 7-inch LCD display connected to a laptop computer via VGA, used to visualize the screenshots of the graphical interface;

  • A laptop with an i7 processor at 2.2 GHz and 16 GB of RAM, used to upload the interface-manager FW to the Arduino board and to run the Processing application. In addition, the laptop acts as the bridge for serial communication between the Arduino board and the Processing application;

  • An Arduino-compatible board, the Intel Galileo, with an Intel® Quark™ X1000 SoC (32-bit, 400 MHz), 20 digital I/O pins (12 used) and a 12 V input voltage, which handles both the digital and analog components;

  • The physical prototype of the knob, made of two elements: a fixed one that guarantees the positioning of the knob on the shower wall, and a rotating one where the capacitive buttons are mounted. The knob is finished with different soft-touch paints;

  • The hardware components that generate the input signals of the interface: two capacitive buttons for navigating back/forward among the interface pages and a rotary encoder for scrolling through the menu items;

  • An electromagnetic brake that regulates the torque of the knob to define its rotation behavior and the number of clicks corresponding to the menu items. The brake is mounted on the rotation pivot of the knob and connected to a variable power-supply circuit controlled by the Intel Galileo board (a firmware sketch covering the encoder and the brake is given after Fig. 4);

  • The tactile pad positioned on the top of the knob surface to emboss Braille texts. The pad is realized through a matrix of 2 × 3 pins connected to six pins of the board, plus a ground terminal, common to all the pins, used for the current back-flow. Bare Conductive paint is used to draw the output pins, the connections to the board (horizontal lines in Fig. 4) and the ground connections (vertical lines in Fig. 4). It is also applied inside the knob to create floating links that establish the connections when the pad is mounted in place. Three main advantages are achieved by the use of this electrically conductive paint: (i) there are no problems with tangled wires due to the knob rotation; (ii) the risk of an inverted matrix and the resulting wrong pin mapping is avoided; and (iii) there is no performance degradation when it is covered by the soft-touch paint used to improve tactile sensations and grip.

Fig. 4. On the left, the Arduino-based board connected to the laptop where the program runs; on the right, the final prototype without the soft-touch coating
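As anticipated above, the following is a minimal firmware sketch of how the rotary encoder and the electromagnetic brake could be handled on an Arduino-compatible board; the pin numbers, PWM duty cycle and message format are illustrative assumptions, not the project's actual configuration.

```cpp
// A quadrature rotary encoder scrolls the menu; a PWM output sets the brake
// current, i.e. the rotation torque felt by the user.
const int ENC_A = 2;        // encoder channel A (interrupt-capable pin)
const int ENC_B = 3;        // encoder channel B
const int BRAKE_PWM = 9;    // PWM pin driving the brake supply circuit

volatile int menuStep = 0;  // net detents since startup

void onEncoderA() {
  // On each rising edge of A, channel B gives the rotation direction.
  if (digitalRead(ENC_B) == HIGH) menuStep++;
  else menuStep--;
}

void setup() {
  pinMode(ENC_A, INPUT_PULLUP);
  pinMode(ENC_B, INPUT_PULLUP);
  pinMode(BRAKE_PWM, OUTPUT);
  attachInterrupt(digitalPinToInterrupt(ENC_A), onEncoderA, RISING);
  analogWrite(BRAKE_PWM, 120);  // medium torque (illustrative value)
  Serial.begin(115200);
}

void loop() {
  static int lastStep = 0;
  noInterrupts();
  int s = menuStep;             // read the shared counter atomically
  interrupts();
  if (s != lastStep) {          // one serial message per detent
    Serial.println(s > lastStep ? "ROT+1" : "ROT-1");
    lastStep = s;
  }
}
```

Raising or lowering the PWM duty cycle changes the brake current and thus the perceived rotation stiffness, which is precisely the “hard or soft rotation” variable explored during usability testing.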

Two programming platforms are used to implement the software: (i) Arduino, which manages all the electronic components and at the same time is used as a tool for communication with the GUI; and (ii) Processing, which controls the GUI through a serial communication with the Arduino-based firmware. The choice of these two development systems is due to the strong integration between the Arduino programming language and Processing. The software is implemented according to the flow chart developed in the conceptual design stage. It synchronizes the haptic interactions (i.e. presses of the capacitive “ok” and “back” buttons, knob rotation) with the displayed graphics and the coherent reproduction of the Braille text.
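A minimal sketch of the firmware side of this serial link is shown below; the line-based message format (e.g. "OK", "BACK", "BRAILLE:c") and the pin assignments are illustrative assumptions, since the paper does not detail the protocol. The Processing side would symmetrically read these event lines and answer with the character to render on the Braille pad.

```cpp
// Firmware side of the serial link between the knob and the Processing GUI.
const int OK_PIN = 10, BACK_PIN = 11;  // hypothetical capacitive-button outputs

void setup() {
  pinMode(OK_PIN, INPUT);
  pinMode(BACK_PIN, INPUT);
  Serial.begin(115200);
}

void loop() {
  // 1) Report user actions to the GUI, one text line per event.
  if (digitalRead(OK_PIN) == HIGH)   Serial.println("OK");
  if (digitalRead(BACK_PIN) == HIGH) Serial.println("BACK");

  // 2) Apply commands coming back from the GUI, e.g. "BRAILLE:c".
  if (Serial.available() > 0) {
    String cmd = Serial.readStringUntil('\n');
    if (cmd.startsWith("BRAILLE:")) {
      char letter = cmd.charAt(8);
      // ...drive the six tactile pins for `letter` (see the sketch below)...
      (void)letter;
    }
  }
  delay(20);  // crude debounce
}
```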

Once a specific type of interaction is detected, the system manages two consecutive events: (i) the update of the graphical interface and (ii) the configuration of the Braille pad to communicate the selected menu item to the blind user. The mapping of each graphic screenshot with all possible user actions (e.g. knob interactions) enables the implementation of the whole system; pin configurations and graphics are interlinked in the system database. The Arduino firmware allows the operator to set the supply voltage of the electromagnetic brake in the setup phase, in order to control the torque function and hence the knob behavior.
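To illustrate step (ii), the following sketch maps a letter to the six outputs of the 2 × 3 tactile matrix using the standard six-dot Braille encoding; the pin numbers and the letter subset are illustrative assumptions.

```cpp
// Bit i of a pattern corresponds to Braille dot i+1
// (dots 1-3 in the left column, 4-6 in the right column).
const int DOT_PINS[6] = {2, 3, 6, 7, 8, 9};  // hypothetical board pins

// Standard six-dot Braille patterns for a few letters.
byte brailleFor(char c) {
  switch (c) {
    case 'a': return 0b000001;  // dot 1
    case 'b': return 0b000011;  // dots 1,2
    case 'c': return 0b001001;  // dots 1,4
    case 'd': return 0b011001;  // dots 1,4,5
    case 'e': return 0b010001;  // dots 1,5
    default:  return 0;         // blank cell
  }
}

// Raises or lowers each of the six tactile pins for the given letter.
void showBraille(char c) {
  byte pattern = brailleFor(c);
  for (int i = 0; i < 6; i++)
    digitalWrite(DOT_PINS[i], (pattern >> i) & 1 ? HIGH : LOW);
}

void setup() {
  for (int i = 0; i < 6; i++) pinMode(DOT_PINS[i], OUTPUT);
  showBraille('c');  // e.g. the initial of the "cascade" item when selected
}

void loop() {}
```

A full implementation would cover the whole alphabet and refresh the cell each time the GUI reports a newly selected item.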

5 Conclusion and Future Work

Learning from past research and development in the application of technology for the blind, this paper proposes an innovative user interface to assist visually impaired and sighted users in setting and customizing the wellness treatments offered by a multi-experience shower. It consists of a graphic user interface coupled with a haptic knob embedding a refreshable Braille pad. As designing devices for the blind is more arduous than expected and requires a full understanding of their needs, an inclusive design approach has been applied. A high-fidelity prototype exploiting both force-feedback and electrotactile displays is developed to conduct iterative evaluations of the design solutions' usability and accessibility. The main challenges addressed regard:

  • the application field of haptic technology for the blind: in the literature it is generally used to create aids for orientation and rehabilitation, whereas in this paper it is used both as a new interaction paradigm to create novel user interfaces for household appliances and as a tool to develop low-cost high-fidelity prototypes for testing;

  • the integration of a state-of-the-art tactile pad [25] into a haptic knob to support the refreshing of Braille text, enabling the visually impaired to access information from the graphic user interface.

Future work will focus on the experimentation of the developed haptic knob within the blind community, in order to test the reliability of the adopted inclusive design method, the usability and acceptability of the developed user interface and, finally, the ability of the low-cost high-fidelity prototype to accurately simulate the behavior of the designed interface.