1 Introduction

The Internet of Things is a new technological paradigm that is changing the way we interact with digital content and services, reshaping our world. As an evolution of the concept of Ubiquitous and Pervasive Computing [2], physical elements like objects and spaces are becoming connected and able to share information. From the user's perspective, this means that digital content and services may be accessed through a multiplicity of new channels (e.g. a smart fridge could suggest a recipe and then share it with a smart food processor). Most importantly, the fusion of the digital and physical worlds, together with AI and machine learning, enables a huge number of new services and possibilities (e.g. a smart fridge could learn the habits of the user, suggest a diet that takes into account the user's activity level and health conditions, and so on).

Designing this new type of products and services means facing a wide set of new challenges [1], such as trust, privacy, and human-human and human-machine relationships. It also requires multidisciplinary and specialized competencies, and presents a high level of complexity due to the multiplicity of variables to be considered. For this reason, new methods and tools are needed to support designers and professionals involved in the design of connected products and omnichannel services. The IoT Design Deck, as described in our previous work [1], is a method that helps multidisciplinary teams co-design the User Experience of connected products and omnichannel services. In this work we present the evolution of the method based on a series of tests and a redesign phase.

2 IoT Design Deck

As described in our previous work [1], the IoT Design Deck is a method for the co-design of the User Experience of connected products and services. Its main objectives are to help design teams to:

  • create a common language and a common knowledge base that helps multidisciplinary teams work together, also with the involvement of users and stakeholders;

  • focus on UX aspects rather than on technological ones;

  • discover and use the potential of the IoT and take into account its threats;

  • adopt a service-oriented approach rather than a device-centric one;

  • accelerate the design process.

The method can be used for the design of “real” projects, but also for exploring and teaching how to design the UX of a connected product.

It is physically composed of several decks of cards and “boards”. The method follows a design process that mixes the use of cards and boards with common techniques and methods for user research, idea generation and sorting, testing and prototyping, to guide the team from the first idea to the prototype.

2.1 Testing the IoT Design Deck

After developing the first version of the method, we decided to test its effectiveness in order to improve it. To the best of our knowledge there is no agreement on a standard way to evaluate a co-design method, so we adopted a qualitative approach to assess whether the proposed method was able to help a design team in the expected way.

Moreover, we were invited to take part in the IoT Ideation Expert Workshop organized by the Chemnitz University of Technology, during which several co-design methods for the IoT were tested and compared.

In the following paragraphs we describe the tests and the subsequent redesign phase based on the test results.

First Tests.

The first tests were intended to be merely qualitative since, according to the objective of the method, the outcomes are more related to the experience of the participants than to a measurable parameter. Moreover, given the intrinsic characteristics of the method and the need for a facilitator, it was possible to manage just a few small groups that could, however, be representative of a real-world condition. As the main objective was to test the usability and effectiveness of the tools, we adapted the well-known Nielsen heuristics [3] and used these evaluation criteria:

  • Self-explanation and understandability (system visibility, consistency and standards, help and documentation): the ability of the tools to be understood without explanation, and the user's awareness of why she is doing something and what the next steps are. This is also related to the understandability of the labelling and the textual contents by a heterogeneous audience, as well as to the logical order of the elements in the cards.

  • Efficiency and ease of use (flexibility and efficiency): the ability of the cards to support the action they are designed for with the right amount of user effort, i.e. whether there is enough space to write the characteristics, but also whether the card dimensions allow them to be sorted, managed and read by several people at once. Regarding flexibility, we were also interested in seeing whether experts could “bend” the tools to use them with the methods they are familiar with.

Moreover, as a way to measure efficiency, we added another indicator:

  • Group interaction: the ability of the method and the tools to encourage interaction within the group, discussion and the exchange of ideas.

We organized a series of workshops to test the method both with non-experts (students) and with experts in UX and Service Design. During the workshops, participants were asked to develop a possible solution (concept) starting from a randomly extracted brief, defining personas, context of use, functionalities and touchpoints, and creating a user journey. Each workshop lasted 3 h.

To collect the data we used the following techniques:

  • Observation: carried out by the facilitators and the researchers to detect the behaviour of the participants and the interactions within the group.

  • Thinking aloud: the participants were asked to comment while they were using the cards.

  • Focus group: held at the end of the workshop to collect information about the three points reported above and to let participants' opinions emerge.

Four tests were made with design students with basic knowledge of UX Design. We recruited 12 students (different for each test), aged between 23 and 25 years, divided into 3 groups.

A second round of tests involved the members of the UX Book Club of Rome. Participants were experts and professionals working in the fields of UX, Visual and Service design, and software development. They were 18, aged between 25 and 47, divided into 4 groups.

Two facilitators helped the groups. The facilitator's role was to help the group use the method and stay focused on the project, suggesting tools and techniques and keeping the group discussion moving smoothly.

Results.

Regarding the three main criteria selected we obtained the following qualitative results:

  • Self-explanation: The tools are largely self-explanatory. Even non-experts were able to identify the scope and use of the cards. The preferred one was the “Personas” card: both experts and non-experts were able to use it intuitively. However, the “Functionality” cards were not easily understood by non-experts. Moreover, the process was perceived as unclear: participants could not easily guess the next steps of the design process and needed to be guided by the facilitator. Likewise, the usefulness of the “Action” cards was understood only when explained by the facilitator.

  • Efficiency and ease of use: The participants found the dimensions of the cards suitable for writing and drawing concepts. The limited space was identified by some as an invitation to be concise. The majority of the participants preferred to write rather than sketch inside the “Context” and “Functionality” cards.

  • Group interaction: Participants used the cards to discuss ideas within the group. For example, they intuitively used the “Personas” and “Context” cards to describe scenarios or possible use cases to other participants. The “Touchpoints” and “Threats” cards were used as a library to find inspiration.

During the focus group with the experts, we asked whether they would use the method or some of its tools in their work. The “Personas”, “Context” and “Threats” cards in particular were indicated as useful and usable even outside the method.

The IoT Ideation Expert Workshop.

In October 2018 our team was invited to the two-day IoT Ideation Expert Workshop organized by the Chemnitz University of Technology, during which several co-design methods for the IoT were tested and compared to find the benefits and different approaches of the different ideation tools. For each method there was a brief workshop followed by a post-hoc questionnaire and a Q&A session.

We condensed the main phases of the method into one hour and a half, focusing on the main steps of the design process. We started from predefined briefs, randomly chosen by the participants. The successive steps were defining personas, context of use, functionalities and touchpoints, and creating, at the end of the process, a short user journey. The participants were divided into three groups of six persons each, composed of IoT experts (designers and engineers) and students from the Chemnitz University of Technology. In each group there was a facilitator who guided the participants through the different phases.

This test was very helpful for analyzing and improving the method, and also for comparing it with the six other IoT design techniques. It allowed us to focus on specific topics such as: the target (experts or not), the focus (teaching, exploring, designing), the type of support used (cards, objects, etc.) and the necessity or not of a facilitator during the workshop.

It was useful to see how different the methods involved were from each other, despite all of them being IoT ideation tools. After all the workshops we identified the main differentiators among the ideation methods and were thus able to position our method more clearly in relation to the identified variables:

  • The ideal target for which the method is created (students, experts, etc.). Looking at the ideal target for which the tools have been designed, we noticed that some methods seem more appropriate for teaching contexts while others are more business oriented. The IoT Design Deck can start from an idea generation process, useful for consultancy purposes in business contexts, or from a predefined brief for shorter teaching contexts.

  • The focus (on technology, user experience, product design, etc.). From this point of view, the IoT Design Deck clearly has its roots in UX Design, while other methods seem to have more influences coming from product design or computer science. The biggest difference from the other methods is that the “Actions” cards include some user research methods that can be used in the first, analytical part of the process.

  • The type of supports used (cards, objects, etc.). All the methods have cards, most of them have boards and, in some cases, they also use 3D printed elements (e.g. U4IoT [4] and Cards’n’Dice [5]). The IoT Design Deck doesn’t include elements other than the cards and the canvases.

  • The necessity of the facilitator’s presence during the workshop. Each method has a different flow that has to be known or explained at the beginning of the workshop. The IoT Design Deck needs a facilitator, a UX expert who masters the method and can perform actions like giving the “Actions” and “Threats” cards to the participants. This could be a weakness compared to more self-explanatory methods like Tile Cards [6] or Know Cards [7].

Results.

The feedback from the post-hoc questionnaire and the Q&A session with the other experts highlighted different aspects of the IoT Design Deck, some of which had already partially emerged from the first qualitative results. The feedback received was very useful for the redesign of the second version of the method, and we grouped it according to the criteria previously identified:

  • Self-explanation and understandability: The method was generally perceived as not self-explanatory for non-experts. Indeed, it was perceived as especially suitable for design experts, like product designers, interaction designers and service designers. Half of the workshop participants also perceived it as useful for technology experts (i.e. engineers and computer scientists) and for professionals working in the smart city, smart home, health and cultural heritage fields. It was interesting to note that only a few of the experts involved in the workshop session perceived the tool as useful for non-experts, an aspect closely related to the necessity of a facilitator during the design phases. The facilitator was indicated as fundamental for the understanding of the process and the correct execution of the workshop. The aspects considered less clear were principally the order in which to introduce the different cards, the lack of an overview of all the design steps, and the relation and interaction between the cards, the brief and the concept. Another pain point that emerged during the workshop was the lack of a support to document the scenario, the ideation process and the solution. Some participants suggested creating a user manual and/or boards that could better explain the design sequence, also to non-experts, making the presence of a facilitator less essential.

  • Efficiency and ease of use: The aspects considered most positive were the good color scheme, which allows a clear distinction between the card groups, and the appealing design. The “schemas” cards, with fields to be filled in with text or sketches, were considered useful for enlarging and customizing the project, allowing the team to go beyond a simple assemblage of reference cards.

  • Group interaction: The deck was considered a useful tool for generating, detailing, presenting and discussing ideas with others.

As regards its application context, the IoT Design Deck was perceived as not focused on a specific domain and useful for designing different connected products or services. From a methodological point of view, the “Action”, “Input”, “Output” and especially the “Threats” cards were considered the most interesting for providing inspiration during the design phases.

3 Redesign of the IoT Design Deck

Based on the test results, we began a redesign process to fix the problems that emerged, but also to integrate new research outcomes and new technological trends. The redesign process focused on the following key points:

  • Self-explanation and self-use of the method: the method should be usable without the presence of a facilitator, and the design process should be smooth, composed of clearly separated steps. In order to achieve this result, the following sub-steps were needed:

    • Process redesign: to make it simpler and clearer;

    • Design of a Quick-Start Guide: a handbook to guide an individual or a group through the use of the method, even without a facilitator;

    • Cards and boards redesign: fixing the labelling problems (to allow self-explanation) and designing/redesigning the boards to support the new process.

  • Cards update: adding new cards according to new research outcomes and new technological trends.

3.1 Process Redesign

The objective of the redesign of the process was to make it faster, more agile and self-explanatory, with the possibility of working without a facilitator by following a handbook. The first step was to consider the starting point of the method. As it can be used on real projects or for teaching, we had to take into account whether the users would start from a brief or would need to generate ideas. For this reason we kept and updated the “Brief” cards, a deck with a set of precompiled briefs useful for teaching purposes. Moreover, we designed some techniques for using the cards to find inspiration and generate new ideas.

We then defined precise steps, specifying the inputs, the outputs and the tools and methods to be used (or suggesting well-known methods to experts). To do this we took as a reference the Double Diamond process [8], which maps the design process through two cycles of divergent and convergent thinking, one for the problem definition and one for the design of the solution. As one of the goals of a design team is to produce project documentation, we identified some key moments in the design process in which the team can save useful information, like the research results, the value proposition or the user journey map, just by arranging the cards and taking pictures or writing information on the canvases.

A Quick-Start Guide, described further below, was created to guide the user through the new process.

The steps are the following:

  • WarmUp: This is a preparation for the design stage, proposing some ice-breaking exercises to be done before starting to design; moreover, the warm up section also illustrates the Design Double Diamond.

  • Discover: This is the first step of the design process and it helps users seek a specific problem to solve. Not all designs have the same starting point: some arise from a specific request from a customer or a company, while others start from a given brief, an intuition or a problem to be solved. Whatever the starting point, it is important for the designers to discover the context they are moving in, finding out as much as possible about the problem to be solved, its characteristics and current solutions. Exploring the problem space at this stage will help later in clearly defining the problem to be solved. This stage of the design process is mainly based on research, which can be carried out in several ways, starting from online or academic sources, personal experience or observation, and interviews with customers. The problem discovery stage is accomplished by filling in the “Problem Discovery Canvas”.

  • Define: The definition step is an important moment of the design process that can compromise or support the future design choices. As the design process is iterative, designers should carefully refer to the “Problem Discovery Canvas” filled in before and check whether they are following the emerged insights or going off track. The definition stage consists in filling in the final section of the “Problem Discovery Canvas”: the “How Might We” section.

  • Concept: The concept stage drives the design team to the definition of the value proposition of the concept they are willing to develop. It consists in diverging and then converging on ideas: a first part of the exercise asks designers to generate as many ideas as possible, using the “how might we” section on the “Brainstorming Canvas”. Each participant of the design session tries to propose five creative answers, keywords or inspirations related to the “how might we” challenges. All the ideas collected should be selected, organized in clusters and voted on, in order to define a ranking of the emerged ideas; then each participant tries to ideate, sketch and present to the others a personal concept idea related to the selected themes, using the Brainstorming Canvas. A further discussion is necessary to select among the concepts presented. Participants should be aware that this process of divergence and then convergence on a concept could lead them away from the initial challenge. The concept stage is iterative and can be repeated as many times as needed: if designers notice that they are going off-topic, or if the ideas are not satisfying, the concept stage can be started again.

  • Design: The design stage has the objective of helping designers define all the core elements of the project and letting them use these elements to visualize the whole project at a glance. Using a physical support for each element of the project helps designers, especially in co-design sessions, to quickly prototype a user journey and easily understand whether it works or not, or to have an overview of the touchpoints in the project ecosystem. During the design process, designers define personas, contexts and micro-moments by using the specific fillable cards. Subsequently, the guide drives designers in the definition of the core functionalities, the touchpoints through which the functionalities are enabled, and the related inputs and outputs.

  • Prototype and Test: this phase is to be seen as complementary to the Design one. Indeed, prototyping is a way to answer a question. For this reason there are different types of prototypes, like functional prototypes, to check if the idea is feasible, or experiential prototypes, to test if our design choices will give the user a good experience. During the prototype phase designers are invited to create and test experiential prototypes, even of a single feature of the project, especially if they need to decide among design alternatives.

3.2 Cards Redesign and Update

During the redesign process we undertook several actions:

  • redesign of the structure and labelling of existing cards (the “Functionality” cards became “Function” cards);

  • reorganization of the content and categories of existing cards (the “Action” cards became “Tool” cards);

  • creation of a new set of cards, the “Micromoment” cards, useful in certain parts of the process;

  • update of the content of the existing cards: revising the text of the cards, adding some new research outcomes and popular technological trends.

Function Cards.

Since the first test we had noticed that the scope and use of the “Functionality” cards were hard to understand, so we redesigned them. As nobody used the sketch field, we removed it and provided four fields: function’s name, what the system does, user’s action/intent supported, and notes. In this way we want the designer to focus on the objective/task that the function helps to accomplish and on what the system should do (e.g. the user wants to eat and the system automatically orders the preferred food). We also changed the name to “Functions”.

Tool Cards.

This was a rearrangement and an update of the “Action” cards. A problem in the comprehension and autonomous use of the “Action” cards was that the deck contained techniques that are useful at different moments of the design process. We decided to change the name of the deck to make it more understandable and to divide it into 4 sub-categories that are useful in specific steps of the new design process:

  • Inspiration: Techniques or design principles to find inspiration. Useful in the idea generation phase and, in general, whenever the creative process needs a boost. As an example, the “Empathy tools” card suggests using objects or simulating a situation to experience how your user would feel.

  • Research: Techniques to collect qualitative and quantitative data useful for developing the project. As an example, the “Quick Ethnography” card suggests spending some time with people who represent the target, trying to understand how they behave.

  • Sorting: Techniques to sort ideas, useful after a process of idea generation or when you have to choose among different alternatives. An example is the “dot voting” technique, which helps a group quickly vote on and rank different ideas.

  • Test: Techniques to test the designed solution, even at an early stage, such as a usability test.
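As a minimal illustration of how a Sorting technique such as “dot voting” produces a ranking, the tally can be sketched in a few lines of Python; the idea names below are hypothetical examples, not part of the deck:

```python
from collections import Counter

# Dot voting: each participant distributes a fixed number of "dots"
# (votes) among the candidate ideas; ideas are then ranked by dots.
ballots = [
    ["smart fridge diet coach", "voice recipe assistant"],
    ["smart fridge diet coach", "connected frying pan"],
    ["voice recipe assistant", "smart fridge diet coach"],
]

votes = Counter()
for ballot in ballots:
    votes.update(ballot)

# Ideas ranked from most to least voted.
ranking = [idea for idea, _ in votes.most_common()]
print(ranking)
```

In a physical workshop the same tally happens by simply counting sticker dots on each idea card.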

Micromoment Cards.

This new type of fillable card is used to obtain a more detailed description of the context and the intention of the user who is about to do something. It is based on research promoted by Google [9], which defines a micro-moment as an “Intent-rich moment when a person turns to a device to act on a need: to know, go, do or buy”. As Google conducted this research to understand how people use smartphones for support in accomplishing certain tasks, we adapted the concept to the IoT scenario, focusing on how a user could use smart and connected objects/environments to answer a certain need. In the “Micromoment” cards, designers should indicate the intention of the user in that moment, the contextual needs and constraints, and the objects and environments that could be used. The last two fields in particular can provide inspiration to create innovative touchpoints for the service that will support the user.

The “Micromoment” card can be considered a child of the “Context” card: in the definition of the user journey, the micro-moment is a detail of something that happens inside a context. For example, if the context is a smart kitchen, the micro-moment could be “I want to prepare the dinner”, the contextual need could be finding a recipe based on the food available, and the objects that could be “enchanted” with IoT features are the fridge and a frying pan.
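For teams that keep digital notes alongside the physical cards, the structure of a “Micromoment” card can be captured in a small record; the field names below mirror the card’s sections as described above, while the class itself is only an illustrative sketch, not part of the method:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Micromoment:
    """Sketch of a "Micromoment" card; fields mirror the card's sections."""
    intention: str                   # what the user wants in that moment
    contextual_needs: List[str]      # needs and constraints of the context
    objects_environments: List[str]  # objects/environments that could be "enchanted"

# The smart-kitchen example from the text.
dinner = Micromoment(
    intention="I want to prepare the dinner",
    contextual_needs=["find a recipe based on the food available"],
    objects_environments=["fridge", "frying pan"],
)
print(dinner.intention)
```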

Updating of Existing Cards.

We updated the “Input” and “Output” cards, putting more stress on the interaction with conversational interfaces like vocal assistants. In the “Threats” deck we added cards about designing for failure (e.g. considering false positives and false negatives in an Artificial Intelligence system), and in the “Inspiration” deck we suggest designing for trust (e.g. trust that a self-driving car will turn when we expect it to do so) (Fig. 1).

Fig. 1.
figure 1

“Function” and “Micromoment” cards front and back.

3.3 Design of New Tools

In order to guide the designer during the design process we created some tools that are helpful in certain phases to synthesize ideas and to create project documentation.

Problem Discovery Canvas.

The Problem Discovery Canvas is divided into different sections that examine the characteristics of the problem, the people involved and the contexts where the problem takes place. The impact of the problem can be analysed from a social, economic or environmental perspective, as can the context, which can be analysed from a physical or social perspective. The strength of the canvas is that it helps designers look at the problem from different points of view and split it into subsections in order to analyse every single element of the problem.

The Problem Discovery Canvas also invites designers to think about the competitors already working on the identified problem, in order to identify who is already solving that problem and how. Analysing the state of the art is also a useful way to find out best practices in a specific field.

The last section of the Problem Discovery Canvas makes designers think about possible solutions to the identified problem. The “How might we” method is widely used in service and user experience design, and its output is not the perfect, final solution, but a set of opportunities for design. A good “how might we” session is a powerful tool to frame the identified problem and to generate insights for the brainstorming stage.

Brainstorming Canvas.

During the idea generation phase the participants need to share ideas. It is useful to do this in written form, because it helps to share, compare and select ideas, but also to save all the alternatives that have been considered. The canvas is really simple: it has a space for a title, a sketch and a description (which has to be short, because the space is small).

Concept Definition Canvas.

Its objective is to help the team focus on the value proposition of the product/service. A fill-the-blank form helps the user create a concise and effective value proposition that the team will use throughout the design process. The template is: “Our project (project name) allows (target users) to (users’ goal) so (outcome)”.

However, as the process is iterative, this definition could change over time, so there is a notes field that helps keep track of the changes with the corresponding motivations.
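The fill-the-blank form lends itself to a trivial template; the sentence structure comes from the canvas, while the project name and values below are hypothetical examples:

```python
# The canvas template: "Our project (project name) allows (target users)
# to (users' goal) so (outcome)".
TEMPLATE = "Our project {name} allows {target} to {goal} so {outcome}"

# Hypothetical example values for illustration only.
proposition = TEMPLATE.format(
    name="SmartChef",
    target="busy families",
    goal="plan meals with the food they already have",
    outcome="less food is wasted",
)
print(proposition)
```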

Boards.

In the first version we designed some boards and decks to organize the cards in order to visualize specific aspects of the project, like the user journey map or the system map. However, the dimension of the cards (A6) forced the boards to be really big; moreover, many experts have their own way of creating those visualizations. As a consequence, we decided to drop the printed boards and instead describe in the Quick Start Guide how to use the cards to visualize specific aspects like the user journey map, without forcing a fixed schema.

4 The Quick Start Guide

The Quick Start Guide for the IoT Design Deck was created as an answer to the need for guidance revealed during the test session in Chemnitz. We decided to design a guide to be included in the IoT Design Deck box, useful for those who want to use the tool by themselves.

The Quick Start Guide is an introduction to the IoT Design Deck methodology and objectives; it is divided into seven sections, each corresponding to a different stage of the design process. In order to introduce readers to their first use of the deck, there is also a “welcome” introduction, describing materials and information that will be valid for all future design sessions.

A relevant element introduced in the Quick Start Guide is the possibility of skipping some steps of the design process, according to needs and time constraints.

Many of the design stages have optional tasks to be performed only in longer design sessions.

Furthermore, the welcome section introduces the icons that recur throughout the guide, suggesting the use of a certain card at a specific moment. Some of the icons refer to the tool decks and represent a call to action. For example, the “sorting deck” icon appears when it is necessary to pick a specific sorting technique from the deck.

Another symbol often present throughout the guide is the camera symbol, which invites readers to take pictures of what they are doing in order to document the design process. Taking pictures can also be useful to save all the possible card configurations: designers can realize a touchpoint matrix or a customer journey without the need to replicate certain cards. Moreover, pictures can be useful for further presentations of the project.

The structure of each section in the Quick Start Guide is organized as follows, except for the welcome section.

Every stage is presented with a short description page and step-by-step indications that also specify whether a step is mandatory or optional. This allows the team to choose between a full and a fast design session.

The short description is a page that quickly illustrates:

  • What designers will do;

  • Which materials of the IoT Design Deck are needed to complete the task;

  • The average duration of the task;

  • How many participants can be involved in this activity, expressed in a range;

  • The energy effort required;

  • Whether the facilitator is necessary, optional or unnecessary;

  • The expected output.
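The short-description page maps naturally onto a small record; the field names and example values below are our own illustration, not part of the guide:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class StageSummary:
    """Sketch of a Quick Start Guide short-description page."""
    what: str                  # what designers will do
    materials: List[str]       # IoT Design Deck materials needed
    duration_minutes: int      # average duration of the task
    min_participants: int      # participant range, lower bound
    max_participants: int      # participant range, upper bound
    energy_effort: str         # e.g. "low", "medium", "high"
    facilitator: str           # "necessary", "optional" or "unnecessary"
    expected_output: str

# Hypothetical summary for a Discover-like stage.
discover = StageSummary(
    what="Explore the problem space",
    materials=["Problem Discovery Canvas", "Research tool cards"],
    duration_minutes=45,
    min_participants=2,
    max_participants=6,
    energy_effort="medium",
    facilitator="optional",
    expected_output="A filled-in Problem Discovery Canvas",
)
```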

The step-by-step description, instead, offers a deeper understanding of the actions to be performed at that stage, including useful tips about the use of certain cards (e.g. how and when it could be helpful to draw insights from the tool decks) (Fig. 2).

Fig. 2.
figure 2

Contents from the Quick Start Guide.

5 Future Works

Given its topic, the method is in constant evolution. As next steps we are planning another test campaign to evaluate the redesign. Moreover, as the method is currently general purpose, we are planning to develop domain-specific cards (e.g. smart home, smart city, e-health, etc.) in collaboration with domain experts, to make it more specific and helpful for professional use.