
Design and Development of Multimodal Applications: A Vision on Key Issues and Methods

  • Samuel Silva
  • Nuno Almeida
  • Carlos Pereira
  • Ana Isabel Martins
  • Ana Filipa Rosa
  • Miguel Oliveira e Silva
  • António Teixeira
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9175)

Abstract

Multimodal user interfaces provide users with different ways of interacting with applications. This has two main advantages: it adds robustness in environments where a single modality might yield ambiguous input or output (e.g., speech in noisy environments), and it offers users with some kind of limitation (e.g., hearing difficulties resulting from ageing) alternative and more natural ways of interacting. Designing and developing applications that support multimodal interaction involves numerous challenges, particularly when the goals include targeting a wide variety of scenarios, designing complex interactions and, at the same time, proposing and evolving interaction modalities. These goals require choosing an architecture, defining development and evaluation methodologies, and adopting principles that foster continuous improvement of the interaction modalities without disrupting existing applications. Based on previous and ongoing work by our team, we present our approach to the design, development and evaluation of multimodal applications covering several devices and application scenarios.
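As an illustration of the decoupling the abstract argues for, the sketch below shows a toy interaction manager that routes events from different modalities to the same application handlers, so a new modality can be registered without changing application code. This is a minimal sketch only, loosely inspired by lifecycle-event architectures such as the W3C MMI recommendation; the `LifecycleEvent` and `InteractionManager` names and the event fields are hypothetical simplifications and are not the implementation described in the paper.

```python
# Minimal sketch (not the authors' implementation): an interaction manager
# that decouples an application from its input modalities. Class and event
# names are hypothetical and used for illustration only.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class LifecycleEvent:
    """A simplified event exchanged between a modality and the manager."""
    source: str   # modality that produced the event (e.g., "speech", "touch")
    name: str     # event type (e.g., "extensionNotification")
    data: dict    # semantic payload (e.g., the recognized intent)


class InteractionManager:
    """Routes events from registered modalities to application handlers.

    New modalities can be added without changing application code, which is
    the kind of non-disruptive evolution the abstract refers to.
    """

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[LifecycleEvent], None]]] = {}

    def register_handler(self, intent: str,
                         handler: Callable[[LifecycleEvent], None]) -> None:
        self._handlers.setdefault(intent, []).append(handler)

    def dispatch(self, event: LifecycleEvent) -> None:
        # A full system would perform fusion/disambiguation here; this sketch
        # simply routes by the intent carried in the payload.
        intent = event.data.get("intent", "")
        for handler in self._handlers.get(intent, []):
            handler(event)


if __name__ == "__main__":
    im = InteractionManager()
    im.register_handler("show_medication",
                        lambda e: print(f"[{e.source}] show medication list"))

    # The same application handler serves events from different modalities.
    im.dispatch(LifecycleEvent("speech", "extensionNotification",
                               {"intent": "show_medication"}))
    im.dispatch(LifecycleEvent("touch", "extensionNotification",
                               {"intent": "show_medication"}))
```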

Keywords

Multimodal interaction · Design and development · Evaluation

Notes

Acknowledgments

Research partially funded by IEETA Research Unit funding (PEst-OE/EEI/UI0127/2014), project Cloud Thinking (funded by the QREN Mais Centro program, ref. CENTRO-07-ST24-FEDER-002031), Marie Curie Actions IRIS (ref. 610986, FP7-PEOPLE-2013-IAPP), project Smart Phones for Seniors (S4S), a QREN project (QREN 21541), co-funded by COMPETE and FEDER, project PaeLife (AAL-08-1-2001-0001) and project AAL4ALL (AAL/0015/2009).


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Samuel Silva 1,2
  • Nuno Almeida 1,2
  • Carlos Pereira 1,2
  • Ana Isabel Martins 1,2
  • Ana Filipa Rosa 1,2
  • Miguel Oliveira e Silva 1,2
  • António Teixeira 1,2

  1. IEETA – Institute of Electronics and Informatics Engineering, University of Aveiro, Aveiro, Portugal
  2. DETI – Department of Electronics, Telecommunications and Informatics Engineering, University of Aveiro, Aveiro, Portugal
