Welcome to the Future – How Naïve Users Intuitively Address an Intelligent Robotics Apartment

  • Jasmin Bernotat
  • Birte Schiffhauer
  • Friederike Eyssel
  • Patrick Holthaus
  • Christian Leichsenring
  • Viktor Richter
  • Marian Pohling
  • Birte Carlmeyer
  • Norman Köster
  • Sebastian Meyer zu Borgsen
  • René Zorn
  • Kai Frederic Engelmann
  • Florian Lier
  • Simon Schulz
  • Rebecca Bröhl
  • Elena Seibel
  • Paul Hellwig
  • Philipp Cimiano
  • Franz Kummert
  • David Schlangen
  • Petra Wagner
  • Thomas Hermann
  • Sven Wachsmuth
  • Britta Wrede
  • Sebastian Wrede
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9979)

Abstract

The purpose of this Wizard-of-Oz study was to explore the intuitive verbal and non-verbal goal-directed behavior of naïve participants in an intelligent robotics apartment. Participants had to complete seven mundane tasks; for instance, they were asked to turn on the light. They were explicitly instructed to consider nonstandard ways of completing the respective tasks. A multi-method approach revealed that most participants favored speech and interfaces such as switches and screens to communicate with the intelligent robotics apartment. However, they required instructions before they perceived these interfaces as competent targets for human-machine interaction. This study thus takes first important steps toward designing an intelligent robotics apartment in a user-centered and user-friendly manner.

Keywords

Social robot · Smart home · Human-robot interaction · Use-case scenario · Usability · Intuitive design · User-centered design


Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

All authors: Cluster of Excellence Cognitive Interaction Technology (CITEC), Bielefeld University, Bielefeld, Germany
