International Journal of Social Robotics, Volume 7, Issue 2, pp 183–202

Semi-Autonomous Domestic Service Robots: Evaluation of a User Interface for Remote Manipulation and Navigation With Focus on Effects of Stereoscopic Display

  • Marcus Mast
  • Zdeněk Materna
  • Michal Španěl
  • Florian Weisshardt
  • Georg Arbeiter
  • Michael Burmester
  • Pavel Smrž
  • Birgit Graf
Abstract

In this article, we evaluate a novel type of user interface for remotely resolving challenging situations for service robots in domestic environments. Our focus is on potential advantages of stereoscopic display. The user interface is based on a control architecture that allows involvement of a remote human operator when the robot encounters a problem. It offers semi-autonomous remote manipulation and navigation with low-cost interaction devices, incorporates global 3D environment mapping, and follows an ecological visualization approach that integrates 2D laser data, 3D depth camera data, RGB data, a robot model, constantly updated global 2D and 3D environment maps, and indicators into a single 3D scene with user-adjustable viewpoints and optional viewpoint-based control. We carried out an experiment with 28 participants in a home-like environment investigating the utility of stereoscopic display for three types of task: defining the shape of an unknown or unrecognized object to be grasped, positioning the gripper for semi-autonomous reaching and grasping, and navigating the robot around obstacles. Participants successfully completed all tasks and rated the user interface highly in both monoscopic and stereoscopic display modes. They were significantly faster under stereoscopic display in positioning the gripper. For the other two task types, there was a tendency toward faster task completion in stereoscopic mode that would need to be verified in further studies. We did not find significant differences in perceived workload between display modes for any type of task. We conclude that stereoscopic display seems to be a useful optional display mode for this type of user interface but that its utility may vary depending on the task.

Keywords

Human-robot interaction · User interfaces · Semi-autonomy · Telemanipulation · Teleoperation

Acknowledgments

This research was supported by the European Commission, FP7, project “SRS”, Grant Agreement No. 247772. We would like to thank Thiago de Freitas Oliveira Araújo, Ali Shuja Siddiqui, Markus Noack, Anne Reibke, Bianca Bannert, and Monika Heinzel-Gutenbrunner for supporting work.


Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  • Marcus Mast (1, 2)
  • Zdeněk Materna (3)
  • Michal Španěl (3)
  • Florian Weisshardt (4)
  • Georg Arbeiter (4)
  • Michael Burmester (1)
  • Pavel Smrž (3)
  • Birgit Graf (4)

  1. Stuttgart Media University, Stuttgart, Germany
  2. Linköping University, Linköping, Sweden
  3. Brno University of Technology, Brno, Czech Republic
  4. Fraunhofer Institute for Manufacturing Engineering and Automation (IPA), Stuttgart, Germany