How Humans Optimize Their Interaction with the Environment: The Impact of Action Context on Human Perception
This paper reports empirical findings on human performance in an experiment combining a perceptual task with a motor task. Such findings are relevant to robot design: drawing inspiration from natural solutions should not only benefit artificial systems but also make human-robot interaction more efficient and safe. Humans have developed various mechanisms to optimize how actions are performed and the effects they induce. Optimizing action planning (e.g., grasping, reaching, or lifting objects) requires efficient selection of action-relevant features, and this selection may also depend on the environmental context in which an action takes place. The present study investigated how action context influences perceptual processing during action planning. The experimental paradigm comprised two independent tasks: (1) a perceptual visual search task and (2) a grasping or pointing movement. Reaction times in the visual search task were measured as a function of movement type (grasping vs. pointing) and context complexity (context varying along one dimension vs. along two dimensions). Results showed that action context influenced reaction times, suggesting a close bidirectional link between action and perception as well as an impact of environmental action context on perceptual selection in the course of action planning. These findings are discussed with respect to applications in robotics and the design of user interfaces.
Keywords: Action context · Visual perception · Action-perception links
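The 2 × 2 factorial design described in the abstract (movement type × context complexity, with reaction time as the dependent measure) can be sketched in code. The reaction-time values below are purely illustrative placeholders, not data from the study; the function names are likewise hypothetical:

```python
from statistics import mean

# Hypothetical reaction-time samples (ms) for the 2 x 2 design:
# movement type (grasping vs. pointing) x context complexity (1-D vs. 2-D).
# These numbers are invented for illustration only.
rts = {
    ("grasping", "1D"): [412, 398, 425, 407],
    ("grasping", "2D"): [451, 463, 440, 458],
    ("pointing", "1D"): [405, 391, 418, 399],
    ("pointing", "2D"): [421, 433, 415, 428],
}

def condition_means(data):
    """Mean RT per (movement, context) cell of the factorial design."""
    return {cond: mean(samples) for cond, samples in data.items()}

def main_effect(data, factor_index, level):
    """Mean RT pooled over all cells sharing one factor level,
    e.g. all 2-D-context trials regardless of movement type."""
    pooled = [rt for cond, samples in data.items()
              if cond[factor_index] == level for rt in samples]
    return mean(pooled)

# Size of the context-complexity effect on RT (2-D minus 1-D), in ms:
context_effect = main_effect(rts, 1, "2D") - main_effect(rts, 1, "1D")
print(condition_means(rts))
print(context_effect)
```

In an actual analysis the cell means would feed a two-way ANOVA to test the movement-type and context-complexity main effects and their interaction; the sketch above only computes the descriptive statistics.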