International Journal of Social Robotics, Volume 3, Issue 3, pp 223–231

How Humans Optimize Their Interaction with the Environment: The Impact of Action Context on Human Perception

  • Agnieszka Wykowska
  • Alexis Maldonado
  • Michael Beetz
  • Anna Schubö

Abstract

This paper reports empirical findings on human performance in an experiment that combined a perceptual task with a motor task. Such findings should be considered in the design of robots: drawing inspiration from natural solutions should not only benefit artificial systems but also make human-robot interaction more efficient and safe. Humans have developed various mechanisms to optimize the way actions are performed and the effects they induce. Optimizing action planning (e.g., grasping, reaching, or lifting objects) requires efficient selection of action-relevant features, and this selection may also depend on the environmental context in which an action takes place. The present study investigated how action context influences perceptual processing during action planning. The experimental paradigm comprised two independent tasks: (1) a perceptual visual search task and (2) a grasping or a pointing movement. Reaction times in the visual search task were measured as a function of movement type (grasping vs. pointing) and context complexity (context varying along one dimension vs. context varying along two dimensions). Reaction times varied with action context, which suggests a close bidirectional link between action and perception as well as an impact of the environmental action context on perceptual selection during action planning. These findings are discussed with respect to applications in robotics and the design of user interfaces.
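
The 2 × 2 within-subject design described above (movement type crossed with context complexity) lends itself to a simple per-condition summary of reaction times. The sketch below, in Python, shows one way such data could be organized and averaged; it is not the authors' analysis code, and the condition labels, variable names, and reaction-time values are hypothetical.

    # Minimal sketch: summarizing reaction times (RTs) for the 2 x 2 design
    # movement type (grasping vs. pointing) x context complexity (one vs. two
    # varying dimensions). All values below are made-up placeholders.
    from statistics import mean
    from itertools import product

    MOVEMENTS = ("grasping", "pointing")
    CONTEXTS = ("one_dimension", "two_dimensions")

    # Hypothetical per-trial RTs in milliseconds, keyed by condition.
    trials = {
        ("grasping", "one_dimension"):  [512, 498, 530],
        ("grasping", "two_dimensions"): [555, 541, 560],
        ("pointing", "one_dimension"):  [505, 520, 515],
        ("pointing", "two_dimensions"): [522, 534, 528],
    }

    # Mean RT per cell of the design.
    cell_means = {cond: mean(rts) for cond, rts in trials.items()}

    # Collapse over the other factor to look at each factor separately.
    movement_means = {m: mean(cell_means[(m, c)] for c in CONTEXTS) for m in MOVEMENTS}
    context_means = {c: mean(cell_means[(m, c)] for m in MOVEMENTS) for c in CONTEXTS}

    for m, c in product(MOVEMENTS, CONTEXTS):
        print(f"{m:9s} | {c:14s} | mean RT = {cell_means[(m, c)]:.1f} ms")
    print("movement type means:", movement_means)
    print("context complexity means:", context_means)

A full analysis of such data would additionally require inferential statistics (e.g., a repeated-measures ANOVA over the two factors), which is beyond this sketch.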

Keywords

Action context · Visual perception · Action-perception links

Copyright information

© Springer Science & Business Media BV 2010

Authors and Affiliations

  • Agnieszka Wykowska (1)
  • Alexis Maldonado (2)
  • Michael Beetz (2)
  • Anna Schubö (1)

  1. Department of Experimental Psychology, Ludwig Maximilians Universität, München, Germany
  2. Computer Science Department, Chair IX, Technische Universität, München, Germany
