Springer Handbook of Robotics

pp 2095-2114

Perceptual Robotics

  • Heinrich Bülthoff, Human Perception, Cognition and Action, Max-Planck-Institute for Biological Cybernetics
  • Christian Wallraven, Department of Brain and Cognitive Engineering, Cognitive Systems Lab, Korea University
  • Martin A. Giese, Department for Cognitive Neurology, University Clinic Tübingen




Robots that share their environment with humans need to be able to recognize objects and users, manipulate objects, perform complex navigation tasks, and interpret and react to human emotional and communicative gestures. In all of these perceptual capabilities, however, the human brain is still far ahead of robotic systems. Hence, taking cues from the way the human brain solves such complex perceptual tasks will help to design better robots. Similarly, once a robot interacts with humans, its behaviors and reactions will be judged by humans: its movements, for example, should be fluid and graceful, and it should not evoke an eerie feeling in its users. In this chapter, we present perceptual robotics as the field of robotics that takes inspiration from perception research and neuroscience to, first, build better perceptual capabilities into robotic systems and, second, validate the perceptual impact of robotic systems on the user.