Autonomous Robots, Volume 37, Issue 3, pp 301-316

Semantic grasping: planning task-specific stable robotic grasps

  • Hao Dang, Computer Science Department, Columbia University
  • Peter K. Allen, Computer Science Department, Columbia University



We present an example-based planning framework to generate semantic grasps, stable grasps that are functionally suitable for specific object manipulation tasks. We propose to use partial object geometry, tactile contacts, and hand kinematic data as proxies to encode task-related constraints, which we call semantic constraints. We introduce a semantic affordance map, which relates local geometry to a set of predefined semantic grasps that are appropriate to different tasks. Using this map, the pose of a robot hand with respect to the object can be estimated so that the hand is adjusted to achieve the ideal approach direction required by a particular task. A grasp planner is then used to search along this approach direction and generate a set of final grasps which have appropriate stability, tactile contacts, and hand kinematics. We show experiments planning semantic grasps on everyday objects and applying these grasps with a physical robot.
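The pipeline the abstract describes — look up the task's ideal approach direction in a semantic affordance map, then search candidate grasps along that direction — can be illustrated with a toy sketch. Everything below is hypothetical: the map entries, the `plan_semantic_grasp` function, and the cosine-similarity scoring are illustrative assumptions, not the paper's actual representation or planner.

```python
import math

# Hypothetical toy "semantic affordance map": relates an (object, task) pair
# to the ideal hand approach direction for that task. In the paper this map
# is built from partial object geometry, tactile contacts, and hand
# kinematics; here it is hard-coded purely for illustration.
SEMANTIC_AFFORDANCE_MAP = {
    ("mug", "pour"):  {"approach": (0.0, 0.0, -1.0)},  # top-down grasp
    ("mug", "drink"): {"approach": (1.0, 0.0, 0.0)},   # side grasp, handle free
}

def plan_semantic_grasp(object_label, task, candidate_grasps):
    """Pick the candidate grasp whose approach direction best matches the
    task's ideal approach (cosine similarity) -- a stand-in for searching
    along the approach direction required by the task."""
    ideal = SEMANTIC_AFFORDANCE_MAP[(object_label, task)]["approach"]

    def score(grasp):
        gx, gy, gz = grasp["approach"]
        ix, iy, iz = ideal
        dot = gx * ix + gy * iy + gz * iz
        norms = math.hypot(gx, gy, gz) * math.hypot(ix, iy, iz)
        return dot / norms

    return max(candidate_grasps, key=score)

candidates = [
    {"name": "top-down", "approach": (0.0, 0.0, -1.0)},
    {"name": "side",     "approach": (1.0, 0.0, 0.0)},
]
best = plan_semantic_grasp("mug", "pour", candidates)  # selects "top-down"
```

In the actual system the candidate set would come from a stability-aware grasp planner, and the selected grasps would additionally be checked for appropriate tactile contacts and hand kinematics; this sketch only captures the approach-direction matching step.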


Keywords: Grasp planning · Task-specific grasping · Semantic grasping