Autonomous Robots, Volume 37, Issue 3, pp 301–316

Semantic grasping: planning task-specific stable robotic grasps


DOI: 10.1007/s10514-014-9391-2

Cite this article as:
Dang, H. & Allen, P.K. Auton Robot (2014) 37: 301. doi:10.1007/s10514-014-9391-2


Abstract

We present an example-based planning framework to generate semantic grasps: stable grasps that are functionally suitable for specific object manipulation tasks. We propose to use partial object geometry, tactile contacts, and hand kinematic data as proxies to encode task-related constraints, which we call semantic constraints. We introduce a semantic affordance map, which relates local geometry to a set of predefined semantic grasps appropriate to different tasks. Using this map, the pose of a robot hand with respect to the object can be estimated so that the hand is adjusted to achieve the ideal approach direction required by a particular task. A grasp planner then searches along this approach direction and generates a set of final grasps with appropriate stability, tactile contacts, and hand kinematics. We present experiments in which we plan semantic grasps on everyday objects and execute these grasps with a physical robot.


Keywords: Grasp planning · Task-specific grasping · Semantic grasping
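The pipeline summarized in the abstract (semantic affordance map → task-specific example grasp → candidate scoring by approach direction, tactile contacts, and hand kinematics) can be illustrated with a minimal sketch. All names, data structures, and the similarity scoring below are illustrative assumptions, not the authors' implementation:

```python
from dataclasses import dataclass

# Hypothetical, simplified sketch of the semantic-grasp pipeline described
# in the abstract. Names and scoring are illustrative assumptions.

@dataclass
class SemanticGrasp:
    task: str
    approach_dir: tuple   # ideal hand approach direction for the task
    tactile: tuple        # expected tactile contact pattern
    joints: tuple         # expected hand joint configuration

# A toy "semantic affordance map": local geometry descriptor -> predefined
# semantic grasps keyed by task.
AFFORDANCE_MAP = {
    "mug_handle": [SemanticGrasp("pour", (1.0, 0.0, 0.0), (1, 1, 0), (0.3, 0.6))],
    "mug_rim":    [SemanticGrasp("handoff", (0.0, 0.0, -1.0), (1, 1, 1), (0.5, 0.5))],
}

def similarity(a, b):
    """Negative squared distance as a crude similarity score."""
    return -sum((x - y) ** 2 for x, y in zip(a, b))

def plan_semantic_grasp(local_geometry, task, candidates):
    """Return the candidate grasp that best matches the semantic
    constraints of the predefined example grasp for the requested task."""
    examples = [g for g in AFFORDANCE_MAP.get(local_geometry, [])
                if g.task == task]
    if not examples:
        return None
    example = examples[0]

    def score(c):
        # Agreement with the example's approach direction, tactile
        # contacts, and hand kinematics, as the abstract describes.
        return (similarity(c["approach_dir"], example.approach_dir)
                + similarity(c["tactile"], example.tactile)
                + similarity(c["joints"], example.joints))

    return max(candidates, key=score)

candidates = [
    {"approach_dir": (0.9, 0.1, 0.0), "tactile": (1, 1, 0), "joints": (0.3, 0.5)},
    {"approach_dir": (0.0, 0.0, -1.0), "tactile": (0, 1, 1), "joints": (0.9, 0.1)},
]
best = plan_semantic_grasp("mug_handle", "pour", candidates)
```

In the sketch, the first candidate wins for the "pour" task because its approach direction, contacts, and joint values all lie close to the stored example; a real planner would instead sample grasps along the recovered approach direction and verify stability.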

Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. Computer Science Department, Columbia University, New York, USA
