Recognition of Grasp Points for Clothes Manipulation Under Unconstrained Conditions

  • Luz María Martínez
  • Javier Ruiz-del-Solar
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11175)

Abstract

In this work, a system for recognizing grasp points in RGB-D images is proposed. The system is intended to be used by a domestic robot when unfolding clothes lying at a random position on a table. Taking into consideration that grasp points are usually located near key parts of clothing, such as the waist of a pair of pants or the neck of a shirt, the proposed system first attempts to detect these key parts using a local multivariate contour that adapts its shape accordingly. The system then applies the Vessel Enhancement filter to identify wrinkles in the clothes, which allows a roughness index to be computed for the garment. Finally, by combining (i) the key-part contours and (ii) the roughness information obtained with the vessel filter, the system recognizes grasp points for unfolding a piece of clothing. The recognition system is validated using realistic RGB-D images of different cloth types.
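To make the wrinkle-analysis step concrete, the following is a minimal sketch (not the authors' implementation) of applying a Frangi-style vessel enhancement filter to a depth image and summarizing the response as a roughness index. It assumes scikit-image's frangi filter as a stand-in for the Vessel Enhancement filter, and the function name wrinkle_roughness, the garment_mask input, and the roughness definition (mean ridge response over garment pixels) are illustrative assumptions rather than the paper's exact formulation.

import numpy as np
from skimage.filters import frangi

def wrinkle_roughness(depth, garment_mask, sigmas=(1, 2, 3, 4, 5)):
    """Sketch: enhance wrinkle-like ridges in a depth image and reduce them
    to a single roughness score. `depth` is a 2-D float array and
    `garment_mask` a boolean array marking garment pixels (assumed inputs)."""
    # Normalize the depth map so the filter response is comparable across images.
    d = depth.astype(np.float64)
    d = (d - d.min()) / (d.max() - d.min() + 1e-9)
    # Frangi vessel enhancement responds to elongated ridge-like structures,
    # which in a depth map correspond to wrinkles. Whether wrinkles appear as
    # bright or dark ridges depends on the depth convention, hence black_ridges.
    ridges = frangi(d, sigmas=sigmas, black_ridges=False)
    # Illustrative roughness index: mean ridge response over the garment pixels.
    roughness = float(ridges[garment_mask].mean())
    return ridges, roughness

A higher roughness value then indicates a more wrinkled garment; in the proposed pipeline this information is combined with the detected key-part contours to select grasp points.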

Keywords

Clothing recognition · Depth image · Grasp points · Wrinkle analysis

Acknowledgments

This work was funded by CONICYT-PCHA/Doctorado Nacional/2014-21140280 and FONDECYT Project 1161500.

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Advanced Mining Technology Center and Department of Electrical Engineering, Universidad de Chile, Santiago, Chile
