Abstract
What makes a social humanoid robot behave like a human? It needs to understand and show emotions, and it needs a chatbot, a memory, and a decision-making process. Beyond that, it must recognize objects and be able to grasp them in a human way. To become an intimate companion, a social robot needs to behave like a real human in all areas and understand real situations so that it can react properly. In this chapter, we describe our ongoing research on social robotics: the making of the articulated hands of the Nadine robot, the recognition of objects and their signification, and how to grasp them in a human way. We present the state of the art together with some early results.
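As a concrete illustration of the kind of grasp localization the chapter discusses, the sketch below implements the oriented grasp-rectangle representation widely used in vision-based grasp detection: a rectangle in the image encodes the grasp center, the gripper orientation, and the gripper opening width. This is a minimal, self-contained sketch; the GraspRectangle class and the corner coordinates are hypothetical and are not taken from the Nadine system itself.

```python
import math
from dataclasses import dataclass


@dataclass
class GraspRectangle:
    """Oriented rectangle (x, y, theta, w, h) describing where and how to grasp.

    (x, y): grasp center in image coordinates
    theta:  gripper orientation in radians
    w:      gripper opening width (distance between the plates)
    h:      size of the gripper plates
    """
    x: float
    y: float
    theta: float
    w: float
    h: float

    @classmethod
    def from_corners(cls, corners):
        """Build a grasp rectangle from four corner points given in order."""
        (x0, y0), (x1, y1), (x2, y2), (x3, y3) = corners
        cx = (x0 + x1 + x2 + x3) / 4.0          # center = mean of corners
        cy = (y0 + y1 + y2 + y3) / 4.0
        theta = math.atan2(y1 - y0, x1 - x0)    # orientation of the first edge
        w = math.hypot(x1 - x0, y1 - y0)        # gripper opening
        h = math.hypot(x2 - x1, y2 - y1)        # plate size
        return cls(cx, cy, theta, w, h)


# Example: a rectangle predicted around an object (made-up coordinates).
grasp = GraspRectangle.from_corners([(100, 50), (140, 50), (140, 70), (100, 70)])
print(grasp)  # GraspRectangle(x=120.0, y=60.0, theta=0.0, w=40.0, h=20.0)
```

In a full pipeline, a detector would propose such rectangles from RGB-D input and the robot controller would map the best-scoring one to a hand pose; only the representation itself is shown here.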
Acknowledgements
This research is supported by the BeingTogether Centre, a collaboration between Nanyang Technological University (NTU) Singapore and the University of North Carolina (UNC) at Chapel Hill. The BeingTogether Centre is supported by the National Research Foundation, Prime Minister's Office, Singapore under its International Research Centres in Singapore Funding Initiative.
Copyright information
© 2017 Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Thalmann, N.M., Tian, L., Yao, F. (2017). Nadine: A Social Robot that Can Localize Objects and Grasp Them in a Human Way. In: Prabaharan, S., Thalmann, N., Kanchana Bhaaskaran, V. (eds) Frontiers in Electronic Technologies. Lecture Notes in Electrical Engineering, vol 433. Springer, Singapore. https://doi.org/10.1007/978-981-10-4235-5_1
DOI: https://doi.org/10.1007/978-981-10-4235-5_1
Publisher Name: Springer, Singapore
Print ISBN: 978-981-10-4234-8
Online ISBN: 978-981-10-4235-5