Human-Robot Collaborative Remote Object Search

  • Jun Miura
  • Shin Kadekawa
  • Kota Chikaarashi
  • Junichi Sugiyama
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 302)

Abstract

Object search is a typical task for remotely controlled service robots. Although object recognition technologies are well developed, an efficient search strategy (or viewpoint planning method) remains an open issue. This paper describes a new approach to human-robot collaborative remote object search. An analogy for our approach is riding on someone's shoulders: a user controls a fish-eye camera on a remote robot to change views and search for a target object, independently of the robot's own motion. Combined with a certain level of automatic search capability on the robot's side, this collaboration can realize an efficient search for the target object. We developed an experimental system to demonstrate the feasibility of the approach.
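The abstract describes the division of labor but not the planner itself. As a rough illustration of how such a collaboration could be wired together, the following minimal Python sketch (all names, the grid, and the sensor model are hypothetical, not taken from the paper) runs a greedy next-best-view search over a belief grid and lets a simulated user sighting, standing in for the human scanning the fish-eye view, preempt the robot's own viewpoint choice:

```python
import random

GRID = 8  # the search area, discretized into GRID x GRID cells


def make_belief():
    """Uniform prior over the target's location."""
    p = 1.0 / (GRID * GRID)
    return {(x, y): p for x in range(GRID) for y in range(GRID)}


def visible_cells(viewpoint, radius=2):
    """Cells the robot's sensor covers from a viewpoint (simple disk model)."""
    vx, vy = viewpoint
    return {(x, y) for x in range(GRID) for y in range(GRID)
            if (x - vx) ** 2 + (y - vy) ** 2 <= radius ** 2}


def next_best_view(belief):
    """Greedy planner: pick the viewpoint whose field of view covers the
    largest remaining probability mass."""
    candidates = [(x, y) for x in range(GRID) for y in range(GRID)]
    return max(candidates,
               key=lambda v: sum(belief[c] for c in visible_cells(v)))


def observe_miss(belief, cells):
    """Bayes update after a miss: zero out the observed cells, renormalize."""
    for c in cells:
        belief[c] = 0.0
    total = sum(belief.values())
    for c in belief:
        belief[c] /= total


def user_sighting(target):
    """Stand-in for the human watching the fish-eye view: occasionally
    reports the cell the user believes contains the target."""
    return target if random.random() < 0.2 else None


def search(target=(5, 6), max_steps=20):
    belief = make_belief()
    for step in range(max_steps):
        hint = user_sighting(target)           # human acts independently
        view = hint if hint is not None else next_best_view(belief)
        cells = visible_cells(view)
        if target in cells:
            print(f"step {step}: target found from viewpoint {view}")
            return
        observe_miss(belief, cells)
    print("target not found within the step budget")


if __name__ == "__main__":
    random.seed(0)
    search()
```

The point of the sketch is the control flow, not the models: the human's view control and the robot's planner operate in parallel, and a human sighting simply overrides the robot's next viewpoint, which is one simple way the "riding on shoulders" collaboration could be realized.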

Keywords

Human-robot collaboration · Object search · Observation planning


Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Jun Miura¹
  • Shin Kadekawa¹
  • Kota Chikaarashi¹
  • Junichi Sugiyama¹

  1. Department of Computer Science and Engineering, Toyohashi University of Technology, Toyohashi, Japan
