
Dynamic Obstacle Avoidance for Application of Human-Robot Cooperative Dispensing Medicines

Abstract

For safety reasons, robots and humans cooperate to accomplish the task of drug sorting and distribution in the automated medicine-dispensing process. In such a dynamic unstructured environment, i.e., a human-robot collaboration scenario, the safety of the human, the robot, and the surrounding equipment is paramount. In this work, a practical and effective robot motion planning method is proposed for dynamic unstructured environments. To address the blind zones of a single depth sensor and the problem of dynamic obstacle avoidance, we first propose a method that establishes an offline mapping between multi-sensor depth images and the 3D grids of the robot workspace and fuses them online. The fused grid is used to determine the occupancy states of the cells occluded by the robot and obstacles and to estimate the minimum distance between the robot and obstacles in real time. Then, based on the reactive control method, attractive and repulsive forces are calculated and transformed into robot joint velocities so that obstacles are avoided in real time. Finally, the robot's dynamic obstacle avoidance ability is evaluated on an experimental platform with a UR5 robot and two Kinect V2 RGB-D sensors, and the effectiveness of the proposed method is verified.
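
The method described above has two computational stages: an occupancy-fusion stage that yields the minimum robot-obstacle distance, and a reactive stage that turns attractive and repulsive terms into joint velocities. The paper's exact formulas are not reproduced on this page, so the following Python sketch is only an illustration under stated assumptions: per-sensor occupancy grids are integer arrays with 0 = free, 1 = occupied, and -1 = occluded; the robot surface and obstacles are given as sampled 3D points; and a damped Jacobian pseudoinverse (a standard choice, not necessarily the authors') maps the Cartesian command into joint space. All function names and parameter values are hypothetical.

import numpy as np

def fuse_grids(grids):
    # Fuse per-sensor occupancy grids (0 = free, 1 = occupied, -1 = occluded).
    # "Occupied" from any sensor overrides "free", so fusion errs on the safe
    # side; a cell occluded for every sensor stays unknown (-1).
    stacked = np.stack(grids)
    fused = np.full(stacked.shape[1:], -1, dtype=int)
    fused[(stacked == 0).any(axis=0)] = 0
    fused[(stacked == 1).any(axis=0)] = 1
    return fused

def min_distance(robot_pts, obstacle_pts):
    # Brute-force minimum distance between sampled robot surface points and
    # the centers of occupied grid cells; both arguments are (N, 3) arrays.
    d = np.linalg.norm(robot_pts[:, None, :] - obstacle_pts[None, :, :], axis=2)
    i, j = np.unravel_index(np.argmin(d), d.shape)
    return d[i, j], robot_pts[i], obstacle_pts[j]

def repulsive_velocity(p_robot, p_obs, dist, d_safe=0.5, v_max=0.3):
    # Potential-field-style repulsion: zero outside the safety margin d_safe,
    # growing linearly to v_max as the obstacle approaches the robot.
    if dist >= d_safe:
        return np.zeros(3)
    direction = (p_robot - p_obs) / max(dist, 1e-6)
    return v_max * (d_safe - dist) / d_safe * direction

def attractive_velocity(p_ee, p_goal, gain=1.0, v_max=0.25):
    # Proportional attraction of the end effector toward the goal, saturated
    # so the commanded speed never exceeds v_max.
    v = gain * (p_goal - p_ee)
    speed = np.linalg.norm(v)
    return v if speed <= v_max else v * (v_max / speed)

def joint_velocities(J, v_cmd, qd_max=1.0, damping=0.01):
    # Map the combined Cartesian command to joint velocities with a damped
    # Jacobian pseudoinverse, then clamp each joint for safety.
    JJt = J @ J.T
    qd = J.T @ np.linalg.solve(JJt + damping * np.eye(JJt.shape[0]), v_cmd)
    return np.clip(qd, -qd_max, qd_max)

# One illustrative control cycle (J would come from the robot kinematics):
g1, g2 = np.random.randint(-1, 2, size=(2, 8, 8, 8))  # two sensors' grids
fused = fuse_grids([g1, g2])          # cells left at -1 are occluded for both
J = np.random.rand(3, 6)              # 3x6 positional Jacobian of a 6-DOF arm
robot_pts = np.random.rand(50, 3)     # sampled points on the robot surface
obstacle_pts = np.random.rand(200, 3) # centers of occupied grid cells
dist, p_r, p_o = min_distance(robot_pts, obstacle_pts)
p_ee, p_goal = np.array([0.3, 0.1, 0.6]), np.array([0.4, 0.2, 0.5])
v_cmd = attractive_velocity(p_ee, p_goal) + repulsive_velocity(p_r, p_o, dist)
qd = joint_velocities(J, v_cmd)       # send qd to the velocity controller

In the paper, repulsive commands are presumably applied at control points along the whole arm and the distance query runs over the fused grid every control cycle; the sketch collapses these details into a single step at the closest point. Treating cells that remain occluded for all sensors as potentially occupied is one conservative way to handle the blind-zone problem the abstract mentions.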



Author information

Correspondence to Wei Tao or Wenzheng Chi.

Additional information

Foundation item: the Interdisciplinary Program of Shanghai Jiao Tong University (No. YG2019QNA25)


About this article


Cite this article

Wang, Z., Xu, H., Lü, N. et al. Dynamic Obstacle Avoidance for Application of Human-Robot Cooperative Dispensing Medicines. J. Shanghai Jiaotong Univ. (Sci.) (2021). https://doi.org/10.1007/s12204-021-2366-5


Key words

  • automated medicine dispensing
  • dynamic unstructured environment
  • human-robot collaboration
  • dynamic obstacle avoidance
  • multi-sensor depth images
  • 3D grids
  • reactive control method

CLC number

  • TP 242.3
  • TP 391

Document code

  • A