Time-of-flight-assisted Kinect camera-based people detection for intuitive human robot cooperation in the surgical operating room

  • Tim Beyl
  • Philip Nicolai
  • Mirko D. Comparetti
  • Jörg Raczkowsky
  • Elena De Momi
  • Heinz Wörn
Original Article

Abstract

Background

Scene supervision is a major tool for making medical robots safer and more intuitive. This paper presents an approach that uses 3D cameras efficiently within the surgical operating room to enable safe human–robot interaction and action perception. Additionally, the presented approach aims to make 3D camera-based scene supervision more reliable and accurate.

Methods

A camera system composed of multiple Kinect and time-of-flight cameras has been designed, implemented and calibrated. Methods for calibration, object detection and people tracking have been designed and evaluated.
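Calibrating such a multi-camera setup amounts to estimating the rigid transform that registers each camera's point cloud into a common frame. The paper does not give its calibration code; the following is a minimal, hypothetical sketch of the standard least-squares rigid registration step (the SVD-based Kabsch/Horn solution), assuming corresponding 3D points (e.g. calibration-marker positions) have already been matched between two cameras:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) with dst ≈ R @ src + t.

    src, dst: (N, 3) arrays of corresponding 3D points, e.g. marker
    positions observed by two depth cameras. SVD-based closed form.
    """
    src_mean = src.mean(axis=0)
    dst_mean = dst.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t
```

In practice the correspondences themselves would come from a tracked calibration object or from ICP-style nearest-neighbor matching; this closed-form step is only the inner transform estimation.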

Results

The camera system achieves a registration accuracy of 0.05 m. Human tracking is reliable and accurate, and was evaluated in an experimental setup with subjects wearing operating room clothing. Robot detection shows an error of approximately 0.04 m.

Conclusions

The robustness and accuracy of the approach allow for integration into the modern operating room. The data output can be used directly for situation and workflow detection as well as for collision avoidance.

Keywords

Digital operating room · Environment supervision · Surgical robotics · 3D vision · RGB-D cameras · ToF cameras


Acknowledgments

This research was funded by the European Commission's Seventh Framework Programme within the projects Patient Safety in Robotic Surgery (SAFROS), Grant No. 248960, and Active Constraints Technologies for Ill-defined or Volatile Environments (ACTIVE), Grant No. 270460. The authors thank the EU for its financial support. The authors also thank NVIDIA (USA) for providing two NVIDIA GeForce GTX Titan graphics adapters for the research shown in this paper.

Compliance with ethical standards

Conflict of interest

The authors declare that they have no conflicts of interest.


Copyright information

© CARS 2015

Authors and Affiliations

  • Tim Beyl (1)
  • Philip Nicolai (1)
  • Mirko D. Comparetti (2)
  • Jörg Raczkowsky (1)
  • Elena De Momi (2)
  • Heinz Wörn (1)

  1. Institute for Anthropomatics and Robotics (IAR), Intelligent Process Control and Robotics (IPR), Karlsruhe Institute of Technology, Karlsruhe, Germany
  2. NeuroEngineering and Medical Robotics Laboratory, Department of Electronics, Information and Bioengineering, Politecnico di Milano, Milan, Italy
