Development of angle information system to facilitate the adjustment of needle-holding posture

  • Yang Cao
  • Li Liu
  • Satoshi Miura
  • Masaki Seki
  • Yo Kobayashi
  • Kazuya Kawamura
  • Masakatsu G. Fujie
Original Article



Purpose

Our purpose is to develop a system, based on image processing methods, that informs users of the angular relationship between the needle and the forceps. The user can then adjust their needle-grasping posture according to this angle information, leading to an improvement in suturing accuracy.


Methods

The system prototype consists of a camera and an image-processing computer. Each image captured by the camera is input to the computer, which calculates the angular relationship between the forceps and the needle via image processing. The system then informs the user of this calculated angular relationship in real time. To evaluate whether the system improves suturing accuracy, we recruited 12 participants for an experiment based on a suturing task.
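The core of the pipeline described above is computing the angle between two line segments fitted to the needle and the forceps shaft in each frame. The following is a minimal sketch of that final step only, not the authors' implementation: the segmentation and line-fitting stages (e.g. colour thresholding followed by a Hough line fit) are assumed to have already produced the two segments' endpoints, and the function name `line_angle_deg` is hypothetical.

```python
import math

def line_angle_deg(p1, p2, q1, q2):
    """Angle in degrees between two undirected line segments.

    p1, p2: endpoints (x, y) of the line fitted to the needle.
    q1, q2: endpoints of the line fitted to the forceps shaft.
    Returns a value in [0, 90]; 90 means the needle is held
    perpendicular to the forceps, the posture targeted before insertion.
    """
    # Direction vectors of the two segments.
    v = (p2[0] - p1[0], p2[1] - p1[1])
    w = (q2[0] - q1[0], q2[1] - q1[1])
    dot = v[0] * w[0] + v[1] * w[1]
    nv = math.hypot(v[0], v[1])
    nw = math.hypot(w[0], w[1])
    # abs(): the segments are undirected, so fold the angle into [0, 90].
    cos_t = abs(dot) / (nv * nw)
    return math.degrees(math.acos(min(1.0, cos_t)))

# Forceps along the x-axis, needle along the y-axis -> 90 degrees.
print(round(line_angle_deg((0, 0), (0, 5), (0, 0), (5, 0)), 1))  # → 90.0
```

In a real-time loop, this value would be recomputed per frame and displayed to the user so the grasping posture can be corrected before insertion.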


Results

The experimental results show that the system allows participants to easily adjust the angular relationship between the needle and the forceps, and that this adjusted angular relationship leads to higher suturing accuracy.


Conclusion

Adjusting the needle to a right angle before insertion has a critical effect on suturing quality. We therefore developed a system that informs the user of the angular relationship between the needle and the forceps. The results of the evaluation show that the system significantly improves participants' suturing accuracy by informing them of this angle.


Keywords

Laparoscopic training · Needle holding · Image processing · Posture informing



This work was supported in part by a research grant from the JSPS Global COE Program: Global Robot Academia, JSPS Grant-in-Aid for Scientific Research 25220005, the Yazaki Memorial Foundation for Science and Technology, and the Program for Leading Graduate Schools, "Graduate Program for Embodiment Informatics".

Compliance with ethical standards

Conflict of interest

The authors declare no conflicts of interest in preparing this article.

Ethical standard

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Supplementary material

Supplementary material 1 (mpg 2158 KB)

Supplementary material 2 (mpg 1766 KB)



Copyright information

© CARS 2017

Authors and Affiliations

  1. Graduate School of Creative Science and Engineering, Waseda University, Tokyo, Japan
  2. Graduate School of Advanced Science and Engineering, Waseda University, Tokyo, Japan
  3. Faculty of Science and Engineering, Waseda University, Tokyo, Japan
  4. Graduate School of Engineering Science, Osaka University, Osaka, Japan
  5. Center for Frontier Medical Engineering, Chiba University, Chiba, Japan
