
Automatic data-driven real-time segmentation and recognition of surgical workflow

  • Original Article
International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

With the intention of extending the perception and action of surgical staff inside the operating room, the medical community has expressed growing interest in context-aware systems. Such systems require accurate identification of the surgical workflow and draw on data from a diverse set of available sensors. In this paper, we propose a fully data-driven, real-time method for segmentation and recognition of surgical phases that combines video data with instrument usage signals and exploits no prior knowledge. We also introduce new validation metrics for the assessment of workflow detection.

Methods

The segmentation and recognition are based on a four-stage process. First, during training, a Surgical Process Model is automatically constructed from data annotations to guide the subsequent stages. Second, data samples are described using a combination of low-level visual cues and instrument information. In the third stage, these descriptions are used to train a set of AdaBoost classifiers, each capable of distinguishing one surgical phase from the others. Finally, the AdaBoost responses are fed to a Hidden semi-Markov Model to obtain the final decision.
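The last two stages can be illustrated with a small sketch. The snippet below is not the paper's implementation: it substitutes random numbers for the AdaBoost scores of stage 3, and a plain left-to-right Viterbi decoder for the Hidden semi-Markov Model of stage 4 (a full HSMM would additionally model phase durations). All sizes and probabilities are illustrative.

```python
import numpy as np

np.random.seed(0)

N_PHASES = 7          # number of surgical phases (as in the 7-phase dataset)
T = 140               # number of video frames (hypothetical)

# --- Stage 3 stand-in: per-frame one-vs-rest classifier scores.
# In the paper these come from AdaBoost classifiers over visual cues and
# instrument signals; here we simulate noisy scores for a known sequence.
true_phases = np.repeat(np.arange(N_PHASES), T // N_PHASES)
scores = np.random.randn(T, N_PHASES) * 0.8
scores[np.arange(T), true_phases] += 2.0   # true phase tends to score highest

# Convert scores to pseudo-probabilities (softmax) to use as emissions.
emissions = np.exp(scores - scores.max(axis=1, keepdims=True))
emissions /= emissions.sum(axis=1, keepdims=True)

# --- Stage 4 stand-in: temporal smoothing with a left-to-right transition
# model (stay in the current phase or advance to the next one).
STAY = 0.95
trans = np.zeros((N_PHASES, N_PHASES))
for p in range(N_PHASES):
    if p + 1 < N_PHASES:
        trans[p, p] = STAY
        trans[p, p + 1] = 1.0 - STAY
    else:
        trans[p, p] = 1.0            # last phase absorbs

def viterbi(emissions, trans, start_phase=0):
    """Most likely phase path under log-domain Viterbi decoding."""
    T, K = emissions.shape
    log_e = np.log(emissions + 1e-12)
    log_t = np.log(trans + 1e-12)
    delta = np.full((T, K), -np.inf)
    back = np.zeros((T, K), dtype=int)
    delta[0, start_phase] = log_e[0, start_phase]
    for t in range(1, T):
        cand = delta[t - 1][:, None] + log_t   # K x K: prev state -> next
        back[t] = cand.argmax(axis=0)
        delta[t] = cand.max(axis=0) + log_e[t]
    path = np.zeros(T, dtype=int)
    path[-1] = delta[-1].argmax()
    for t in range(T - 2, -1, -1):
        path[t] = back[t + 1, path[t + 1]]
    return path

raw = emissions.argmax(axis=1)        # frame-wise decision, no temporal model
smoothed = viterbi(emissions, trans)  # decision constrained by phase order

print("raw errors:     ", int((raw != true_phases).sum()))
print("smoothed errors:", int((smoothed != true_phases).sum()))
```

The point of the temporal model is visible even in this toy setting: the frame-wise decision scatters errors throughout each phase, while the decoded path can only stay or advance, so mistakes are confined to phase boundaries.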

Results

On the laparoscopic dataset of the MICCAI EndoVis challenge, we achieved a precision and a recall of 91% in the classification of seven phases.
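For reference, frame-level precision and recall per phase (macro-averaged) can be computed as below. This is the standard definition only; the workflow-specific validation metrics introduced by the paper are not detailed in the abstract and are not reproduced here.

```python
import numpy as np

def per_phase_precision_recall(truth, pred, n_phases):
    """Frame-level precision and recall for each phase, macro-averaged."""
    truth = np.asarray(truth)
    pred = np.asarray(pred)
    precisions, recalls = [], []
    for p in range(n_phases):
        tp = np.sum((pred == p) & (truth == p))
        fp = np.sum((pred == p) & (truth != p))
        fn = np.sum((pred != p) & (truth == p))
        precisions.append(tp / (tp + fp) if tp + fp else 0.0)
        recalls.append(tp / (tp + fn) if tp + fn else 0.0)
    return float(np.mean(precisions)), float(np.mean(recalls))

# Toy check: 3 phases, one frame of phase 1 mislabeled as phase 2.
truth = [0, 0, 1, 1, 2, 2]
pred  = [0, 0, 1, 2, 2, 2]
prec, rec = per_phase_precision_recall(truth, pred, 3)
print(prec, rec)   # 8/9 and 5/6
```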

Conclusion

Compared to an analysis based on a single data type, the combination of visual features and instrument signals yields better segmentation, a shorter detection delay, and recovery of the correct phase order.
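The combination referred to here is an early fusion of the two modalities into one sample vector per frame. The sketch below is a hypothetical illustration, not the paper's feature set: the visual cue is a plain intensity histogram, and the instrument counts and bin sizes are invented for the example.

```python
import numpy as np

np.random.seed(1)

N_INSTRUMENTS = 10   # hypothetical number of tracked instruments
N_BINS = 16          # hypothetical histogram size for the visual cue

def frame_descriptor(frame_pixels, instrument_signals):
    """Early fusion: concatenate a normalized intensity histogram (visual
    cue) with binary instrument-usage signals into one sample vector."""
    hist, _ = np.histogram(frame_pixels, bins=N_BINS, range=(0, 256))
    hist = hist / max(hist.sum(), 1)            # normalize the visual part
    return np.concatenate([hist, np.asarray(instrument_signals, float)])

pixels = np.random.randint(0, 256, size=(64, 64))   # stand-in video frame
instruments = np.zeros(N_INSTRUMENTS)
instruments[[2, 5]] = 1.0                           # two instruments in use

x = frame_descriptor(pixels, instruments)
print(x.shape)   # (26,) = 16 visual bins + 10 instrument flags
```

A vector of this form would then be what the per-phase classifiers consume; using either half alone corresponds to the single-data-type baselines the conclusion compares against.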


Notes

  1. http://grand-challenge.org/site/endovissub-workflow/data/.


Acknowledgments

This work was partially supported by French state funds managed by the ANR within the Investissements d’Avenir Program (Labex CAMI) under the reference ANR-11-LABX-0004.

Author information

Corresponding author

Correspondence to Olga Dergachyova.

Ethics declarations

Conflict of interest

Olga Dergachyova, David Bouget, Arnaud Huaulmé, Xavier Morandi and Pierre Jannin declare that they have no conflict of interest.

Ethical approval

For this type of study, formal consent is not required.

Informed consent

Not required. The data used were available in anonymized form through the MICCAI EndoVis challenge.


About this article


Cite this article

Dergachyova, O., Bouget, D., Huaulmé, A. et al. Automatic data-driven real-time segmentation and recognition of surgical workflow. Int J CARS 11, 1081–1089 (2016). https://doi.org/10.1007/s11548-016-1371-x
