System events: readily accessible features for surgical phase detection

  • Anand Malpani
  • Colin Lea
  • Chi Chiung Grace Chen
  • Gregory D. Hager
Original Article

DOI: 10.1007/s11548-016-1409-0

Cite this article as:
Malpani, A., Lea, C., Chen, C.C.G. et al. Int J CARS (2016) 11: 1201. doi:10.1007/s11548-016-1409-0

Abstract

Purpose

Surgical phase recognition using sensor data is challenging due to high variation in patient anatomy and surgeon-specific operating styles. Segmenting surgical procedures into constituent phases is of significant utility for resident training, education, self-review, and context-aware operating room technologies. Phase annotation is a highly labor-intensive task and would benefit greatly from automated solutions.

Methods

We propose a novel approach using system events—for example, activation of cautery tools—that are easily captured in most surgical procedures. Our method involves extracting event-based features over 90-s intervals and assigning a phase label to each interval. We explore three classification techniques: support vector machines, random forests, and temporal convolutional neural networks. Each of these models independently predicts a label for each time interval. We also examine segmental inference using an approach based on the semi-Markov conditional random field, which jointly performs phase segmentation and classification. Our method is evaluated on a data set of 24 robot-assisted hysterectomy procedures.
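The abstract does not specify how the event-based features are constructed; a minimal sketch of one plausible scheme, counting occurrences of each event type within consecutive 90-s windows, is shown below. The event names and stream are illustrative, not taken from the paper.

```python
from collections import Counter

WINDOW = 90  # seconds, matching the 90-s intervals in the paper

# Hypothetical event stream: (timestamp_in_seconds, event_name) pairs.
events = [
    (5.0, "cautery_on"), (12.3, "cautery_off"),
    (95.0, "camera_move"), (100.2, "cautery_on"),
]
event_types = ["cautery_on", "cautery_off", "camera_move"]

def window_features(events, duration, window=WINDOW):
    """Count each event type within consecutive fixed-length windows."""
    n_windows = int(duration // window) + 1
    counts = [Counter() for _ in range(n_windows)]
    for t, name in events:
        counts[int(t // window)][name] += 1
    # One feature vector (one count per event type) per window.
    return [[c[e] for e in event_types] for c in counts]

feats = window_features(events, duration=180.0)
# Each row is the feature vector for one 90-s interval.
```

A classifier (SVM, random forest, or temporal CNN, as in the paper) would then map each such feature vector to a phase label.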

Results

Using event-based features, our framework detects surgical phases with an accuracy of 74 % over a set of five phases—ligation, dissection, colpotomy, cuff closure, and background. Precision and recall values for the cuff closure (Precision: 83 %, Recall: 98 %) and dissection (Precision: 75 %, Recall: 88 %) classes were higher than those for the other classes. The normalized Levenshtein distance between the predicted and ground truth phase sequences was 25 %.
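The normalized Levenshtein distance reported above compares the predicted and ground truth phase sequences as strings of labels; a minimal sketch of how such a metric can be computed is given below. The phase sequences here are illustrative, not data from the study.

```python
def levenshtein(a, b):
    """Edit distance between two sequences, via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, x in enumerate(a, 1):
        curr = [i]
        for j, y in enumerate(b, 1):
            curr.append(min(prev[j] + 1,              # deletion
                            curr[j - 1] + 1,          # insertion
                            prev[j - 1] + (x != y)))  # substitution
        prev = curr
    return prev[-1]

def normalized_levenshtein(pred, truth):
    """Edit distance normalized by the longer sequence's length."""
    return levenshtein(pred, truth) / max(len(pred), len(truth), 1)

pred  = ["ligation", "dissection", "colpotomy", "cuff_closure"]
truth = ["ligation", "dissection", "dissection", "cuff_closure"]
# One substitution out of four labels -> 0.25 (i.e., 25 %).
```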

Conclusions

Our findings demonstrate that system event features are useful for automatically detecting surgical phases. Events contain phase information that cannot be obtained from motion data and that would require advanced computer vision algorithms to extract from video. Many of these events are not specific to robotic surgery and can easily be recorded in non-robotic surgical modalities. In future work, we plan to combine information from system events, tool motion, and videos to automate phase detection in surgical procedures.

Keywords

Surgical phase detection · System events · Sensor data · Surgical workflow analysis · Robot-assisted surgery · Surgical task flow · Surgical process modeling

Funding information

  • Link Foundation: Modeling, Training and Simulation Fellowship
  • Intuitive Surgical Technology Research Grant: Johns Hopkins University (US)

Copyright information

© CARS 2016

Authors and Affiliations

  • Anand Malpani (1)
  • Colin Lea (1)
  • Chi Chiung Grace Chen (2)
  • Gregory D. Hager (1)

  1. Department of Computer Science, The Johns Hopkins University, Baltimore, USA
  2. Department of Gynecology and Obstetrics, The Johns Hopkins University, Baltimore, USA