Pelphix: Surgical Phase Recognition from X-Ray Images in Percutaneous Pelvic Fixation

  • Conference paper
  • In: Medical Image Computing and Computer Assisted Intervention – MICCAI 2023 (MICCAI 2023)

Abstract

Surgical phase recognition (SPR) is a crucial element in the digital transformation of the modern operating theater. While SPR based on video sources is well-established, incorporation of interventional X-ray sequences has not yet been explored. This paper presents Pelphix, a first approach to SPR for X-ray-guided percutaneous pelvic fracture fixation, which models the procedure at four levels of granularity – corridor, activity, view, and frame value – simulating the pelvic fracture fixation workflow as a Markov process to provide fully annotated training data. Using added supervision from detection of bony corridors, tools, and anatomy, we learn image representations that are fed into a transformer model to regress surgical phases at the four granularity levels. Our approach demonstrates the feasibility of X-ray-based SPR, achieving an average accuracy of 99.2% on simulated sequences and 71.7% in cadaver across all granularity levels, with up to 84% accuracy for the target corridor in real data. This work constitutes the first step toward SPR for the X-ray domain, establishing an approach to categorizing phases in X-ray-guided surgery, simulating realistic image sequences to enable machine learning model development, and demonstrating that this approach is feasible for the analysis of real procedures. As X-ray-based SPR continues to mature, it will benefit procedures in orthopedic surgery, angiography, and interventional radiology by equipping intelligent surgical systems with situational awareness in the operating room. Code and data available at https://github.com/benjamindkilleen/pelphix.
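
As a rough illustration of the workflow simulation described in the abstract, the minimal sketch below samples an activity sequence by walking a small Markov chain. The state names and transition probabilities here are invented for illustration only; the actual corridor-, activity-, view-, and frame-level workflow model is defined in the Pelphix repository linked above.

    import random

    # Hypothetical Markov chain over wire/screw activities. These states and
    # probabilities are placeholders, not the transition model from the paper.
    TRANSITIONS = {
        "position_wire": [("insert_wire", 0.8), ("position_wire", 0.2)],
        "insert_wire":   [("insert_screw", 0.7), ("position_wire", 0.3)],
        "insert_screw":  [("done", 1.0)],
    }

    def sample_activity_sequence(start="position_wire", max_steps=50):
        """Yield one activity label per simulated frame until 'done'."""
        state = start
        for _ in range(max_steps):
            if state == "done":
                break
            yield state
            nxt, probs = zip(*TRANSITIONS[state])
            state = random.choices(nxt, weights=probs, k=1)[0]

    print(list(sample_activity_sequence()))

In the full simulation, each sampled step would additionally carry corridor, view, and frame-value labels, so that every rendered X-ray frame arrives fully annotated for training; the sketch omits that bookkeeping.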

Acknowledgements

This work was supported by NIH Grant No. R21EB028505 and Johns Hopkins University internal funds. Thank you to Demetries Boston, Henry Phalen, and Justin Ma for assistance with cadaver experiments.

Author information

Correspondence to Benjamin D. Killeen.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (PDF 1164 KB)

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Killeen, B.D. et al. (2023). Pelphix: Surgical Phase Recognition from X-Ray Images in Percutaneous Pelvic Fixation. In: Greenspan, H., et al. Medical Image Computing and Computer Assisted Intervention – MICCAI 2023. MICCAI 2023. Lecture Notes in Computer Science, vol 14228. Springer, Cham. https://doi.org/10.1007/978-3-031-43996-4_13

  • DOI: https://doi.org/10.1007/978-3-031-43996-4_13

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-43995-7

  • Online ISBN: 978-3-031-43996-4

  • eBook Packages: Computer Science (R0)
