
An autonomous X-ray image acquisition and interpretation system for assisting percutaneous pelvic fracture fixation

  • Original Article
  • Published in: International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Percutaneous fracture fixation involves multiple X-ray acquisitions to determine adequate tool trajectories in bony anatomy. In order to reduce time spent adjusting the X-ray imager’s gantry, avoid excess acquisitions, and anticipate inadequate trajectories before penetrating bone, we propose an autonomous system for intra-operative feedback that combines robotic X-ray imaging and machine learning for automated image acquisition and interpretation, respectively.

Methods

Our approach reconstructs an appropriate trajectory from a two-image sequence, where the optimal second viewpoint is determined by analyzing the first image. A deep neural network detects the tool and corridor (here, a K-wire and the superior pubic ramus, respectively) in these radiographs. The reconstructed corridor and K-wire pose are compared to estimate the likelihood of cortical breach, and both are visualized for the clinician in a mixed reality environment that is spatially registered to the patient and delivered by an optical see-through head-mounted display.
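To make the two-view comparison concrete, the following is a minimal, hypothetical sketch (ours, not the authors' released implementation; all names are illustrative). It triangulates a K-wire axis from its endpoints detected in two calibrated radiographs and flags a trajectory whose offset from the planned corridor axis exceeds the corridor radius.

```python
# Illustrative sketch only: names and interfaces are hypothetical and do not
# reflect the authors' released code. Assumes each radiograph comes with a
# calibrated 3x4 projection matrix and 2D K-wire endpoints from the detector.
import numpy as np


def backproject(P, uv):
    """Back-project pixel uv into a 3D ray (camera center, unit direction)."""
    _, _, vt = np.linalg.svd(P)
    c = vt[-1]
    c = c[:3] / c[3]                        # camera center = null space of P
    x = np.linalg.pinv(P) @ np.array([uv[0], uv[1], 1.0])
    x = x[:3] / x[3]                        # any 3D point that projects to uv
    d = x - c
    return c, d / np.linalg.norm(d)


def triangulate(P1, uv1, P2, uv2):
    """Midpoint triangulation of one correspondence from two calibrated views."""
    c1, d1 = backproject(P1, uv1)
    c2, d2 = backproject(P2, uv2)
    # Closest points on the two rays: solve t*d1 - s*d2 = c2 - c1 in least squares.
    t, s = np.linalg.lstsq(np.stack([d1, -d2], axis=1), c2 - c1, rcond=None)[0]
    return 0.5 * ((c1 + t * d1) + (c2 + s * d2))


def kwire_axis(P1, tip1, tail1, P2, tip2, tail2):
    """Reconstruct the 3D K-wire axis from endpoints detected in both views."""
    tip = triangulate(P1, tip1, P2, tip2)
    tail = triangulate(P1, tail1, P2, tail2)
    d = tip - tail
    return tail, d / np.linalg.norm(d)


def likely_breach(wire_pt, wire_dir, corr_pt, corr_dir, corr_radius_mm):
    """Flag trajectories whose offset from the corridor axis exceeds its radius.

    corr_dir is assumed to be unit length.
    """
    offset = np.linalg.norm(np.cross(wire_pt - corr_pt, corr_dir))
    angle = np.degrees(np.arccos(np.clip(abs(wire_dir @ corr_dir), 0.0, 1.0)))
    return offset > corr_radius_mm, offset, angle
```

In the full system, the corresponding quantities would come from the network's detections and the robotic C-arm's known geometry before being visualized in the head-mounted display.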

Results

We assess the upper bounds on system performance through in silico evaluation across 11 CTs with fractures present, in which the corridor and K-wire are adequately reconstructed. In post hoc analysis of radiographs across 3 cadaveric specimens, our system determines the appropriate trajectory to within 2.8 ± 1.3 mm and 2.7 ± 1.8°.
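For context, millimeter and degree trajectory errors such as these are typically the offset of the estimated entry point from the reference axis and the angle between the two direction vectors. The sketch below (an illustration under those assumptions, not the paper's evaluation code) computes both for a single trial.

```python
# Illustration only, not the authors' evaluation code; error definitions assumed.
import numpy as np


def trajectory_error(entry_est, dir_est, entry_gt, dir_gt):
    """Return (translational error in mm, angular error in degrees)."""
    d_est = dir_est / np.linalg.norm(dir_est)
    d_gt = dir_gt / np.linalg.norm(dir_gt)
    # Distance from the estimated entry point to the reference trajectory line.
    trans_mm = np.linalg.norm(np.cross(entry_est - entry_gt, d_gt))
    # Angle between the trajectories, ignoring direction sign.
    ang_deg = np.degrees(np.arccos(np.clip(abs(d_est @ d_gt), 0.0, 1.0)))
    return trans_mm, ang_deg


# Synthetic example: a 2 mm lateral offset and a slight angular deviation.
t, a = trajectory_error(np.array([2.0, 0.0, 0.0]), np.array([0.0, 0.1, 1.0]),
                        np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0]))
print(f"{t:.1f} mm, {a:.1f} deg")   # -> 2.0 mm, 5.7 deg
```

Reported figures of the kind above would then be the mean ± standard deviation of this pair over all trials.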

Conclusion

An expert user study with an anthropomorphic phantom demonstrates that our autonomous, integrated system requires fewer images and less movement to guide and confirm adequate placement compared with current clinical practice. Code and data are available.



Acknowledgements

This work was supported by the NIH under Grant No. R21EB028505 and Johns Hopkins University Internal Funds.

Author information

Correspondence to Benjamin D. Killeen.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants performed by any of the authors.

Informed consent

This article does not contain patient data collected by any of the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Code and data availability: https://github.com/benjamindkilleen/IPCAI-pelvic-corridors

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (pdf 2148 KB)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Killeen, B.D., Gao, C., Oguine, K.J. et al. An autonomous X-ray image acquisition and interpretation system for assisting percutaneous pelvic fracture fixation. Int J CARS 18, 1201–1208 (2023). https://doi.org/10.1007/s11548-023-02941-y


Keywords

Navigation