
Agreement Between Augmented Reality and Computed Tomography Coordinate Systems: A New Approach to an Image-Guided Procedure

  • Original Article
  • Journal of Medical and Biological Engineering

Abstract

Purpose

Aligning augmented reality (AR) objects with the patient's body in the real world is challenging in interventional radiology. We aimed to propose a computed tomography (CT)-guided procedure with markerless AR that uses a coordinate system complying with the Digital Imaging and Communications in Medicine (DICOM) standard. We also sought to verify the accuracy and precision of replicating the CT coordinate system in AR space using in-house smartphone software.

Methods

Spatial anchors were placed on the laser apertures of the CT gantry to automatically calculate the CT isocenter and the origin of the DICOM coordinate system. A real phantom holder and a virtual protractor were placed on the foot side, 600 mm from the isocenter, and the horizontal and vertical positioning errors were measured to evaluate accuracy and precision.
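The geometric idea behind this step can be sketched in a few lines. The following is a minimal illustration, not the authors' actual implementation: it assumes the two laser-aperture anchors sit symmetrically about the gantry bore, takes their midpoint as the isocenter, and translates a virtual object 600 mm toward the foot side along the table (longitudinal) axis. All positions and the axis convention are hypothetical.

```python
# Hedged sketch: derive the CT isocenter from two spatial anchors placed on
# the left and right laser apertures, then place a virtual protractor 600 mm
# toward the feet. Positions are in millimetres; +z points toward the head.

def isocenter_from_anchors(left_anchor, right_anchor):
    """Midpoint of the two laser-aperture anchors approximates the isocenter."""
    return tuple((l + r) / 2.0 for l, r in zip(left_anchor, right_anchor))

def offset_along_table(origin, z_offset_mm):
    """Translate a point along the table's longitudinal (z) axis."""
    x, y, z = origin
    return (x, y, z + z_offset_mm)

# Hypothetical anchor positions (AR world units rescaled to mm)
left = (-400.0, 0.0, 0.0)
right = (400.0, 0.0, 0.0)

iso = isocenter_from_anchors(left, right)
protractor = offset_along_table(iso, -600.0)  # 600 mm toward the foot side
```

In a real AR session the anchor positions would come from the tracking framework rather than being hard-coded, and the table axis would be estimated from the gantry geometry.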

Results

The horizontal and vertical errors were −3.4 ± 5.5 mm and −5.1 ± 4.7 mm, respectively.
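These figures are mean ± sample standard deviation over repeated trials. As a minimal sketch of how such a summary is computed, the error samples below are hypothetical (the paper's raw measurements are not reproduced here); only the form of the calculation is illustrated.

```python
import statistics

# Hypothetical horizontal positioning errors in mm from repeated trials
horizontal_errors = [-3.0, -9.5, 2.1, -8.0, 1.4]

mean = statistics.mean(horizontal_errors)
sd = statistics.stdev(horizontal_errors)  # sample (n-1) standard deviation

print(f"{mean:.1f} ± {sd:.1f} mm")
```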

Conclusion

The agreement between the CT coordinate system and the AR space is satisfactory. With our technique, the operator can confirm the location of a lesion observed in the CT image during the procedure and place a virtual protractor at that location to guide the puncture.

Fig. 1 and Fig. 2 (thumbnails only; captions not available in this preview)


Data Availability

https://doi.org/10.1016/j.compeleceng.2017.04.032

References

  1. Rosenthal, M., State, A., Lee, J., Hirota, G., Ackerman, J., Keller, K., Pisano, E. D., Jiroutek, M., Muller, K., & Fuchs, H. (2002). Augmented reality guidance for needle biopsies: An initial randomized, controlled trial in phantoms. Medical Image Analysis, 6(3), 313–320. https://doi.org/10.1016/S1361-8415(02)00088-9


  2. Wacker, F. K., Vogt, S., Khamene, A., Jesberger, J. A., Nour, S. G., Elgort, D. R., Sauer, F., Duerk, J. L., & Lewin, J. S. (2006). An augmented reality system for MR image–guided needle biopsy: Initial results in a swine model. Radiology, 238(2), 497–504. https://doi.org/10.1148/radiol.2382041441


  3. Hecht, R., Li, M., de Ruiter, Q. M. B., Pritchard, W. F., Li, X., Krishnasamy, V., Saad, W., Karanian, J. W., & Wood, B. J. (2020). Smartphone augmented reality CT-based platform for needle insertion guidance: A phantom study. Cardiovascular and Interventional Radiology, 43(5), 756–764. https://doi.org/10.1007/s00270-019-02403-6


  4. Suzuki, K., Morita, S., Endo, K., Yamamoto, T., & Sakai, S. (2022). Noncontact measurement of puncture needle angle using augmented reality technology in computed tomography-guided biopsy: Stereotactic coordinate design and accuracy evaluation. International Journal of Computer Assisted Radiology and Surgery, 17(4), 745–750. https://doi.org/10.1007/s11548-022-02572-9


  5. Morita, S., Suzuki, K., Yamamoto, T., Kunihara, M., Hashimoto, H., Ito, K., Fujii, S., Ohya, J., Masamune, K., & Sakai, S. (2022). Mixed reality needle guidance application on smartglasses without pre-procedural CT image import with manually matching coordinate systems. Cardiovascular and Interventional Radiology, 45(3), 349–356. https://doi.org/10.1007/s00270-021-03029-3


  6. Solbiati, M., Passera, K. M., Rotilio, A., Oliva, F., Marre, I., Goldberg, S. N., Ierace, T., & Solbiati, L. M. (2018). Augmented reality for interventional oncology: Proof-of-concept study of a novel high-end guidance system platform. European Radiology Experimental, 2(1), 1. https://doi.org/10.1186/s41747-018-0054-5


  7. DICOM PS3.3, Section C.7.6: Common image IE modules (2016). https://dicom.nema.org/medical/Dicom/2016b/output/chtml/part03/sect_C.7.6.2.html

  8. Morita, S., Suzuki, K., Yamamoto, T., Endo, S., Yamazaki, H., & Sakai, S. (2023). Out-of-plane needle placements using 3D augmented reality protractor on smartphone: An experimental phantom study. Cardiovascular and Interventional Radiology. https://doi.org/10.1007/s40846-023-00820-0


  9. Saleem, S., Bais, A., Sablatnig, R., Ahmad, A., & Naseer, N. S. (2017). Feature points for multisensor images. Computers & Electrical Engineering, 62, 511–523. https://doi.org/10.1016/j.compeleceng.2017.04.032


  10. Nistér, D., Naroditsky, O., & Bergen, J. (2004). Visual odometry. In Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR). https://doi.org/10.1109/CVPR.2004.1315094

  11. Xiao, G., Bonmati, E., Thompson, S., Evans, J., Hipwell, J., Nikitichev, D., Gurusamy, K., Ourselin, S., Hawkes, D. J., Davidson, B., & Clarkson, M. J. (2018). Electromagnetic tracking in image-guided laparoscopic surgery: Comparison with optical tracking and feasibility study of a combined laparoscope and laparoscopic ultrasound system. Medical Physics, 45(11), 5094–5104. https://doi.org/10.1002/mp.13210


  12. Park, B. J., Hunt, S. J., Nadolski, G. J., & Gade, T. P. (2020). Augmented reality improves procedural efficiency and reduces radiation dose for CT-guided lesion targeting: A phantom study using HoloLens 2. Scientific Reports, 10(1), 18620. https://doi.org/10.1038/s41598-020-75676-4



Acknowledgements

None.

Funding

This work was supported by the Japan Society for the Promotion of Science (JSPS) KAKENHI (Grants-in-Aid for Scientific Research) Grant #18K07648.

Author information

Authors and Affiliations

Authors

Contributions

Both authors contributed to the study conception and design. Software development, data collection, and analysis were performed by KS. The first draft of the manuscript was written by KS, and both authors commented on the previous versions of the manuscript. Both authors have read and approved the final manuscript.

Corresponding author

Correspondence to Kazufumi Suzuki.

Ethics declarations

Competing Interests

The authors have no relevant financial or nonfinancial interests to disclose.

Ethical Approval

This study does not include data on human or animal subjects; therefore, ethics approval is not applicable.

Consent to Participate

This study does not include data on human subjects.

Consent to Publish

This study does not include data on human subjects.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary material 1 A sample of one experimental trial. When the phone is moved slowly to the left or right, numerous feature points are identified from the parallax. When targeting the feature points on the laser aperture, a spatial anchor is instantiated at the nearest feature point. With just two button presses, the coordinate system is placed in the real world and the virtual protractor is automatically positioned at the desired coordinates (256 px, 256 px, −600 mm). We observed the vertical tick marks of the virtual protractor from above to record the horizontal error; we then observed the horizontal tick marks from the left side to record the vertical error. In this trial, the right spatial anchor appeared at the lower edge of the right laser aperture when observed from the left side; this was often the case in the other trials as well. We designed the user interface of our software to provide an intuitive user experience. (MP4 119906.4 kb)
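The target position above mixes in-plane pixel indices with a table offset in millimetres, which requires a pixel-to-patient coordinate conversion. The sketch below is an assumption, not the authors' code: it applies the standard DICOM Image Plane Module mapping (PS3.3 C.7.6.2), P = S + c·Δc·X + r·Δr·Y, where S is Image Position (Patient), X and Y are the column- and row-direction cosines from Image Orientation (Patient), and Δr, Δc come from Pixel Spacing. The slice geometry used here is hypothetical.

```python
import numpy as np

def pixel_to_patient(ipp, iop, pixel_spacing, row, col):
    """Map a (row, col) pixel index to patient coordinates in mm.

    ipp: Image Position (Patient) -- 3D position of the first pixel's center.
    iop: Image Orientation (Patient) -- 6 direction cosines (row dir, col dir).
    pixel_spacing: (row spacing, column spacing) in mm.
    """
    S = np.asarray(ipp, dtype=float)
    X = np.asarray(iop[:3], dtype=float)   # direction of increasing column
    Y = np.asarray(iop[3:], dtype=float)   # direction of increasing row
    dr, dc = pixel_spacing
    return S + col * dc * X + row * dr * Y

# Hypothetical axial slice: 512x512 pixels, 1.0 mm spacing,
# top-left pixel centered at (-256, -256, 0) in patient coordinates.
p = pixel_to_patient(
    ipp=(-256.0, -256.0, 0.0),
    iop=(1, 0, 0, 0, 1, 0),
    pixel_spacing=(1.0, 1.0),
    row=256, col=256,
)
```

With this geometry, pixel (256, 256) maps to the slice center; in the experiment, the remaining −600 mm is the longitudinal offset from the isocenter along the table axis.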

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

Suzuki, K., Sakai, S. Agreement Between Augmented Reality and Computed Tomography Coordinate Systems: A New Approach to an Image-Guided Procedure. J. Med. Biol. Eng. 43, 561–565 (2023). https://doi.org/10.1007/s40846-023-00820-0


Keywords

Navigation