
A novel enhanced energy function using augmented reality for a bowel: modified region and weighted factor

Published in Multimedia Tools and Applications

Abstract

The popularity of augmented reality in medical applications is rising exponentially, as it holds great potential for reducing surgical risk by raising visual awareness during an operation. Incorrect representation of, or inadequate detail on, the target region, or a delay in processing time, may have serious consequences. Likewise, the absence of a proper mechanism to handle occlusions such as nerves, vessels, or medical equipment may degrade performance. This research therefore aims to improve the visualization accuracy of the bowel region and to reduce processing time. A novel enhanced energy function using augmented reality for the bowel is proposed. The proposed system targets a modified region and weighted factor that combines the strength of combined region and dense cues with a long-term, accurate augmented reality display mechanism. The system provides detailed visual output by precisely placing a model, developed from CT images of the target object, over the live video. By applying a least-squares approach, the system also addresses the larger deformations and occlusions that appear during a surgical procedure, providing a highly accurate display. The feature tracking and tracking recovery components keep the entire visualization procedure on track by automatically registering and re-registering the surface whenever required: the system runs image registration without human involvement and decides on its own when to trigger re-registration. The results show that the proposed system substantially reduces the overlay error. We validated the system on several sets of endoscopy samples; the dataset included bowel-region samples from people of three different age groups. The overlay error was 0.24777 px, and the performance was 44 fps.
The proposed system concentrates on overlay accuracy and processing time. This study addresses the shortcomings of previous systems regarding manual registration and rigid assumptions.
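The abstract's pipeline of least-squares model-to-feature alignment, pixel overlay error, and automatic re-registration can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the Kabsch-style rigid solver, the pinhole intrinsics `K`, and the 1 px trigger threshold are all assumptions introduced here for illustration.

```python
# Hedged sketch: least-squares rigid alignment of a preoperative model to
# tracked 3-D feature points, a mean overlay (reprojection) error in pixels,
# and an illustrative threshold test for triggering re-registration.
import numpy as np

def rigid_align(model_pts, tracked_pts):
    """Least-squares rigid transform (R, t) mapping model_pts onto
    tracked_pts (Kabsch/Procrustes). Both arrays are (N, 3)."""
    mu_m = model_pts.mean(axis=0)
    mu_t = tracked_pts.mean(axis=0)
    H = (model_pts - mu_m).T @ (tracked_pts - mu_t)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))            # guard vs. reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_t - R @ mu_m
    return R, t

def overlay_error_px(model_pts, tracked_px, R, t, K):
    """Mean reprojection error in pixels: project the aligned model points
    with pinhole intrinsics K and compare against the tracked 2-D features."""
    cam = (R @ model_pts.T).T + t                     # model in camera frame
    proj = (K @ cam.T).T
    proj = proj[:, :2] / proj[:, 2:3]                 # perspective divide
    return float(np.linalg.norm(proj - tracked_px, axis=1).mean())

def needs_reregistration(err_px, threshold_px=1.0):
    """Illustrative trigger only: re-register once the overlay error drifts
    past a pixel threshold (the paper's actual criterion is not given here)."""
    return err_px > threshold_px
```

In a tracking loop, `rigid_align` would be re-run on the currently tracked features each frame, and `needs_reregistration` stands in for the paper's automatic decision of when to re-register the surface.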





Author information

Correspondence to Abeer Alsadoon.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Shrestha, G., Alsadoon, A., Prasad, P.W.C. et al. A novel enhanced energy function using augmented reality for a bowel: modified region and weighted factor. Multimed Tools Appl 80, 17893–17922 (2021). https://doi.org/10.1007/s11042-021-10606-8


Keywords: Navigation