Mixed reality using illumination-aware gradient mixing in surgical telepresence: enhanced multi-layer visualization

Published in: Multimedia Tools and Applications

Abstract

Surgical telepresence with augmented perception has already been applied in practice, whereas mixed-reality telepresence is still under research and remains largely theoretical. The aim of this work is to improve visualization in the final merged video by producing globally consistent videos when the illumination intensity of the source and target videos differs. The proposed system uses enhanced multi-layer visualization with illumination-aware gradient mixing based on the Illumination-Aware Video Composition algorithm. Particle Swarm Optimization (PSO) is used to find the best sample pair from the foreground and background regions, and image pixel correlation is used to estimate the alpha matte; PSO thereby recovers the original colour and depth of each unknown pixel in the unknown region. Our results show improved accuracy, achieved by reducing the mean squared error of best-sample-pair selection for the unknown region across 10 samples each of bowel, jaw and breast images; this reduction amounts to 16.48% relative to the state-of-the-art system. As a result, visibility accuracy improves from 89.4% to 97.7%, which keeps the surgeon's hand clearly visible even under differing illumination. The illumination-aware composition and alpha pixel correlation improve visualization accuracy, produce globally consistent composition results, and maintain temporal coherency when compositing two videos with strong and inverse illumination effects. In addition, this paper provides a solution for selecting the best sample pair for the unknown region so that its original colour and depth can be recovered.
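
Although only the abstract is shown here, the sample-pair selection step it describes lends itself to a short illustration. The Python sketch below shows, under assumed settings, how a basic particle swarm could search candidate foreground/background colour pairs for an unknown pixel and derive its alpha value from the standard compositing model; the swarm size, inertia, acceleration constants and residual-based fitness are illustrative assumptions, not the paper's reported method or parameters.

```python
import numpy as np

# Minimal sketch (assumptions, not the authors' implementation) of PSO-based
# foreground/background sample-pair selection for one unknown pixel, followed
# by alpha estimation from the compositing model I = a*F + (1 - a)*B.
# Swarm size, inertia w and acceleration constants c1, c2 are illustrative.

def estimate_alpha(I, F, B, eps=1e-6):
    """Closed-form alpha for a candidate (F, B) pair."""
    d = F - B
    return float(np.clip(np.dot(I - B, d) / (np.dot(d, d) + eps), 0.0, 1.0))

def pair_cost(I, F, B):
    """Reconstruction residual of I under the pair (F, B); lower is better."""
    a = estimate_alpha(I, F, B)
    return np.linalg.norm(I - (a * F + (1.0 - a) * B))

def pso_select_pair(I, fg_samples, bg_samples,
                    n_particles=20, n_iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Search the grid of (foreground, background) sample indices with PSO."""
    rng = np.random.default_rng(seed)
    upper = np.array([len(fg_samples) - 1, len(bg_samples) - 1], dtype=float)

    def cost(p):
        fi, bi = int(round(p[0])), int(round(p[1]))
        return pair_cost(I, fg_samples[fi], bg_samples[bi])

    pos = rng.uniform(0.0, 1.0, (n_particles, 2)) * upper  # particle = index pair
    vel = np.zeros_like(pos)
    pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
    gbest = pbest[pbest_cost.argmin()].copy()

    for _ in range(n_iters):
        r1, r2 = rng.uniform(size=(2, n_particles, 2))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0.0, upper)
        costs = np.array([cost(p) for p in pos])
        better = costs < pbest_cost
        pbest[better], pbest_cost[better] = pos[better], costs[better]
        gbest = pbest[pbest_cost.argmin()].copy()

    fi, bi = int(round(gbest[0])), int(round(gbest[1]))
    F, B = fg_samples[fi], bg_samples[bi]
    return F, B, estimate_alpha(I, F, B)

# Usage: I is an unknown-region RGB pixel; fg_samples and bg_samples are arrays
# of candidate RGB colours gathered from the known foreground and background.
# F_best, B_best, alpha = pso_select_pair(I, fg_samples, bg_samples)
```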


Abbreviations

AR:

Augmented reality

MR:

Mixed reality

MSE:

Mean square error

EMMVC:

Enhanced multi-layer mean value cloning

PSO:

Particle swarm optimization

CT:

Computed tomography

AINET:

Artificial immune neural network


Author information

Correspondence to Abeer Alsadoon.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Puri, N., Alsadoon, A., Prasad, P.W.C. et al. Mixed reality using illumination-aware gradient mixing in surgical telepresence: enhanced multi-layer visualization. Multimed Tools Appl 81, 1153–1178 (2022). https://doi.org/10.1007/s11042-021-11343-8

