Comparing touch-based and head-tracking navigation techniques in a virtual reality biopsy simulator

Abstract

Recently, virtual reality (VR) technologies have gained momentum in surgical simulation-based training by allowing clinicians to practice their skills before performing real procedures. The design of such simulators usually focuses on the primary operative tasks to be taught, while little attention is paid to the secondary tasks the user must perform, such as changing the point of view when manipulating the surgical instruments. In particular, it is not clear how to design appropriate interaction techniques for these tasks, nor how the fidelity of these interactions impacts user performance on such systems. In this paper, we compare two viewpoint-changing techniques with two different levels of interaction fidelity during needle insertion in a semi-immersive VR (SIVR) biopsy trainer. These techniques were designed based on observations of clinicians performing actual biopsy procedures. The first technique tracks the user's head position (high interaction fidelity), while the second is touch-based, with the user manipulating the point of view on a touch screen using the fingers of the non-dominant hand (moderate interaction fidelity). A user study was carried out to investigate the impact of the interaction fidelity of the viewpoint-changing task (secondary task) on user performance during the needle insertion task (main task). Twenty-one novice participants performed several trials of a needle insertion task using both navigation techniques (within-subject design). Objective and subjective measures were recorded to compare task performance (time to complete the task, precision of the tumor sampling, and errors) and user experience for both techniques.
The results show that the touch-based viewpoint-changing technique improves users' task completion performance during needle insertion while maintaining a level of needle manipulation accuracy similar to that of the head-tracking technique. These results suggest that high interaction fidelity is not always necessary when designing surgical trainers. They also highlight the importance of designing appropriate interactions for secondary tasks, as these can influence the user's primary-task performance in VR simulators.
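The two techniques differ only in how user input drives the virtual camera. The contrast can be illustrated with a minimal sketch (this is our own illustration, not the authors' implementation; the function names, the fixed scene center, and the drag-to-angle gain are assumptions): head tracking maps the tracked head position onto the camera pose one-to-one, whereas the touch technique converts finger-drag deltas into orbital camera motion around the anatomy.

```python
import math

def head_tracked_view(head_pos, scene_center):
    """High-fidelity mapping: the virtual camera follows the tracked
    head position directly (isomorphic, 1:1 transfer function)."""
    # The eye point is the head position; the camera looks at the scene.
    return {"eye": head_pos, "target": scene_center}

def touch_orbit_view(azimuth, elevation, radius, drag_dx, drag_dy, gain=0.01):
    """Moderate-fidelity mapping: finger drags on the touch screen are
    scaled by a gain and accumulated into orbital camera angles."""
    azimuth += gain * drag_dx
    # Clamp elevation so the camera cannot flip over the poles.
    elevation = max(-math.pi / 2, min(math.pi / 2, elevation + gain * drag_dy))
    eye = (radius * math.cos(elevation) * math.sin(azimuth),
           radius * math.sin(elevation),
           radius * math.cos(elevation) * math.cos(azimuth))
    return azimuth, elevation, eye
```

The key design difference is that the touch mapping is non-isomorphic: its gain and clamping can be tuned independently of the user's physical movement, which is not possible with direct head tracking.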




Acknowledgements

The authors would like to thank all the volunteers who participated in the experimental study. The authors would also like to thank the Paris Ile-de-France Region for its financial support.

Funding

This work was supported by the Paris Ile-de-France Region (Grant Number 17002647). Aylen Ricca received a Ph.D. Grant from the University of Evry. We also acknowledge support from Genopole.

Author information


Corresponding author

Correspondence to Aylen Ricca.

Ethics declarations

Conflict of interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.



About this article


Cite this article

Ricca, A., Chellali, A. & Otmane, S. Comparing touch-based and head-tracking navigation techniques in a virtual reality biopsy simulator. Virtual Reality 25, 191–208 (2021). https://doi.org/10.1007/s10055-020-00445-7


Keywords

  • Biopsy trainer
  • Interaction design
  • Interaction fidelity
  • Surgical training