Virtual Reality Simulator for Medical Auscultation Training

  • Luis Andrés Puértolas Bálint
  • Luis Humberto Perez Macías
  • Kaspar Althoefer
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11649)


According to the Oxford English Dictionary, auscultation is “the action of listening to sounds from the heart, lungs, or other organs, typically with a stethoscope, as a part of medical diagnosis.” In this work, we describe a medical simulator that includes audio, visual, pseudo-haptic, and spatial elements for training medical students in auscultation. In our training simulator, the user is fully immersed in a virtual reality (VR) environment. A typical hospital bedside scenario was recreated, and the users can see their own body and the patient, increasing immersion. External tracking devices are used to acquire the user’s movements and map them into the VR environment. The main idea behind this work is for the user to associate the heart and lung sounds, as heard through the stethoscope, with the corresponding health-related problems. Several sound parameters, including volume, give information about the type and severity of the disease. Our simulator can reproduce sounds belonging to the heart and lungs. Through the proposed VR-based training, the medical student will ideally learn to relate sounds to illnesses in a realistic setting, accelerating the learning process.
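The mapping from a tracked stethoscope position to an auscultation sound could be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the point names, sound file names, positions, and the linear distance-based volume falloff are all assumptions introduced for illustration.

```python
import math

# Hypothetical auscultation points: a position (x, y, z) on the virtual
# patient's chest and the sound clip associated with each condition.
AUSCULTATION_POINTS = {
    "aortic":    {"pos": (0.03, 1.20, 0.10),  "sound": "aortic_stenosis.wav"},
    "mitral":    {"pos": (-0.05, 1.10, 0.10), "sound": "mitral_regurg.wav"},
    "lung_base": {"pos": (-0.10, 0.95, 0.08), "sound": "crackles.wav"},
}

def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def stethoscope_output(tip_pos, max_range=0.05):
    """Return (sound, volume) for the nearest auscultation point within
    max_range metres of the tracked stethoscope tip, or None if the tip
    is not close enough to any point."""
    best = None
    for point in AUSCULTATION_POINTS.values():
        d = distance(tip_pos, point["pos"])
        if d <= max_range and (best is None or d < best[0]):
            best = (d, point["sound"])
    if best is None:
        return None
    d, sound = best
    volume = 1.0 - d / max_range  # louder as the tip nears the point
    return sound, volume
```

Called every frame with the tracked tip position, such a function would select the clip to play and attenuate it with distance, so that volume itself carries diagnostic information as described above.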


Keywords: Auscultation · Pseudo-haptic · Virtual reality · Simulator · Medical · Heart · Lungs · Pulmonary



This work was supported by BALTECH Pty Ltd (Ballarat Technologies), CONACYT and Queen Mary University of London.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Queen Mary University of London, London, UK
  2. Benemérita Universidad Autónoma de Puebla, Puebla, Mexico
