
Usability of Computerized Lung Auscultation–Sound Software (CLASS) for learning pulmonary auscultation

Original Article
Published in Medical & Biological Engineering & Computing

Abstract

The mastering of pulmonary auscultation requires complex acoustic skills. Computer-assisted learning tools (CALTs) have the potential to enhance the learning of these skills; however, few have been developed for this purpose, and those available do not integrate all the required features. Thus, this study aimed to assess the usability of a new CALT for learning pulmonary auscultation. The usability of the Computerized Lung Auscultation–Sound Software (CLASS) was assessed by eight physiotherapy students using computer screen recordings, think-aloud reports, and facial expressions. Time spent on each task, frequency of messages and facial expressions, number of clicks, and problems reported were counted. The timelines of the three methods were synchronized and analyzed. The exercises and the annotation of respiratory sounds were the tasks requiring the most clicks (median 132, interquartile range [23–157]; 93 [53–155]; 91 [65–104], respectively) and the ones where most errors (19%, 37%, and 15%, respectively) and problems (n = 7, 6, and 3, respectively) were reported. Each participant reported a median of 6 problems, with a total of 14 different problems identified, mainly related to CLASS functionalities (50%). Smiling was the only facial expression present in all tasks (n = 54). CLASS is the only CALT available that meets all the required features for learning pulmonary auscultation. The combination of the three usability methods identified advantages and disadvantages of CLASS and offered guidance for future developments, namely in the annotations and exercises. This will allow the improvement of CLASS and enhance students' activities for learning pulmonary auscultation skills.
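
To illustrate the kind of quantitative summary reported above (medians and interquartile ranges of clicks per task), the following minimal Python sketch shows how such per-task statistics can be computed. The task labels and click counts are hypothetical placeholders for illustration only, not the study data or the authors' analysis code.

# Minimal sketch (assumption: not the authors' analysis code): per-task click
# counts summarized by median and interquartile range, as in the abstract.
# The counts below are hypothetical placeholders, not the study data.
import numpy as np

clicks_per_task = {
    "exercises": [23, 90, 120, 130, 132, 140, 150, 157],                      # hypothetical
    "annotation of respiratory sounds": [53, 65, 80, 91, 95, 100, 104, 155],  # hypothetical
}

for task, counts in clicks_per_task.items():
    median = np.median(counts)
    q1, q3 = np.percentile(counts, [25, 75])
    print(f"{task}: median {median:.0f} clicks, IQR [{q1:.0f}-{q3:.0f}]")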



Acknowledgements

The authors would like to thank all participants of the evaluation session for their contributions, which will help improve the usability of CLASS, and to give special thanks to Professor Hans Pasterkamp for his helpful comments during the preparation of this article.

Author information

Corresponding author

Correspondence to Alda Marques.

Ethics declarations

Ethical approval was previously obtained from the School Board Ethics Committee, and written informed consent was collected from all participants.

Funding

This work was funded by Fundação Calouste Gulbenkian, Portugal (Project: PGIS ID53 P-136395), and by Fundação para a Ciência e Tecnologia (FCT) (Project: 18/ID/2014). The work was also partially funded by Programa Operacional de Competitividade e Internacionalização (COMPETE), through Fundo Europeu de Desenvolvimento Regional (FEDER) (POCI-01-0145-FEDER-016701), by FCT (PTDC/DTPPIC/2284/2014), and under the project UID/BIM/04501/2013.

Conflict of interest

The authors declare that there is no conflict of interest.


About this article

Cite this article

Machado, A., Oliveira, A., Jácome, C. et al. Usability of Computerized Lung Auscultation–Sound Software (CLASS) for learning pulmonary auscultation. Med Biol Eng Comput 56, 623–633 (2018). https://doi.org/10.1007/s11517-017-1697-8
