EQClinic: A Platform for Improving Medical Students’ Clinical Communication Skills
Communication is central to clinical interaction, so medical students require communication skills training. Most training programs focus on developing students' verbal skills, while nonverbal communication receives insufficient attention. This paper describes EQClinic, a tele-health training platform that automatically detects nonverbal behaviour in tele-consultations and provides medical students with both human and computer-generated feedback to improve their communication skills. We describe EQClinic's components and report preliminary results from an 8-week user study with 135 medical students. Each student completed two face-to-face consultations and, between them, a tele-consultation using EQClinic. Students found the system usable, and their scores in the second face-to-face consultation improved after the tele-consultation (from 12.58 to 13.53, p = 0.005). These results suggest that EQClinic positively influenced medical students' learning and may be a valuable tool in medical education.
This project was funded by an internal grant from the Brain and Mind Centre at the University of Sydney, Australia, and by the Australian Government. RC is funded by the Australian Research Council.