Universal Access in the Information Society, Volume 16, Issue 2, pp 365–379

Physiological mouse: toward an emotion-aware mouse

  • Yujun Fu
  • Hong Va Leong
  • Grace Ngai
  • Michael Xuelin Huang
  • Stephen C. F. Chan
Long paper


Human-centered computing is rapidly becoming a major research direction as new developments in sensor technology make it increasingly feasible to obtain signals from human beings. At the same time, the pervasiveness of computing devices is encouraging more research in human–computer interaction, especially toward personalized and adaptive user interfaces. Among the various research issues, affective computing, or the ability of computers to understand and react according to what a user “feels,” has been gaining in importance. To recognize human affect, computers rely on the analysis of signal inputs captured by a multitude of means. This paper proposes the use of human physiological signals, measured in a non-intrusive manner, as a new modality for determining human affect. The principle of non-intrusiveness is very important, since it imposes no extra burden on the user, which improves accessibility and encourages adoption. This goal is realized via the physiological mouse, a first step toward the support of affective computing. A conventional mouse is augmented with a small optical component that captures the user’s photoplethysmographic (PPG) signal, from which human physiological signals can be computed and derived. A prototype of the physiological mouse was built and raw PPG readings were measured. The accuracy of the approach was evaluated through empirical studies that derived human physiological signals from the mouse PPG data. Finally, pilot experiments were conducted to correlate human physiological signals with two modes of human–computer interaction, namely gaming and video watching. The trend in physiological signals could serve as feedback to the computer system, which in turn adapts to the needs or mood of the user, for instance by changing the volume or the light intensity during a video or game according to the user’s current emotion.
The authors argue that this research will provide a new dimension for multimodal affective computing research, and the pilot study has already shed some light on this research goal.
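The abstract states that physiological signals can be computed and derived from the raw PPG trace, without specifying the algorithm. A minimal sketch of one common approach, estimating heart rate by detecting pulse peaks and averaging inter-beat intervals, is shown below; the function name, the mean-based threshold, and the 0.3 s refractory period are illustrative assumptions, not the authors' actual method:

```python
import math

def heart_rate_from_ppg(signal, fs):
    """Estimate heart rate (bpm) from a raw PPG trace sampled at fs Hz.

    Treats local maxima above the signal mean as candidate pulse peaks,
    enforces a 0.3 s refractory period between beats, then converts the
    mean inter-beat interval to beats per minute.
    """
    threshold = sum(signal) / len(signal)
    peaks = []
    for i in range(1, len(signal) - 1):
        if (signal[i] > threshold
                and signal[i] > signal[i - 1]
                and signal[i] >= signal[i + 1]):
            # skip spurious maxima closer than 0.3 s to the last beat
            if not peaks or (i - peaks[-1]) > 0.3 * fs:
                peaks.append(i)
    if len(peaks) < 2:
        return None  # not enough beats to estimate a rate
    intervals = [(b - a) / fs for a, b in zip(peaks, peaks[1:])]
    return 60.0 * len(intervals) / sum(intervals)

# Synthetic 1.2 Hz pulse wave (72 bpm) sampled at 100 Hz for 10 s
fs = 100
ppg = [math.sin(2 * math.pi * 1.2 * n / fs) for n in range(10 * fs)]
print(round(heart_rate_from_ppg(ppg, fs)))  # ≈ 72
```

Real PPG data from a mouse sensor would be far noisier than this synthetic waveform; in practice a band-pass filter or motion-artifact removal stage (as in the ICA approach the paper's references discuss) would precede the peak detection.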


Keywords: Affective computing · Physiological signals · Non-intrusive measurement · Gadget prototype · Human emotion



We would like to thank the experiment subjects for their time and patience. We would also like to thank the reviewers for their valuable comments, which helped improve this paper. This research is supported in part by the Research Grant Council and the Hong Kong Polytechnic University under Grant Nos. PolyU 5235/11E and PolyU 5222/13E.



Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  • Yujun Fu
  • Hong Va Leong
  • Grace Ngai
  • Michael Xuelin Huang
  • Stephen C. F. Chan

Department of Computing, The Hong Kong Polytechnic University, Hong Kong, China
