Improved Haptic Music Player with Auditory Saliency Estimation

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8618)


This paper presents improvements to our previous haptic music player, which was designed to enhance the music listening experience on mobile devices. Our previous player featured dual-band rendering: it delivered bass beats in music as rough superimposed vibrations, and high-frequency salient features as smooth high-frequency vibrations. This work extends the previous algorithm by taking auditory saliency into account when determining the intensity of the vibration to be rendered. Auditory saliency is estimated in real time from several auditory features of the music. We also tested the feasibility of multiband rendering using a wideband actuator. A user study evaluated the subjective performance of three haptic music playing modes: saliency-improved dual-band rendering, saliency-improved multiband rendering, and our previous dual-band rendering. Experimental results showed that the new dual-band mode has perceptual merits over both the multiband mode and the previous dual-band mode, particularly for rock and dance music. These results can contribute to enhancing the multimedia experience by means of vibrotactile rendering of music.
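The dual-band idea described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the filter cutoffs, filter order, and the RMS-energy stand-in for auditory saliency are all illustrative assumptions, and the function names (`band_split`, `vibration_amplitudes`) are hypothetical.

```python
import numpy as np
from scipy.signal import butter, lfilter

FS = 44100  # audio sample rate in Hz (assumption)


def band_split(x, low_cut=200.0, high_cut=2000.0, fs=FS):
    """Split an audio frame into a bass band and a high band.

    The cutoff frequencies are illustrative, not the paper's values.
    """
    b_lo, a_lo = butter(2, low_cut / (fs / 2), btype="low")
    b_hi, a_hi = butter(2, high_cut / (fs / 2), btype="high")
    return lfilter(b_lo, a_lo, x), lfilter(b_hi, a_hi, x)


def saliency(frame):
    """Crude stand-in for auditory saliency: clamped RMS energy.

    The paper estimates saliency from several auditory features;
    RMS energy is used here only to keep the sketch self-contained.
    """
    rms = float(np.sqrt(np.mean(frame ** 2)))
    return min(1.0, rms)


def vibration_amplitudes(x):
    """Map one audio frame to (bass, high-band) vibration intensities."""
    low, high = band_split(x)
    # Saliency of each band scales the intensity of its vibration:
    # rough superimposed vibration for bass, smooth for high frequencies.
    return saliency(low), saliency(high)
```

For a bass-heavy frame (e.g. a 100 Hz tone) the low-band amplitude dominates, while a high-pitched frame drives the high band instead, mirroring the two rendering channels described in the abstract.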


Keywords: Haptic I/O · Vibrotactile music · Multiband · Auditory saliency



This research was funded by the MSIP (Ministry of Science, ICT & Future Planning), Korea, through the ICT R&D Program 2014, and by a National Research Foundation (NRF) grant (No. 2013R1A2A2A01016907).



Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  1. Electronics and Telecommunications Research Institute (ETRI), Daejeon, Republic of Korea
  2. Pohang University of Science and Technology (POSTECH), Pohang, Republic of Korea
