Improved Haptic Music Player with Auditory Saliency Estimation
This paper presents improvements to our previous haptic music player, which was designed to enhance the music listening experience on mobile devices. Our previous player featured dual-band rendering: it delivers the bass beats in music as rough superimposed vibrations and the high-frequency salient features as smooth high-frequency vibrations. This work extends the previous algorithm by taking auditory saliency into account when determining the intensity of the vibration to be rendered. The auditory saliency is estimated in real time from several auditory features of the music. We also tested the feasibility of multiband rendering using a wideband actuator. We carried out a user study to evaluate the subjective performance of three haptic music playing modes: saliency-improved dual-band rendering, saliency-improved multiband rendering, and our previous dual-band rendering. Experimental results showed that the new dual-band mode has perceptual merits over both the multiband mode and the previous dual-band mode, particularly for rock and dance music. These results can contribute to enhancing the multimedia experience by means of vibrotactile rendering of music.
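The dual-band idea described above can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the band cutoffs, frame size, and the single-feature saliency proxy (rectified envelope change, i.e. onset emphasis) are all assumptions, whereas the paper estimates saliency from several auditory features in real time.

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 44100  # audio sample rate in Hz (assumption)

def band_envelope(x, lo, hi, fs=FS, frame=1024):
    """Band-pass x to [lo, hi] Hz and return a per-frame RMS envelope."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    y = sosfilt(sos, x)
    n = len(y) // frame
    return np.sqrt(np.mean(y[: n * frame].reshape(n, frame) ** 2, axis=1))

def saliency(env):
    """Crude saliency proxy: half-wave-rectified envelope change
    (onset emphasis), normalized to [0, 1]. A stand-in for the
    multi-feature estimator used in the paper."""
    flux = np.maximum(np.diff(env, prepend=env[:1]), 0.0)
    return flux / (flux.max() + 1e-12)

def dual_band_intensity(x):
    """Map audio to two vibration-intensity streams: a low band for
    bass beats (the 'rough' channel) and a high band for salient
    high-frequency features (the 'smooth' channel), each scaled by
    its saliency estimate."""
    low = band_envelope(x, 20, 200)       # bass-beat band (assumed cutoff)
    high = band_envelope(x, 2000, 8000)   # high-frequency band (assumed cutoff)
    return low * saliency(low), high * saliency(high)

# Synthetic test signal: a 100 Hz bass tone for the first half-second,
# then a quieter 4 kHz burst for the second half-second.
t = np.arange(FS) / FS
x = np.where(t < 0.5,
             np.sin(2 * np.pi * 100 * t),
             0.3 * np.sin(2 * np.pi * 4000 * t))
bass_vib, treble_vib = dual_band_intensity(x)
```

With this input, the bass channel responds near the start of the signal and the treble channel responds near the 4 kHz onset, which is the separation the dual-band scheme relies on.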
Keywords: Haptic I/O · Vibrotactile music · Multiband · Auditory saliency
This research was funded by the MSIP (Ministry of Science, ICT & Future Planning), Korea, under the ICT R&D Program 2014, and by a National Research Foundation (NRF) grant (No. 2013R1A2A2A01016907).