
Improved Haptic Music Player with Auditory Saliency Estimation

  • Conference paper
  • First Online:
Haptics: Neuroscience, Devices, Modeling, and Applications (EuroHaptics 2014)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 8618)

Abstract

This paper presents improvements to our previous haptic music player, which was designed to enhance the music listening experience on mobile devices. The previous player featured dual-band rendering: it delivered the bass beats in music as rough superimposed vibrations and high-frequency salient features as smooth high-frequency vibrations. This work extends that algorithm by taking auditory saliency into account when determining the intensity of the rendered vibration. The auditory saliency is estimated in real time from several auditory features of the music. We also tested the feasibility of multiband rendering using a wideband actuator. A user study evaluated the subjective performance of three haptic music playing modes: saliency-improved dual-band rendering, saliency-improved multiband rendering, and our previous dual-band rendering. The results showed that the new dual-band mode has perceptual merits over both the multiband mode and the previous dual-band mode, particularly for rock and dance music. These findings can contribute to enhancing the multimedia experience through vibrotactile rendering of music.
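As context for the rendering pipeline the abstract describes, the sketch below illustrates one way a saliency-weighted dual-band mapping could be structured. It is an assumption-laden illustration only: the band edges, the 20 ms framing, the energy-based saliency proxy, and the mapping from band envelopes to vibration intensity are placeholders chosen here, not the paper's actual algorithm (which estimates saliency in real time from several auditory features).

# Illustrative sketch only; filter bands, framing, and the saliency proxy
# are assumptions made here, not the method described in the paper.
import numpy as np
from scipy.signal import butter, sosfilt

def band_envelope(audio, fs, low_hz, high_hz, frame_s=0.02):
    """Band-pass the signal and return its frame-wise RMS envelope."""
    sos = butter(4, [low_hz, high_hz], btype="bandpass", fs=fs, output="sos")
    band = sosfilt(sos, audio)
    frame = int(frame_s * fs)
    n = len(band) // frame
    frames = band[: n * frame].reshape(n, frame)
    return np.sqrt(np.mean(frames ** 2, axis=1))

def saliency_weight(env, eps=1e-9):
    """Crude saliency proxy: how much each frame stands out from the mean energy."""
    return np.clip(env / (np.mean(env) + eps), 0.0, 2.0)

def dual_band_intensities(audio, fs=44100):
    """Map a mono track (float samples in [-1, 1]) to two vibration-intensity
    streams: a bass-band stream (rough vibration) and a high-band stream
    (smooth vibration), each scaled by the saliency proxy above."""
    bass_env = band_envelope(audio, fs, 30.0, 200.0)     # bass beats
    high_env = band_envelope(audio, fs, 2000.0, 6000.0)  # high-frequency salient features
    bass = np.clip(bass_env * saliency_weight(bass_env), 0.0, 1.0)
    high = np.clip(high_env * saliency_weight(high_env), 0.0, 1.0)
    return bass, high

In an actual player, such streams would drive one or two vibrotactile actuators at the frame rate; the multiband variant evaluated in the paper on a wideband actuator is not covered by this sketch.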



Acknowledgment

This research was funded by the MSIP (Ministry of Science, ICT & Future Planning), Korea, under the ICT R&D Program 2014 and by a National Research Foundation (NRF) grant (No. 2013R1A2A2A01016907).

Author information


Corresponding author

Correspondence to Inwook Hwang.



Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Hwang, I., Choi, S. (2014). Improved Haptic Music Player with Auditory Saliency Estimation. In: Auvray, M., Duriez, C. (eds) Haptics: Neuroscience, Devices, Modeling, and Applications. EuroHaptics 2014. Lecture Notes in Computer Science, vol 8618. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-44193-0_30


  • DOI: https://doi.org/10.1007/978-3-662-44193-0_30

  • Published:

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-662-44192-3

  • Online ISBN: 978-3-662-44193-0

  • eBook Packages: Computer Science, Computer Science (R0)
