Haptic and audio interaction design

Abstract

Haptics, audio and human–computer interaction are three scientific disciplines that share interests, issues and methodologies. Despite these common points, interactions between these communities are sparse, because each of them has its own publication venues, meeting places, etc. A venue to foster interaction between these three communities, the Haptic and Audio Interaction Design workshop (HAID), was created in 2006, aiming to provide a meeting place for researchers in these areas. HAID was held yearly from 2006 to 2013, then discontinued. Having worked at the intersection of these areas for several years, we felt the need to revive this event and decided to organize a HAID edition in 2019 in Lille, France. HAID 2019 was attended by more than 100 university, industry and artistic researchers and practitioners, showing the continued interest in such a unique venue. This special issue gathers extended versions of a selection of papers presented at the 2019 workshop. These papers cover several directions of research on haptics, audio and HCI, including perceptual studies and the design, evaluation and use of vibrotactile and force-feedback devices in audio, musical and game applications.

Introduction

Research and development across haptics, audio/music and human–computer interaction (HCI) have a long history, going back several decades. Important work focusing on the intersection of these areas has been carried out since the late 1970s at the Association pour la Création et la Recherche sur les Outils d'Expression (ACROE), in Grenoble, France [5, 6]. Similarly, interest in haptics has been a constant in the computer music community, where new musical instruments and interactive interfaces with haptic feedback were frequently proposed, with early examples by Florens [9], Gillespie [12], Chafe [7], Bongers [2], Chu [8], O'Modhrain and Chafe [18], Rovan and Hayward [22], and Florens and Henry [10], among many others.

The HCI community has also shown early interest in audio and haptics. In the late 1980s, Gaver had the idea of using semaphoric sounds to encode information [11]. A few years later, Brewster at the Glasgow Interactive Systems Group (GIST) used musical notations to design Earcons [4], later applied the same concept to create vibrotactile icons [3], and finally combined the two to create cross-modal icons [14]. The same concept was also applied to pin-array actuators [21] and force-feedback [20].

The HAID workshop

This shared interest in interaction, audio and haptics led S. Brewster to create the HAID workshop in 2006, aiming to bring together researchers and practitioners interested in exploring how the haptic and audio modalities can be used together in human–computer interaction. While the audio, haptics and interaction communities all have their own venues, HAID was created as a unique meeting point for exchanges amongst the three communities. HAID workshops were held annually from 2006 to 2013, then discontinued.

Related events

Since 2013, a few events have specifically addressed haptics and audio/music. Most notably, the workshop "Haptics and Musical Practice: From audio-tactile psychophysics to the design and evaluation of digital musical interfaces", organized by S. Papetti, took place in Zurich in February 2016. A related workshop, "Musical Haptics: Use and relevance of haptic feedback in musical practice", took place during the Eurohaptics Conference in July 2016. The recent book Musical Haptics, edited by Papetti and Saitis [19], is a follow-up of the Eurohaptics 2016 workshop and provides an excellent review of the state of the art in the use of haptics in musical applications.

Nevertheless, research on haptics and audio/music currently appears dominated by tactile interfaces, due in part to the widespread availability of vibrotactile feedback in portable devices and of inexpensive, reliable vibrotactile actuators. Although a body of work on measurements, models and applications has developed over the years, musical force-feedback has never become as widespread; the game-changing force-feedback musical application is yet to come. Despite this situation, in recent years a number of works have addressed several aspects of this topic, proposing software platforms and simulation models with the potential to provide popular and/or advanced force-feedback tools for musical applications [1, 15, 16, 23, 24].

The symposium "Force-Feedback and Music", organized by M. Wanderley and held in Montreal in December 2016 (see Note 1), specifically addressed the use of force-feedback devices in audio and musical applications.

HAID 2019: Academia, arts and industry

HAID 2019 (see Note 2) was organized as a meeting place for exchanges among the audio, haptics and HCI research and practitioner communities, but also as a means to reach industries involved in the design of haptic and audio applications.

The first day of the workshop was dedicated to bringing together academia and industry. The Lille metropolis has two incubators for startups in technology and the creative industries, as well as a vibrant artistic community, making it an ideal location for a HAID revival. Thanks to the workshop, several of these startups specializing in music technologies were able to meet researchers and other international companies. The last two days focused on academic exchanges and hands-on demos, allowing participants to interact during the various paper, demo and workshop sessions.

Featured work

This special issue brings together extended versions of a selection of HAID 2019 papers. We present here a teaser for each of the five selected articles.

Multisensory instrumental dynamics as an emergent paradigm for digital musical creation. In this paper, the authors are interested in concepts and technologies for the creation of Digital Musical Instruments (DMIs). In particular, they study the close relation between haptics and audio in such systems. They provide an overview of artistic creations from the last 20 years that rely on this concept, and discuss the remaining challenges in this area.

Tactile discrimination of material properties: application to virtual buttons for professional appliances. In this study, the authors leverage sound and vibrations to simulate contact with materials such as wood, plastic or metal. They present a user study in which participants were able to recognize the simulated materials, and discuss the use of such feedback in the design of buttons for professional appliances.

Psychophysical comparison of the auditory and tactile perception: a survey. This paper presents an overview of dozens of works on either auditory or tactile perception, presenting their results in a uniform, coherent way. It allows a direct comparison of the similarities and differences between the two perceptual channels and will surely become a very useful tool for teaching auditory and haptic perception.

Defining a vibrotactile toolkit for digital musical instruments: Characterizing voice coil actuators, effects of loading, and equalization of the frequency response. This work addresses the need for a vibrotactile toolkit to be used for displaying vibrations in musical contexts. The authors evaluate designs for the display of a large range of frequencies with independent amplitude control and low distortion, proposing means to compensate for actuators’ typically non-flat frequency responses and for the effect of positioning actuators in DMIs.

Developing a mobile activity game for stroke survivors: Lessons learned. In this paper, the authors studied the need for stroke survivors to adopt a healthier lifestyle and exercise more. Because typical exergame apps are not suited to stroke survivors' health conditions, the authors propose an app targeted at them. They present the design and implementation of the app, which features multimodal feedback (visual, audio and vibrotactile), and discuss lessons learned during the design process.

Conclusion and the future

The revival of HAID was a success, with more than a hundred attendees over the three days of the event. Participants of previous editions were delighted to return to such a forum, while new attendees enjoyed the interdisciplinary focus of the workshop. The industry day was a key factor in HAID 2019's success, while the two days of scientific presentations allowed researchers to exchange ideas about their work and to experience the variety of demos presented during the workshop.

One initial intent of HAID 2019 was to help foster research on force-feedback and music. We believe that reference reviews of technologies and methodologies for the design and implementation of force-feedback devices and applications [13, 17], as well as recent open platforms such as Hapkit (see Note 3) and Haply (see Note 4), may have a positive influence on future research and development in this direction.

We are pleased to pass the torch to the organizers of HAID 2020 (see Note 5), I. Frissen, C. Frisson and V. Lévesque, who did a great job organizing the follow-up workshop in such challenging times. We encourage volunteers considering the organization of future HAID editions to contact us and the HAID 2020 organizers to plan upcoming events, so that this unique series of events may continue.

Notes

  1. https://www.cirmmt.org/activities/workshops/research/ffedback_music.
  2. https://haid2019.lille.inria.fr/.
  3. http://hapkit.stanford.edu/.
  4. https://www.haply.co/.
  5. http://haid2020.etsmtl.ca/.

References

  1. Berdahl E, Kontogeorgakopoulos A (2013) The FireFader: simple, open-source, and reconfigurable haptic force feedback for musicians. Comput Music J 37(1):23–34
  2. Bongers B (1994) The use of active tactile and force feedback in timbre controlling electronic instruments. In: Proceedings of the 1994 international computer music conference (ICMC). International Computer Music Association (ICMA)
  3. Brewster S, Brown LM (2004) Tactons: structured tactile messages for non-visual information display. In: Proceedings of the fifth conference on Australasian user interface—volume 28 (AUIC). Australian Computer Society, Inc., pp 15–23
  4. Brewster SA, Wright PC, Edwards ADN (1993) An evaluation of Earcons for use in auditory human–computer interfaces. In: Proceedings of the INTERACT '93 and CHI '93 conference on human factors in computing systems. Association for Computing Machinery, New York, NY, USA, pp 222–227
  5. Cadoz C, Luciani A, Florens JL (1984) Responsive input devices and sound synthesis by simulation of instrumental mechanisms: the Cordis system. Comput Music J 8(3):60–73
  6. Cadoz C, Lisowski L, Florens JL (1990) A modular feedback keyboard design. Comput Music J 14(2):47–56
  7. Chafe C (1993) Tactile audio feedback. In: Proceedings of the 1993 international computer music conference (ICMC). International Computer Music Association (ICMA)
  8. Chu L (1996) Haptic feedback in computer music performance. In: Proceedings of the 1996 international computer music conference (ICMC). International Computer Music Association (ICMA)
  9. Florens JL (2015) Coupleur gestuel interactif pour la commande et le contrôle de sons synthétisés en temps réel. PhD thesis, Institut National Polytechnique de Grenoble
  10. Florens JL, Henry C (2001) Bowed string synthesis with force-feedback gesture interaction. In: Proceedings of the 2001 international computer music conference (ICMC). International Computer Music Association (ICMA), pp 115–118
  11. Gaver WW (1989) The SonicFinder: an interface that uses auditory icons. Hum Comput Interact 4:67–94
  12. Gillespie B (1992) The touchback keyboard. In: Proceedings of the 1992 international computer music conference (ICMC). International Computer Music Association (ICMA), pp 77–80
  13. Hayward V, MacLean KE (2007) Do it yourself haptics: part I. IEEE Robot Autom Mag 14(4):88–104
  14. Hoggan E, Brewster S (2007) Designing audio and tactile crossmodal icons for mobile devices. In: Proceedings of the 9th international conference on multimodal interfaces (ICMI). Association for Computing Machinery, New York, NY, USA, pp 162–169
  15. Kirkegaard M, Frisson C, Wanderley MM (2020) TorqueTuner: a self-contained module for designing rotary haptic force feedback for digital musical instruments. In: Proceedings of the 2020 international conference on new interfaces for musical expression (NIME)
  16. Leonard J (2015) La plateforme MSCI : vers un outil de création d'instruments virtuels à retour d'effort. Application à la création musicale. PhD thesis, Institut National Polytechnique de Grenoble
  17. MacLean KE, Hayward V (2008) Do it yourself haptics: part II [tutorial]. IEEE Robot Autom Mag 15(1):104–119
  18. O'Modhrain MS, Chafe C (2000) The performer–instrument interaction: a sensory motor perspective. In: Proceedings of the 2000 international computer music conference (ICMC). International Computer Music Association (ICMA)
  19. Papetti S, Saitis C (eds) (2018) Musical haptics. Springer
  20. Pietrzak T, Martin B, Pecci I (2005) Information display by dragged haptic bumps. In: Proceedings of Enactive/05, Genoa, Italy
  21. Pietrzak T, Crossan A, Brewster SA, Martin B, Pecci I (2009) Creating usable pin array tactons for non-visual information. IEEE Trans Haptics 2(2):61–72
  22. Rovan J, Hayward V (2000) Typology of tactile sounds and their synthesis in gesture-driven computer music performance. In: Trends in gestural control of music. Ircam—Centre Pompidou
  23. Sinclair S, Wanderley MM (2009) A run-time programmable simulator to enable multi-modal interaction with rigid-body systems. Interact Comput 21:54–63
  24. Sinclair S, Wanderley MM, Hayward V (2014) Velocity estimation algorithms for audio-haptic simulations involving stick-slip. IEEE Trans Haptics 7(4):533–544

Acknowledgements

We would like to thank the participants of the HAID 2019 workshop, as well as the various workshop sponsors. Thanks also to the HAID 2019 co-organizers and program committee, as well as the reviewers of the revised versions of papers in this edition. We are grateful to Inria Lille - Nord Europe (https://www.inria.fr/fr/centre-inria-lille-nord-europe), which provided invaluable support for organizing the 2019 workshop. We warmly thank Plaine Images (https://www.plaine-images.fr/) for hosting and advertising the event. We would also like to thank CRIStAL (http://cristal.univ-lille.fr/) and I-Site ULNE (http://www.isite-ulne.fr/) for partially funding the event. This work was made possible by an Inria International Chair awarded to the second guest editor.

Author information

Corresponding author

Correspondence to Thomas Pietrzak.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Pietrzak, T., Wanderley, M.M. Haptic and audio interaction design. J Multimodal User Interfaces 14, 231–233 (2020). https://doi.org/10.1007/s12193-020-00344-w

Keywords

  • Haptics
  • Audio
  • Interaction
  • Music
  • Human–computer interaction
  • Design