AI & SOCIETY, Volume 33, Issue 4, pp 583–597

Assistive Device Art: aiding audio spatial location through the Echolocation Headphones

  • Aisen C. Chacin
  • Hiroo Iwata
  • Victoria Vesna
Open Forum

Abstract

Assistive Device Art (ADA) derives from the integration of Assistive Technology and Art, mediating sensorimotor function and perception through both psychophysical methods and the conceptual mechanics of sensory embodiment. This paper describes the concept of ADA and its origins by observing the phenomena that surround the aesthetics of prosthesis-related art. It also analyzes one case study, the Echolocation Headphones, relating its provenance and performance to this new conceptual and psychophysical approach to tool design. This ADA tool is designed to aid human echolocation; it facilitates the experience of sonic vision as a way of reflecting on and learning about the construct of our spatial perception. The Echolocation Headphones are a pair of opaque goggles that disable the participant's vision. The device emits a focused sound beam that activates the space with directional acoustic reflection, giving the user the ability to navigate and perceive space through audition. The directional properties of parametric sound provide the participant with a focal echo, similar to the focal point of vision. This study analyzes the effectiveness of this wearable sensory extension for aiding auditory spatial location in three experiments: optimal sound type and distance for object location, perceptual resolution by just-noticeable difference, and goal-directed spatial navigation for open-pathway detection, all conducted at the Virtual Reality Lab of the University of Tsukuba, Japan. The Echolocation Headphones are designed for a diverse participant base: they have the potential both to aid auditory spatial perception for the visually impaired and to train sighted individuals in gaining human echolocation abilities. Furthermore, this Assistive Device artwork prompts participants to contemplate the plasticity of their sensorimotor architecture.
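The spatial principle behind echolocation described above is that a reflected sound's round-trip delay encodes the distance to the reflecting surface. As a minimal illustrative sketch (not code from the paper; the function name and the nominal speed of sound are assumptions for illustration):

```python
# Sketch: distance to a reflecting object from an echo's round-trip delay.
# Sound travels to the surface and back, so one-way distance is half
# the total path. Speed of sound is taken as ~343 m/s in air at 20 C.
SPEED_OF_SOUND_M_S = 343.0

def echo_distance(delay_s: float, speed: float = SPEED_OF_SOUND_M_S) -> float:
    """Return the one-way distance (m) for a given echo delay (s)."""
    if delay_s < 0:
        raise ValueError("delay must be non-negative")
    return speed * delay_s / 2.0

# Example: a 10 ms echo delay corresponds to about 1.7 m.
print(echo_distance(0.010))
```

Human echolocators exploit this relation implicitly, along with intensity and timbre cues; the headphones' directional beam constrains which surface produces the dominant echo, which is what gives the "focal" quality described in the abstract.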

Keywords

Assistive Technology · Human echolocation · Device Art · Sensory substitution

Notes

Acknowledgements

The research and prototyping conducted for this manuscript were supported by the Empowerment Informatics department at the University of Tsukuba; the University of California, Los Angeles; and the National Science Foundation. This research was made possible by the careful consideration and advisement of Hiroo Iwata, Victoria Vesna, and Hiroaki Yano, and with the editing help of Tyson Urich and Nicholas Spencer.


Copyright information

© Springer-Verlag London Ltd. 2017

Authors and Affiliations

  1. Empowerment Informatics, School of Integrative and Global Majors (SIGMA), University of Tsukuba, Tsukuba, Japan
  2. UCLA Design Media Arts, Broad Art Center, Los Angeles, USA
