Human-Computer Interaction – INTERACT 2015, pp 72–79

Exploring Map Orientation with Interactive Audio-Tactile Maps

  • Alistair D. N. Edwards
  • Nazatul Naquiah Abd Hamid
  • Helen Petrie
Conference paper

DOI: 10.1007/978-3-319-22701-6_6

Part of the Lecture Notes in Computer Science book series (LNCS, volume 9296)
Cite this paper as:
Edwards A.D.N., Hamid N.N.A., Petrie H. (2015) Exploring Map Orientation with Interactive Audio-Tactile Maps. In: Abascal J., Barbosa S., Fetter M., Gross T., Palanque P., Winckler M. (eds) Human-Computer Interaction – INTERACT 2015. INTERACT 2015. Lecture Notes in Computer Science, vol 9296. Springer, Cham

Abstract

Multi-modal interactive maps can provide a useful aid to navigation for blind people. We have been experimenting with such maps that present information in a tactile and auditory (speech) form, but with the novel feature that the map’s orientation is tracked. This means that the map can be explored in a more ego-centric manner, as favoured by blind people. Results are encouraging, in that scores in an orientation task are better with the use of map rotation.

Keywords

Multi-modal maps · Blind people · Tactile · Speech · Rotation

Copyright information

© IFIP International Federation for Information Processing 2015

Authors and Affiliations

  • Alistair D. N. Edwards (1)
  • Nazatul Naquiah Abd Hamid (1)
  • Helen Petrie (1)
  1. Department of Computer Science, University of York, York, UK
