Abstract
This article presents a Java-based mobile service that enables blind-environment interaction through voice-augmented objects. To make this possible, each object is tagged with a radio frequency identification (RFID) tag and a voice-based description of it is recorded. Blind users can later use the service to scan surrounding augmented objects, which then verbalize their identity and characteristics. We followed a user-centred design process to guarantee the accessibility of the service for visually impaired and blind people. The required hardware is a near field communication (NFC)-enabled mobile phone with a built-in accelerometer. The client-side application requires no button presses, menu browsing, or screen touches to select and activate any of the supported modes: registration, calibration, voice recording, physical object identification, deletion of voice recording(s), and cloud-based file sync and share. Twelve visually impaired individuals (aged 31–84; 6 men and 6 women) tested the service in two scenarios: (1) a comparison with the PenFriend labeling unit and (2) a user experience test. The results show that the selected tangible, multimodal interface (object touching, phone shaking and tilting, voice output) can be used very easily (58 %) or easily (33 %) by blind and visually impaired users with no previous experience of other mobile services. Most participants in the test group agreed that the service could be useful in their daily activities. The service can be used both at home and in public buildings to voice-describe objects such as food, medicines, books, clothes, cosmetics, CDs/DVDs, rooms, etc.
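The core interaction loop the abstract describes (touch a tagged object to hear its recorded description; shake the phone to change modes, so no buttons or screens are needed) can be outlined in a short Android sketch in Java. This is an illustrative approximation, not the paper's implementation: the audio directory, the tag-UID-to-file naming scheme, the 3GP recording format, and the shake threshold are all assumptions made for the example.

```java
import android.app.Activity;
import android.app.PendingIntent;
import android.content.Intent;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.media.MediaPlayer;
import android.nfc.NfcAdapter;
import android.nfc.Tag;
import android.os.Bundle;

import java.io.File;
import java.io.IOException;

/** Plays the voice note recorded for a touched NFC tag; a shake gesture
 *  (read from the accelerometer) switches service modes without any
 *  buttons, menus, or on-screen targets. */
public class VoiceTagActivity extends Activity implements SensorEventListener {

    // Illustrative storage layout: one audio file per tag UID (hex string).
    private static final File AUDIO_DIR = new File("/sdcard/voicetags");
    private static final float SHAKE_THRESHOLD = 2.5f; // in g; tune empirically

    private NfcAdapter nfc;
    private SensorManager sensors;
    private MediaPlayer player;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        nfc = NfcAdapter.getDefaultAdapter(this);
        sensors = (SensorManager) getSystemService(SENSOR_SERVICE);
    }

    @Override
    protected void onResume() {
        super.onResume();
        // Route all tag-discovered intents to this activity while foregrounded.
        PendingIntent pi = PendingIntent.getActivity(this, 0,
                new Intent(this, getClass()).addFlags(Intent.FLAG_ACTIVITY_SINGLE_TOP), 0);
        nfc.enableForegroundDispatch(this, pi, null, null);
        sensors.registerListener(this,
                sensors.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
                SensorManager.SENSOR_DELAY_NORMAL);
    }

    @Override
    protected void onPause() {
        super.onPause();
        nfc.disableForegroundDispatch(this);
        sensors.unregisterListener(this);
    }

    @Override
    protected void onNewIntent(Intent intent) {
        // Fires when the phone touches a tagged object.
        Tag tag = intent.getParcelableExtra(NfcAdapter.EXTRA_TAG);
        if (tag != null) playDescription(toHex(tag.getId()));
    }

    private void playDescription(String uid) {
        File audio = new File(AUDIO_DIR, uid + ".3gp");
        if (!audio.exists()) return; // unregistered object; a real app would announce this
        if (player != null) player.release();
        player = new MediaPlayer();
        try {
            player.setDataSource(audio.getAbsolutePath());
            player.prepare();
            player.start();
        } catch (IOException ignored) { }
    }

    @Override
    public void onSensorChanged(SensorEvent e) {
        // Shake detection: total acceleration well above 1 g means the phone was shaken.
        float gX = e.values[0] / SensorManager.GRAVITY_EARTH;
        float gY = e.values[1] / SensorManager.GRAVITY_EARTH;
        float gZ = e.values[2] / SensorManager.GRAVITY_EARTH;
        if (Math.sqrt(gX * gX + gY * gY + gZ * gZ) > SHAKE_THRESHOLD) {
            // Advance to the next mode (registration, recording, ...) and speak its name.
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }

    private static String toHex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }
}
```

The tilt gesture mentioned in the abstract would follow the same SensorEventListener pattern, comparing the orientation of the gravity vector rather than its magnitude.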
Cite this article
Ivanov, R. Blind-environment interaction through voice augmented objects. J Multimodal User Interfaces 8, 345–365 (2014). https://doi.org/10.1007/s12193-014-0166-z