Intelligent Service Robotics, Volume 1, Issue 3, pp 237–251

Robot-assisted shopping for the blind: issues in spatial cognition and product selection

Original Research Paper

Abstract

Research on spatial cognition and blind navigation suggests that a device intended to help blind people shop independently should provide the shopper with effective interfaces to the locomotor and haptic spaces of the supermarket. In this article, we argue that robots can act as effective interfaces to the haptic and locomotor spaces of modern supermarkets. We also present the design and evaluation of three product selection modalities (browsing, typing, and speech) that allow the blind shopper to select a desired product from a repository of thousands of products.
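To make the three selection modalities concrete, the following is a minimal sketch, not the paper's actual implementation, of how browsing, typing, and speech input might be served by a single product repository. The Product and ProductRepository classes, the shelf_location field, and the matching strategies (category listing, prefix match, keyword match on a recognized utterance) are all illustrative assumptions rather than details taken from the article.

```python
# Illustrative sketch only: a hypothetical product repository that supports the
# three selection modalities described in the abstract (browsing, typing, speech).
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Product:
    name: str
    category: str
    shelf_location: str  # hypothetical locomotor-space target for the robot


class ProductRepository:
    def __init__(self, products: List[Product]):
        # Index products by lowercased name and group them by category.
        self._by_name: Dict[str, Product] = {p.name.lower(): p for p in products}
        self._by_category: Dict[str, List[Product]] = {}
        for p in products:
            self._by_category.setdefault(p.category, []).append(p)

    def browse(self, category: str) -> List[str]:
        # Browsing: the shopper steps through a category and hears its items.
        return [p.name for p in self._by_category.get(category, [])]

    def type_ahead(self, prefix: str) -> List[str]:
        # Typing: incremental prefix match against product names.
        prefix = prefix.lower()
        return [p.name for key, p in self._by_name.items() if key.startswith(prefix)]

    def speech_query(self, utterance: str) -> List[str]:
        # Speech: a recognized utterance is treated as a bag of keywords.
        words = set(utterance.lower().split())
        return [p.name for key, p in self._by_name.items() if words & set(key.split())]


if __name__ == "__main__":
    repo = ProductRepository([
        Product("peanut butter", "spreads", "aisle 4, shelf 2"),
        Product("strawberry jam", "spreads", "aisle 4, shelf 3"),
        Product("whole wheat bread", "bakery", "aisle 1, shelf 1"),
    ])
    print(repo.browse("spreads"))         # ['peanut butter', 'strawberry jam']
    print(repo.type_ahead("pea"))         # ['peanut butter']
    print(repo.speech_query("some jam"))  # ['strawberry jam']
```

Whatever the concrete data structures, the design point is that all three modalities resolve to the same repository lookup, so the result can be handed to the robot as a single navigation target.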

Keywords

Assistive robotics · Service robotics · Human–robot interaction · Blind navigation · Spatial cognition · Haptic and locomotor interfaces · Independent shopping for the visually impaired



Copyright information

© Springer-Verlag 2008

Authors and Affiliations

Computer Science Assistive Technology Laboratory, Department of Computer Science, Utah State University, Logan, USA
