A Hassle-Free Shopping Experience for the Visually Impaired: An Assistive Technology Application

  • Sherin Tresa Paul
  • Kumudha Raimond
Conference paper
Part of the Lecture Notes in Computational Vision and Biomechanics book series (LNCVB, volume 31)

Abstract

The issues and challenges faced by the Visually Impaired (VI) create a profound inequality in living conditions that demands immediate attention. The objective of this work is to provide an in-depth understanding of some of the issues faced by the VI and to propose an Assistive Technology (AT) framework that enables a smooth and carefree shopping experience: navigation in a supermarket to the sections specified by the VI user, identification of objects of interest, hassle-free billing, and registration for a subscription-based offer notification system built on the user's shopping habits and preferences. The ultimate goal is to improve the Quality of Life (QoL) of the VI person in as many ways as possible.
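The navigation component described above, taken together with the BLE beacons listed in the keywords, suggests a proximity-based scheme in which each supermarket section is tagged with a beacon and the user's device announces the section whose signal is strongest. The sketch below is an illustrative assumption, not the paper's implementation: the beacon identifiers, section names, and RSSI values are hypothetical.

```python
# Hedged sketch: nearest-section detection from BLE beacon RSSI readings.
# Beacon IDs, section names, and dBm values are illustrative only.

BEACON_SECTIONS = {
    "beacon-01": "Produce",
    "beacon-02": "Dairy",
    "beacon-03": "Bakery",
}

def nearest_section(rssi_readings):
    """Return the section of the strongest beacon signal.

    RSSI is measured in dBm and is negative; the value closest to zero
    (the maximum) corresponds to the nearest beacon.
    """
    if not rssi_readings:
        return None
    strongest = max(rssi_readings, key=rssi_readings.get)
    return BEACON_SECTIONS.get(strongest, "Unknown")

# Simulated scan: beacon-02 is closest, so its signal is strongest.
readings = {"beacon-01": -82, "beacon-02": -60, "beacon-03": -75}
print(nearest_section(readings))  # Dairy
```

In practice a system like this would smooth RSSI over several scans before switching sections, since raw readings fluctuate with body occlusion and multipath; the single-scan maximum here is kept only for brevity.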

Keywords

Assistive technology for Visually Impaired (VI) · Blindness · Quality of Life (QoL) · Image processing · Pattern recognition · IoT · Optical head-mounted displays · BLE beacons · Inertial Measurement Unit (IMU)


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Karunya University, Coimbatore, India