
One-Shot Wayfinding Method for Blind People via OCR and Arrow Analysis with a 360-Degree Smartphone Camera

Conference paper, in: Mobile and Ubiquitous Systems: Computing, Networking and Services (MobiQuitous 2021)

Abstract

We present a wayfinding method that assists blind people in determining the correct direction to a destination by taking a one-shot image. Signage is standard in public buildings and helps sighted visitors orient themselves, but offers little benefit to blind people. Our one-shot wayfinding method recognizes surrounding signage in all directions from an equirectangular image captured with a 360-degree smartphone camera. The method analyzes the relationship between detected text and arrows on signage and estimates the correct direction toward the user's destination. In other words, it enables wayfinding for blind users without requiring either environmental modifications (e.g., Bluetooth beacons) or prepared map data. In a user study, we compared our method with a baseline method: a signage reader using a smartphone camera with a standard field of view. We found that our method enabled the participants to decide directions more efficiently than the baseline method did.
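The geometric core of the approach, mapping signage detections in an equirectangular image to a walking direction, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the `Box` type, the arrow labels, and the `destination_bearing` helper are hypothetical, and a real pipeline would need the OCR, arrow detection, and signage layout analysis the paper describes. In an equirectangular panorama of width `W`, the horizontal pixel coordinate maps linearly to a yaw angle around the camera; an arrow paired with the destination text then offsets that bearing.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """A detected region: OCR text or an arrow (hypothetical type)."""
    x: float      # horizontal center, pixels
    y: float      # vertical center, pixels
    label: str    # OCR text, or arrow direction: "left", "right", "up"

def yaw_deg(x: float, width: float) -> float:
    """Map a horizontal pixel in an equirectangular image to a bearing
    relative to the camera front, in degrees (-180 .. 180)."""
    return (x / width) * 360.0 - 180.0

# An arrow pointing left/right on a sign facing the user shifts the
# suggested walking direction by roughly +/-90 degrees; "up" means ahead.
ARROW_OFFSET = {"left": -90.0, "right": 90.0, "up": 0.0}

def destination_bearing(texts, arrows, keyword, width):
    """Find the text box matching the destination keyword, pair it with
    the nearest arrow box, and combine sign bearing with arrow direction.
    Returns a bearing in degrees, or None if nothing matches."""
    matches = [t for t in texts if keyword.lower() in t.label.lower()]
    if not matches or not arrows:
        return None
    text = matches[0]
    arrow = min(arrows, key=lambda a: (a.x - text.x) ** 2 + (a.y - text.y) ** 2)
    bearing = yaw_deg(text.x, width) + ARROW_OFFSET.get(arrow.label, 0.0)
    return (bearing + 180.0) % 360.0 - 180.0  # wrap to [-180, 180)
```

For example, on a 1920-pixel-wide panorama, a "Gate A" sign at the image center (directly ahead) with a right-pointing arrow yields a bearing of 90 degrees, i.e., "turn right." Pairing each text with its nearest arrow is only a crude stand-in for the relationship analysis the method actually performs on signs that list several destinations.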


Notes

  1. https://www.insta360.com/product/insta360-one/
  2. https://support.apple.com/kb/sp705
  3. https://www.flickr.com/services/api/
  4. https://cloud.google.com/vision/docs/ocr/
  5. All communication with the participants was in their native language. In this paper, translated content is indicated in the form of "translated content."


Acknowledgments

We would like to thank all participants who took part in our user study. We would also like to thank Japan Airport Terminal Co., Ltd. and East Japan Railway Company. This work was supported by AMED (JP20dk0310108, JP21dk0310108h0002), JSPS KAKENHI (JP20J23018), and a Grant-in-Aid for Young Scientists (Early Bird, Waseda Research Institute for Science and Engineering, BD070Z003100).

Author information

Corresponding author: Yutaro Yamanaka.


Copyright information

© 2022 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering

About this paper


Cite this paper

Yamanaka, Y., Kayukawa, S., Takagi, H., Nagaoka, Y., Hiratsuka, Y., Kurihara, S. (2022). One-Shot Wayfinding Method for Blind People via OCR and Arrow Analysis with a 360-Degree Smartphone Camera. In: Hara, T., Yamaguchi, H. (eds) Mobile and Ubiquitous Systems: Computing, Networking and Services. MobiQuitous 2021. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 419. Springer, Cham. https://doi.org/10.1007/978-3-030-94822-1_9

  • DOI: https://doi.org/10.1007/978-3-030-94822-1_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-94821-4

  • Online ISBN: 978-3-030-94822-1

  • eBook Packages: Computer Science (R0)
