ASSIST: Personalized Indoor Navigation via Multimodal Sensors and High-Level Semantic Information

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11134)


Blind & visually impaired (BVI) individuals and those with Autism Spectrum Disorder (ASD) each face unique challenges in navigating unfamiliar indoor environments. In this paper, we propose an indoor positioning and navigation system that guides a user from point A to point B indoors with high accuracy while augmenting their situational awareness. This system has three major components: location recognition (a hybrid indoor localization app that uses Bluetooth Low Energy beacons and Google Tango to provide high accuracy), object recognition (a body-mounted camera to provide the user momentary situational awareness of objects and people), and semantic recognition (map-based annotations to alert the user of static environmental characteristics). This system also features personalized interfaces built upon the unique experiences that both BVI and ASD individuals have in indoor wayfinding and tailors its multimodal feedback to their needs. Here, the technical approach and implementation of this system are discussed, and the results of human subject tests with both BVI and ASD individuals are presented. In addition, we discuss and show the system’s user-centric interface and present points for future work and expansion.
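The paper itself provides no implementation listing, but the hybrid-localization idea in the abstract — a coarse absolute fix from Bluetooth Low Energy beacons anchoring a precise relative track from a Tango-style visual-inertial tracker — can be illustrated with a minimal sketch. All names, the log-distance path-loss model, and the weighted-centroid fusion below are illustrative assumptions, not the authors' actual method:

```python
from dataclasses import dataclass

# Hypothetical sketch of the hybrid localization idea: BLE beacons give a
# coarse absolute fix on the building map, while a Tango-style tracker gives
# precise *relative* motion. The coarse fix anchors the relative track.

@dataclass
class Beacon:
    beacon_id: str
    x: float          # known map position (metres)
    y: float
    tx_power: float   # RSSI measured at 1 m (dBm), from calibration

def rssi_to_distance(rssi: float, tx_power: float, n: float = 2.0) -> float:
    """Log-distance path-loss model: d = 10^((tx_power - rssi) / (10 n))."""
    return 10 ** ((tx_power - rssi) / (10.0 * n))

def coarse_fix(readings: dict, beacons: dict) -> tuple:
    """Weighted centroid of beacon positions, weighted by 1/estimated distance."""
    wx = wy = wsum = 0.0
    for bid, rssi in readings.items():
        b = beacons[bid]
        w = 1.0 / max(rssi_to_distance(rssi, b.tx_power), 0.1)
        wx += w * b.x
        wy += w * b.y
        wsum += w
    return wx / wsum, wy / wsum

def fused_position(anchor: tuple, tango_track: list) -> tuple:
    """Anchor the latest relative (dx, dy) from the tracker at the coarse fix."""
    dx, dy = tango_track[-1]
    return anchor[0] + dx, anchor[1] + dy
```

In this toy fusion, the beacon fix corrects the drift that a purely relative tracker accumulates, while the tracker supplies the fine-grained motion between beacon updates — the division of labour the abstract attributes to the BLE/Tango hybrid.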


Indoor positioning · Environmental & situational awareness · Bluetooth beacons · Google Tango



This research was supported by the U.S. Department of Homeland Security (DHS) Science & Technology (S&T) Directorate, Office of University Programs, Summer Research Team Program for Minority Serving Institutions, administered by the Oak Ridge Institute for Science and Education (ORISE) under DOE contract #DE-AC05-06OR23100 and #DE-SC0014664. This work is also supported by the U.S. National Science Foundation (NSF) through Awards #EFRI-1137172, #CBET-1160046, and #CNS-1737533; the VentureWell (formerly NCIIA) Course and Development Program (Award #10087-12); a Bentley-CUNY Collaborative Research Agreement 2017–2020; and NYSID via the CREATE (Cultivating Resources for Employment with Assistive Technology) Program. We would like to thank the staff at Goodwill NY/NJ for their invaluable feedback and for recruiting subjects for our tests with autistic individuals. We would especially like to thank all of our subjects for their participation and cooperation as well as for providing extremely helpful feedback in improving our system.



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Computer Science, The City College of New York, New York, USA
  2. Department of Computer Science, CUNY Graduate Center, New York, USA
  3. Lighthouse Guild, New York, USA
