FingerReader: A Finger-Worn Assistive Augmentation

Chapter in: Assistive Augmentation

Part of the book series: Cognitive Science and Technology (CSAT)

Abstract

The FingerReader is a finger-augmenting device equipped with a camera that assists in pointing and touching tasks. Although it was initially designed as an assistive device for sightless reading of printed text, it has since been extended to support other activities, such as reading music. User studies and quantitative assessments show the FingerReader to be an intuitive interface for accessing pointable visual material. This chapter discusses the origins, design rationale, iterations, applications, and evaluations of the FingerReader, providing an encompassing overview of the past four years of work on the device.
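The abstract does not detail the reading pipeline, but the core idea of camera-based, point-and-read text access can be illustrated with a small sketch. The Python snippet below is an illustration only, not the FingerReader's actual implementation: it assumes an ordinary webcam standing in for the finger-worn camera, uses off-the-shelf libraries (OpenCV, the pytesseract binding to Tesseract, and pyttsx3 for speech), and simply OCRs a fixed band near the frame center instead of tracking the fingertip and text line as the real device does.

    # A minimal "point-and-read" sketch: grab a frame from a camera, OCR a small
    # region near the presumed fingertip, and speak any recognized text aloud.
    # Requires: opencv-python, pytesseract (plus a Tesseract install), pyttsx3.
    # This is a simplified stand-in, not the FingerReader's own pipeline.

    import cv2                 # camera capture and image preprocessing
    import pytesseract         # off-the-shelf OCR engine binding
    import pyttsx3             # offline text-to-speech


    def read_text_near_fingertip(frame, band_height=60):
        """OCR a horizontal band around the frame center.

        In this sketch the fingertip is assumed to sit below the frame center,
        so the band just above the center stands in for the text line the user
        is pointing at.
        """
        h, w = frame.shape[:2]
        top = max(0, h // 2 - band_height // 2)
        band = frame[top:top + band_height, :]

        # Binarize to help OCR cope with uneven close-up lighting.
        gray = cv2.cvtColor(band, cv2.COLOR_BGR2GRAY)
        _, binary = cv2.threshold(gray, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)

        return pytesseract.image_to_string(binary).strip()


    def main():
        cap = cv2.VideoCapture(0)          # any webcam stands in for the finger camera
        tts = pyttsx3.init()

        ok, frame = cap.read()
        if not ok:
            raise RuntimeError("could not read a frame from the camera")

        text = read_text_near_fingertip(frame)
        if text:
            tts.say(text)                  # speak the recognized text
            tts.runAndWait()

        cap.release()


    if __name__ == "__main__":
        main()

The actual FingerReader additionally tracks the finger and the printed line, extracts words sequentially as the finger sweeps across the page, and uses audio and haptic cues to keep the user on the line; none of that guidance loop is modeled in this sketch.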

Notes

  1. The Library of Congress lists 70 braille music transcribers US-wide: http://www.loc.gov/nls/music/circular4.html.

  2. http://www.visiv.co.uk/.

  3. http://www.musitek.com/.

Acknowledgements

We would like to acknowledge the people who were directly involved in the ideation, creation, and evaluation of the FingerReader: Connie Liu, Sophia Wu, Marcelo Polanco, Michael Chang, Sabrine Iqbal, Amit Zoran, and K.P. Yao. We also thank the VIBUG group at MIT for their help in testing and improving the FingerReader.

Author information

Corresponding author

Correspondence to Roy Shilkrot.

Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this chapter

Cite this chapter

Shilkrot, R., Huber, J., Boldu, R., Maes, P., Nanayakkara, S. (2018). FingerReader: A Finger-Worn Assistive Augmentation. In: Huber, J., Shilkrot, R., Maes, P., Nanayakkara, S. (eds) Assistive Augmentation. Cognitive Science and Technology. Springer, Singapore. https://doi.org/10.1007/978-981-10-6404-3_9

  • DOI: https://doi.org/10.1007/978-981-10-6404-3_9

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-10-6402-9

  • Online ISBN: 978-981-10-6404-3

  • eBook Packages: Engineering (R0)
