Simulations of Prosthetic Vision

  • Michael P. Barry
  • Gislin Dagnelie


Simulations of prosthetic vision can provide requirements and specifications for prosthesis designs and stimulus conditions; these requirements are expected to differ according to the visual task. Studies reviewed here include examinations of visual acuity, reading, face and object recognition, hand–eye coordination, wayfinding, visual tracking, and simple design feasibility. Based on these studies, visual acuity with prosthetic vision seems to depend most on the resolution of perceived phosphenes. Given usable visual acuity, all visual tasks that have been evaluated in simulations with variable dot counts show a significant dependence on the number of simulated phosphenes provided. Some tasks also have additional, task-specific dependencies: face recognition is quite sensitive to the number of gray levels and to the relative size of dots and spacing, while wayfinding depends most on the angle of view captured by the camera. In many of the simulation studies, practice was found to be an important factor in successful task performance. As visual prosthesis development becomes less limited by technological barriers, findings from simulation studies may become increasingly important for the design of implants and rehabilitation programs.
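The simulation paradigm shared by these studies can be illustrated with a minimal sketch: an input image is downsampled to a phosphene grid, quantized to a fixed number of gray levels, and rendered as discrete dots whose diameter-to-spacing ratio can be varied. The function name and parameter choices below are illustrative, not taken from any of the cited studies.

```python
import numpy as np

def simulate_phosphenes(image, grid=(16, 16), gray_levels=4, dot_ratio=0.5):
    """Pixelized prosthetic-vision simulation (illustrative sketch).

    image:       2-D float array with values in [0, 1]
    grid:        phosphene array size (rows, cols)
    gray_levels: number of distinct brightness levels per phosphene
    dot_ratio:   dot diameter as a fraction of the cell spacing
    """
    h, w = image.shape
    gr, gc = grid

    # Downsample: mean brightness within each grid cell.
    levels = np.zeros(grid)
    for i in range(gr):
        for j in range(gc):
            cell = image[i * h // gr:(i + 1) * h // gr,
                         j * w // gc:(j + 1) * w // gc]
            levels[i, j] = cell.mean()

    # Quantize each phosphene to one of `gray_levels` brightness steps.
    levels = np.round(levels * (gray_levels - 1)) / (gray_levels - 1)

    # Render each phosphene as a round dot on a dark background.
    out = np.zeros_like(image)
    cell_h, cell_w = h / gr, w / gc
    radius = dot_ratio * min(cell_h, cell_w) / 2
    yy, xx = np.mgrid[0:h, 0:w]
    for i in range(gr):
        for j in range(gc):
            cy, cx = (i + 0.5) * cell_h, (j + 0.5) * cell_w
            mask = (yy - cy) ** 2 + (xx - cx) ** 2 <= radius ** 2
            out[mask] = levels[i, j]
    return out
```

Varying `grid`, `gray_levels`, and `dot_ratio` in such a sketch corresponds to the phosphene count, gray-scale, and dot-size/spacing manipulations examined in the studies reviewed here.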


Keywords: Visual acuity · Lateral geniculate nucleus · Reading accuracy · Visual prosthesis · Landolt ring



Abbreviations

′ Symbol for minutes of arc

DBS Deep brain stimulation

HMD Head-mounted display

LGN Lateral geniculate nucleus of the thalamus

logMAR Logarithm of the minimum angle of resolution

MPDA Multi-photodiode array


References

  1. Boyle JR, Maeder AJ, Boles WW (2008), Region-of-interest processing for electronic visual prostheses. J Electron Imaging, 17(1): p. 013002.
  2. Cai S, Fu L, Zhang H, et al. (2005), Prosthetic visual acuity in irregular phosphene arrays under two down-sampling schemes: a simulation study. Conf Proc IEEE Eng Med Biol Soc, 5: p. 5223–6.
  3. Cha K, Horch K, Normann RA (1992), Simulation of a phosphene-based visual field: visual acuity in a pixelized system. Ann Biomed Eng, 20: p. 439–49.
  4. Cha K, Horch KW, Normann RA (1992), Mobility performance with a pixelized vision system. Vision Res, 32(7): p. 1367–72.
  5. Cha K, Horch KW, Normann RA, Boman DK (1992), Reading speed with a pixelized vision system. J Opt Soc Am A, 9(5): p. 673–7.
  6. Chen SC, Hallum LE, Lovell NH, Suaning GJ (2005), Learning prosthetic vision: a virtual-reality study. IEEE Trans Neural Syst Rehabil Eng, 13(3): p. 249–55.
  7. Chen SC, Hallum LE, Lovell NH, Suaning GJ (2005), Visual acuity measurement of prosthetic vision: a virtual-reality simulation study. J Neural Eng, 2(1): p. S135–45.
  8. Chen SC, Hallum LE, Suaning GJ, Lovell NH (2006), Psychophysics of prosthetic vision: I. Visual scanning and visual acuity. Conf Proc IEEE Eng Med Biol Soc, 1: p. 4400–3.
  9. Chen SC, Hallum LE, Suaning GJ, Lovell NH (2007), A quantitative analysis of head movement behaviour during visual acuity assessment under prosthetic vision simulation. J Neural Eng, 4(1): p. S108–23.
  10. Chen SC, Lovell NH, Suaning GJ (2004), Effect on prosthetic vision visual acuity by filtering schemes, filter cut-off frequency and phosphene matrix: a virtual reality simulation. Conf Proc IEEE Eng Med Biol Soc, 6: p. 4201–4.
  11. Dagnelie G (2008), Psychophysical evaluation for visual prosthesis. Annu Rev Biomed Eng, 10: p. 339–68.
  12. Dagnelie G, Barnett D, Humayun MS, Thompson RW Jr. (2006), Paragraph text reading using a pixelized prosthetic vision simulator: parameter dependence and task learning in free-viewing conditions. Invest Ophthalmol Vis Sci, 47(3): p. 1241–50.
  13. Dagnelie G, Keane P, Narla V, et al. (2007), Real and virtual mobility performance in simulated prosthetic vision. J Neural Eng, 4(1): p. S92–101.
  14. Dagnelie G, Thompson RW, Barnett D, Zhang W (2001), Simulated prosthetic vision: perceptual and performance measures. In: Vision Science and its Applications, OSA Technical Digest. Washington, DC: Optical Society of America: p. 43–6.
  15. Dagnelie G, Walter M, Yang L (2006), Playing checkers: detection and eye–hand coordination in simulated prosthetic vision. J Mod Opt, 53: p. 1325–42.
  16. Fornos AP, Sommerhalder J, Rappaz B, et al. (2006), Processes involved in oculomotor adaptation to eccentric reading. Invest Ophthalmol Vis Sci, 47(4): p. 1439–47.
  17. Fornos AP, Sommerhalder J, Rappaz B, et al. (2005), Simulation of artificial vision, III: do the spatial or temporal characteristics of stimulus pixelization really matter? Invest Ophthalmol Vis Sci, 46(10): p. 3906–12.
  18. Hallum LE, Cloherty SL, Lovell NH (2008), Image analysis for microelectronic retinal prosthesis. IEEE Trans Biomed Eng, 55(1): p. 344–6.
  19. Hallum LE, Suaning GJ, Lovell NH (2004), Contribution to the theory of prosthetic vision. ASAIO J, 50(4): p. 392–6.
  20. Hallum LE, Suaning GJ, Taubman DS, Lovell NH (2005), Simulated prosthetic visual fixation, saccade, and smooth pursuit. Vision Res, 45(6): p. 775–88.
  21. Hayes JS, Yin VT, Piyathaisere D, et al. (2003), Visually guided performance of simple tasks using simulated prosthetic vision. Artif Organs, 27(11): p. 1016–28.
  22. Humayun MS, Dorn JD, Ahuja AK, et al. (2009), Preliminary 6 month results from the Argus™ II epiretinal prosthesis feasibility study. Conf Proc IEEE Eng Med Biol Soc, 1: p. 4566–8.
  23. Kelly AJ, Yang L, Dagnelie G (2004), The effects of stabilization, font scaling and practice on reading in simulated prosthetic vision. Invest Ophthalmol Vis Sci, 45: ARVO E-abstr. 5436.
  24. Perez Fornos A, Sommerhalder J, Pittard A, et al. (2008), Simulation of artificial vision: IV. Visual information required to achieve simple pointing and manipulation tasks. Vision Res, 48(16): p. 1705–18.
  25. Pezaris JS, Reid RC (2009), Simulations of electrode placement for a thalamic visual prosthesis. IEEE Trans Biomed Eng, 56(1): p. 172–8.
  26. Sommerhalder J, Oueghlani E, Bagnoud M, et al. (2003), Simulation of artificial vision: I. Eccentric reading of isolated words, and perceptual learning. Vision Res, 43(3): p. 269–83.
  27. Sommerhalder J, Rappaz B, de Haller R, et al. (2004), Simulation of artificial vision: II. Eccentric reading of full-page text and the learning of this task. Vision Res, 44(14): p. 1693–706.
  28. Sommerhalder JR, Fornos AP, Chanderli K, et al. (2006), Minimum requirements for mobility in unpredictable environments. Invest Ophthalmol Vis Sci, 47: ARVO E-abstr. 3204.
  29. Srivastava NR, Troyk PR, Dagnelie G (2009), Detection, eye–hand coordination and virtual mobility performance in simulated vision for a cortical visual prosthesis device. J Neural Eng, 6(3): p. 035008.
  30. Thompson RW Jr., Barnett GD, Humayun MS, Dagnelie G (2003), Facial recognition using simulated prosthetic pixelized vision. Invest Ophthalmol Vis Sci, 44(11): p. 5035–42.
  31. Wang L, Yang L, Dagnelie G (2008), Initiation and stability of pursuit eye movements in simulated retinal prosthesis at different implant locations. Invest Ophthalmol Vis Sci, 49(9): p. 3933–9.
  32. Wang L, Yang L, Dagnelie G (2008), Virtual wayfinding using simulated prosthetic vision in gaze-locked viewing. Optom Vis Sci, 85(11): p. E1057–63.
  33. Zhao Y, Lu Y, Tian Y, et al. (2010), Image processing based recognition of images with a limited number of pixels using simulated prosthetic vision. Inf Sci, 180: p. 2915–24.
  34. Zhao Y, Tian Y, Liu H, et al. (2008), Pixelized images recognition in simulated prosthetic vision. IFMBE Proc, 19: p. 492–6.

Copyright information

© Springer Science+Business Media, LLC 2011

Authors and Affiliations

  1. Lions Vision Research and Rehabilitation Center, Wilmer Eye Institute, Johns Hopkins University School of Medicine, Baltimore, USA
