
Detection of Gaze Direction for Human–Computer Interaction

  • G. Merlin Sheeba
  • Abitha Memala
Conference paper
Part of the Lecture Notes in Computational Vision and Biomechanics book series (LNCVB, volume 30)

Abstract

Eye Guide is an assistive communication tool intended for disabled or physically impaired people who cannot move parts of their body, especially those whose communication is limited to eye movements. The prototype consists of a camera and a computer. The system recognizes gaze in four directions and performs the corresponding user actions, so the detected eye direction can be used to control applications. The facial regions in the images are extracted using a skin-color model and connected-component analysis; once the eye regions are detected, tracking is performed. The system comprises image-processing, face-detection, face-tracking, and eyeblink-detection modules. The Eye Guide system can potentially serve as a computer input control device for people with severe paralysis.
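As a rough illustration of the pipeline the abstract describes, the sketch below segments skin-colored pixels, keeps the largest connected component as the candidate face region, and classifies gaze into one of four directions from the pupil's offset within an eye box. This is a minimal sketch using OpenCV, not the authors' implementation: the YCrCb skin thresholds, the function names, and the direction rule are all illustrative assumptions.

```python
# Minimal sketch (illustrative, not the paper's code): skin-color
# segmentation + connected-component analysis for face localization,
# and a simple four-direction gaze rule. Thresholds are assumptions.
import cv2
import numpy as np

def largest_skin_region(frame_bgr):
    """Return (x, y, w, h) of the largest skin-colored blob, or None."""
    # Skin-color model: threshold in YCrCb space (a common choice;
    # the abstract does not specify the exact color model used).
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    lower = np.array([0, 133, 77], dtype=np.uint8)
    upper = np.array([255, 173, 127], dtype=np.uint8)
    mask = cv2.inRange(ycrcb, lower, upper)

    # Connected-component analysis: keep the largest foreground blob.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
    if n < 2:
        return None  # no skin-colored region found
    # Row 0 of stats is the background; pick the largest component.
    idx = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    x, y, w, h = stats[idx, :4]
    return int(x), int(y), int(w), int(h)

def gaze_direction(pupil_xy, eye_box):
    """Classify gaze as 'left', 'right', 'up', or 'down' from the
    pupil's offset relative to the eye-region center (toy rule)."""
    x, y, w, h = eye_box
    dx = (pupil_xy[0] - (x + w / 2.0)) / w
    dy = (pupil_xy[1] - (y + h / 2.0)) / h
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

In a full system of the kind described, the face box from `largest_skin_region` would bound the search for eye regions, which are then tracked frame to frame, with eyeblink detection providing the selection ("click") command.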

Keywords

Image processing · Face detector · Face tracker · Eyeblink detection

Notes

Ethical Compliance Comments

Figure 4 is a facial image taken from the UCI repository dataset, used as an example to indicate the focal and seat points on the face.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Sathyabama Institute of Science and Technology, Chennai, India
