
Development of a System for Controlling IoT Devices Using Gaze Tracking

  • Conference paper
  • Intelligent Sustainable Systems (ICoISS 2023)

Abstract

In this work, a gaze-tracking prototype for controlling IoT devices was developed. The prototype is a web application in which the user interacts with the interface through gaze and, through this interaction, sends commands to an IoT device. The system was built following the iterative and incremental development methodology, using a standard webcam and the open-source WebGazer library; these tools were selected based on a state-of-the-art review. The prototype also uses an MQTT broker (Beebotte) to establish the connection between the web application and the IoT devices. The web application includes a user registration function that gives each user a dedicated Beebotte channel, so that each user controls only his or her own devices. Finally, the prototype was tested with a light bulb and a TV. The commands sent to the light bulb were ON and OFF, while the commands sent to the TV were ON, OFF, channel up, channel down, volume up, and volume down. Based on the performed tests, the prototype achieved 98.7% precision, 97% accuracy, and an icon selection time of 3.14 s.
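As a rough illustration of the architecture described above (browser-side gaze tracking feeding commands to an MQTT broker), the sketch below combines WebGazer's setGazeListener/begin API with MQTT.js in a browser context. It is not the authors' implementation: the Beebotte WebSocket endpoint and token-auth scheme, the channel/resource names ("smarthome"/"bulb"), the icon coordinates, and the dwell-based selection logic are assumptions introduced here purely for illustration.

```typescript
// Minimal sketch, not the paper's actual code. Assumptions: WebGazer is loaded
// globally via a <script> tag, MQTT.js is installed, and a hypothetical Beebotte
// channel "smarthome" with resource "bulb" and token "YOUR_CHANNEL_TOKEN" exists.
import mqtt from "mqtt";

declare const webgazer: any; // WebGazer attaches itself to the global scope

// Beebotte accepts MQTT over secure WebSockets; token-based auth is assumed here.
const client = mqtt.connect("wss://mqtt.beebotte.com:443", {
  username: "token:YOUR_CHANNEL_TOKEN", // placeholder, not a real token
  password: "",
});

// Screen region occupied by the "ON" icon (hypothetical coordinates, in pixels).
const onIcon = { x: 100, y: 100, width: 200, height: 200 };
const DWELL_MS = 3000; // dwell threshold, in the same range as the ~3 s selection time
let dwellStart: number | null = null;

webgazer
  .setGazeListener((gaze: { x: number; y: number } | null) => {
    if (!gaze) return; // no gaze prediction for this frame
    const inside =
      gaze.x >= onIcon.x && gaze.x <= onIcon.x + onIcon.width &&
      gaze.y >= onIcon.y && gaze.y <= onIcon.y + onIcon.height;
    if (!inside) {
      dwellStart = null; // gaze left the icon, reset the dwell timer
      return;
    }
    dwellStart = dwellStart ?? Date.now();
    if (Date.now() - dwellStart >= DWELL_MS) {
      // Publish the command as a JSON payload with a "data" field (Beebotte convention).
      client.publish("smarthome/bulb", JSON.stringify({ data: "ON" }));
      dwellStart = null;
    }
  })
  .begin(); // starts webcam capture and gaze prediction
```

The dwell-based selection above is only one plausible reading of the reported 3.14 s icon selection time; the prototype could equally trigger commands with on-screen confirmation buttons or another selection mechanism.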




Author information


Corresponding author

Correspondence to Sang Guun Yoo.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Erazo, M.C., Cocha Tobanda, E., Yoo, S.G. (2023). Development of a System for Controlling IoT Devices Using Gaze Tracking. In: Raj, J.S., Perikos, I., Balas, V.E. (eds) Intelligent Sustainable Systems. ICoISS 2023. Lecture Notes in Networks and Systems, vol 665. Springer, Singapore. https://doi.org/10.1007/978-981-99-1726-6_12

