
A blueprint for integrated eye-controlled environments

  • Long Paper
  • Published in: Universal Access in the Information Society

Abstract

Eye-based environmental control requires innovative solutions for supporting effective user interaction, enabling home automation and control, and making homes more “attentive” to user needs. Several approaches have been proposed, but they are largely isolated attempts addressing partial issues and specific sub-sets of the general problem. This paper tackles gaze-based home automation as a whole, exploiting state-of-the-art technologies and integrating interaction modalities that are currently supported as well as those that may be supported in the near future. User–home interaction is achieved through two complementary interaction patterns: direct interaction and mediated interaction. Integration between home appliances and devices on one side, and user interfaces on the other, is provided by a central point of abstraction and harmonization called the House Manager. The innovative aspects of the approach lie in its flexibility: it can integrate virtually any home device that exposes a communication interface, and it combines direct and mediated user interaction, exploiting the advantages of both. A discussion of interaction and accessibility issues is also provided, justifying the presented approach from the point of view of human–environment interaction.
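
To make the described architecture concrete, the following minimal sketch (in Java, chosen only for illustration) shows how a central abstraction point in the spirit of the House Manager might expose heterogeneous devices behind a uniform command interface, so that both direct (gaze-on-appliance) and mediated (gaze-driven GUI) interaction paths issue the same abstract commands. All class and method names are hypothetical assumptions for this example, not the interfaces defined in the paper.

// Illustrative sketch only: a hypothetical "House Manager"-style abstraction layer.
// All names (Device, Command, HouseManager, ...) are assumptions for illustration.

import java.util.HashMap;
import java.util.Map;

/** A device-neutral command such as "on", "off" or "dim". */
record Command(String name) {}

/** Minimal contract that every controllable appliance adapter implements. */
interface Device {
    String getId();
    void execute(Command command);
}

/** Example adapter for a lamp reachable over some domotic bus (e.g. KNX, X10). */
class LampAdapter implements Device {
    private final String id;
    LampAdapter(String id) { this.id = id; }
    public String getId() { return id; }
    public void execute(Command command) {
        // A real adapter would translate the abstract command into the
        // protocol-specific frame of the underlying bus.
        System.out.println("Lamp " + id + " <- " + command.name());
    }
}

/** Central point of abstraction: both interaction patterns converge here. */
class HouseManager {
    private final Map<String, Device> devices = new HashMap<>();

    void register(Device device) { devices.put(device.getId(), device); }

    /** Direct interaction: a gaze hit on a physical appliance selects it by id. */
    void onGazeSelection(String deviceId, Command command) { dispatch(deviceId, command); }

    /** Mediated interaction: a gaze-driven GUI issues the same abstract command. */
    void onGuiCommand(String deviceId, Command command) { dispatch(deviceId, command); }

    private void dispatch(String deviceId, Command command) {
        Device device = devices.get(deviceId);
        if (device != null) {
            device.execute(command);
        }
    }
}

/** Tiny usage example. */
class Demo {
    public static void main(String[] args) {
        HouseManager manager = new HouseManager();
        manager.register(new LampAdapter("livingRoomLamp"));
        manager.onGazeSelection("livingRoomLamp", new Command("on"));  // direct path
        manager.onGuiCommand("livingRoomLamp", new Command("off"));    // mediated path
    }
}

In a design along these lines, supporting a new bus technology only requires a new adapter implementing the device contract, while the gaze-based front-ends remain unchanged; this is the flexibility the abstract refers to.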

Author information

Corresponding author

Correspondence to F. Corno.

About this article

Cite this article

Bonino, D., Castellina, E., Corno, F. et al. A blueprint for integrated eye-controlled environments. Univ Access Inf Soc 8, 311–321 (2009). https://doi.org/10.1007/s10209-009-0145-4
