Building a Sensory Infrastructure to Support Interaction and Monitoring in Ambient Intelligence Environments

  • Emmanouil Zidianakis
  • Nikolaos Partarakis
  • Margherita Antona
  • Constantine Stephanidis
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8530)


In the context of Ambient Intelligence (AmI), the development of new interaction techniques has become a key enabler of more natural and intuitive interaction with everyday things [2]. Natural interaction between people and technology can be defined in terms of experience: people naturally communicate through gestures, expressions, and movements, and should therefore be able to interact with technology the way they are accustomed to interacting with the real world in everyday life [19]. Additionally, AmI systems must be sensitive, responsive, and adaptive to the presence of people [16]. This paper presents the design and implementation of an interaction framework for Ambient Intelligence that provides novel interaction metaphors and techniques in the context of AmI scenarios. The infrastructure has been deployed in vitro within the AmI classroom simulation space of the FORTH-ICS AmI research facility, where it extends the applications offered by Beantable, an augmented interactive table for young children, to support games that exploit biometric information, rich interaction metaphors, and speech input [20].
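Among the techniques listed in the paper's keywords is Dynamic Time Warping (DTW), the classic template-matching method for recognizing gestures as time series [17]. The paper itself contains no code; the following Python sketch is purely illustrative of the DTW technique, not the authors' implementation, and the function names and gesture templates are hypothetical:

```python
def dtw_distance(a, b):
    """Dynamic Time Warping distance between two 1-D sequences,
    following the dynamic-programming formulation of Sakoe and Chiba [17]."""
    n, m = len(a), len(b)
    INF = float("inf")
    # cost[i][j] = minimal cumulative cost of aligning a[:i] with b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])  # local distance between samples
            cost[i][j] = d + min(cost[i - 1][j],      # stretch a
                                 cost[i][j - 1],      # stretch b
                                 cost[i - 1][j - 1])  # advance both
    return cost[n][m]

def classify_gesture(sample, templates):
    """Nearest-template classification: return the name of the stored
    gesture template with the smallest DTW distance to the sample."""
    return min(templates, key=lambda name: dtw_distance(sample, templates[name]))
```

Because DTW aligns sequences non-linearly in time, the same gesture performed faster or slower by a child still matches its stored template, which is why it is a common choice for gesture recognition in such settings.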


Keywords: Gesture Recognition · Dynamic Time Warping · Ambient Intelligence · Face Tracking · Biometric Information




  1. Aarts, E.H., Marzano, S. (eds.): The New Everyday: Views on Ambient Intelligence. 010 Publishers (2003)
  2. Aarts, E., Encarnacao, J.L.: True Visions: The Emergence of Ambient Intelligence. Springer (2008), ISBN 978-3-540-28972-2
  3. Alcañiz, M., Rey, B.: New technologies for ambient intelligence. In: The Evolution of Technology, Communication and Cognition Towards the Future of Human-Computer Interaction, pp. 3–15 (2005)
  4. Georgalis, Y., Grammenos, D., Stephanidis, C.: Middleware for Ambient Intelligence Environments: Reviewing Requirements and Communication Technologies. In: Stephanidis, C. (ed.) UAHCI 2009, Part II. LNCS, vol. 5615, pp. 168–177. Springer, Heidelberg (2009)
  5. ISTAG: Ambient Intelligence: From Vision to Reality. In: Riva, G., Vatalaro, F., Davide, F., Alcañiz, M. (eds.) Ambient Intelligence. IOS Press (2005)
  6. Kaltenbrunner, M., Bencina, R.: reacTIVision: A computer-vision framework for table-based tangible interaction. In: Proceedings of the 1st International Conference on Tangible and Embedded Interaction. ACM (2007)
  7. Kameas, A., Mavrommati, I., Markopoulos, P.: Computing in Tangible: Using Artifacts as Components of Ambient Intelligence Environments. IOS Press (2005)
  8. Kim, H., Fellner, D.W.: Interaction with hand gesture for a back-projection wall. In: Proceedings of Computer Graphics International, pp. 395–402 (June 2004)
  9. Nakanishi, Y., Oka, K., Kuramochi, M., Matsukawa, S., Sato, Y., Koike, H.: Narrative Hand: Applying a fast finger-tracking system for media art
  10. Ohno, T., Mukawa, N., Kawato, S.: Just blink your eyes: A head-free gaze tracking system. In: CHI 2003 Extended Abstracts on Human Factors in Computing Systems, pp. 950–957. ACM (2003)
  11. Ohno, T., Mukawa, N.: A Free-head, Simple Calibration, Gaze Tracking System That Enables Gaze-Based Interaction
  12. Ohno, T., Mukawa, N., Yoshikawa, A.: FreeGaze: A gaze tracking system for everyday gaze interaction. In: Proceedings of the 2002 Symposium on Eye Tracking Research & Applications, pp. 125–132. ACM (March 2002)
  13. Ohya, J.: Computer Vision Based Analysis of Non-verbal Information in HCI
  14. Oka, K., Sato, Y., Koike, H.: Real-time tracking of multiple fingertips and gesture recognition for augmented desk interface systems. In: Proceedings of the Fifth IEEE International Conference on Automatic Face and Gesture Recognition, pp. 429–434. IEEE (May 2002)
  15.
  16. Philips Research: Ambient Intelligence: Changing Lives for the Better (2007)
  17. Sakoe, H., Chiba, S.: Dynamic programming algorithm optimization for spoken word recognition. IEEE Transactions on Acoustics, Speech, and Signal Processing 26(1), 43–49 (1978)
  18. Stephanidis, C.: Human Factors in Ambient Intelligence Environments. In: Salvendy, G. (ed.) Handbook of Human Factors and Ergonomics, 4th edn., ch. 49, pp. 1354–1373. John Wiley and Sons, USA (2012)
  19. Szeliski, R.: Image alignment and stitching: A tutorial. Foundations and Trends in Computer Graphics and Vision. Now Publishers Inc. (2006)
  20. Valli, A.: The design of natural interaction. Multimedia Tools and Applications 38(3), 295–305 (2008)
  21. Zidianakis, E., Antona, M., Paparoulis, G., Stephanidis, C.: An augmented interactive table supporting preschool children development through playing

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Emmanouil Zidianakis (1)
  • Nikolaos Partarakis (1)
  • Margherita Antona (1)
  • Constantine Stephanidis (1, 2)
  1. Institute of Computer Science, Foundation for Research and Technology – Hellas (FORTH), Heraklion, Greece
  2. Department of Computer Science, University of Crete, Greece
