Context-Aware Perception for Cyber-Physical Systems

Chapter
Part of the Studies in Computational Intelligence book series (SCI, volume 540)

Abstract

Being aware of the context is one of the important requirements of Cyber-Physical Systems (CPS). Context-aware systems can sense what is happening or changing in their environment and take appropriate actions to adapt to those changes. In this chapter, we present a technique for identifying the focus of attention in a context-aware cyber-physical system. We propose to use first-person vision, obtained through a wearable gaze-directed camera that captures the scene from the wearer's point of view. Our technique exploits the fact that human cognition is linked to gaze: the object or person of interest typically holds our gaze. We argue that the technique is robust and works well in the presence of noise and other distracting signals, where conventional techniques based on IR sensors and tagging fail. Moreover, it is unobtrusive and does not pollute the environment with unnecessary signals. Our approach is general in that it can be applied to a wide range of CPS scenarios, including healthcare, office and industrial settings, and intelligent homes.
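
To make the idea concrete, the following Python sketch shows one way such a pipeline could be assembled: given an egocentric video frame and the wearer's current gaze point, it crops a region of interest around the gaze and matches it against stored object templates with OpenCV. This is an illustrative reconstruction, not the chapter's implementation; the availability of per-frame gaze coordinates from the eye tracker, the 80-pixel window, the 0.7 match threshold, and the file names in the usage comments are all assumptions.

    # Illustrative sketch of gaze-directed focus-of-attention detection.
    # Assumes the wearable eye tracker supplies a gaze point (x, y) in
    # frame coordinates for each egocentric video frame.
    import cv2

    def focus_of_attention(frame, gaze_xy, templates, window=80, threshold=0.7):
        """Return the label of the template best matching the region
        around the gaze point, or None if no match exceeds the threshold."""
        x, y = gaze_xy
        h, w = frame.shape[:2]
        # Clamp the gaze-centred crop to the frame boundaries.
        x0, y0 = max(0, x - window), max(0, y - window)
        x1, y1 = min(w, x + window), min(h, y + window)
        roi = frame[y0:y1, x0:x1]

        best_label, best_score = None, threshold
        for label, tmpl in templates.items():
            if tmpl.shape[0] > roi.shape[0] or tmpl.shape[1] > roi.shape[1]:
                continue  # template larger than the crop; cannot match
            result = cv2.matchTemplate(roi, tmpl, cv2.TM_CCOEFF_NORMED)
            _, score, _, _ = cv2.minMaxLoc(result)  # score = best correlation
            if score > best_score:
                best_label, best_score = label, score
        return best_label

    # Usage (hypothetical file names):
    # frame = cv2.imread("egocentric_frame.png")
    # templates = {"syringe": cv2.imread("syringe_template.png")}
    # print(focus_of_attention(frame, gaze_xy=(320, 240), templates=templates))

Normalized cross-correlation template matching is only one plausible recognizer for the gaze region; feature-based matching (e.g., SIFT keypoints) would be a natural alternative when the attended object varies in scale or viewpoint.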

Keywords

First-person vision · Gaze-directed vision · Context-aware perception · Cognition-action linkage · Video-based object recognition

Copyright information

© Springer Science+Business Media Singapore 2014

Authors and Affiliations

  1. Department of Computer Engineering, Sir Syed University of Engineering and Technology, Karachi, Pakistan
  2. Project Coordinator and Academic Supervisor, German Academic Exchange Service (DAAD), University of Bremen (UB), Bremen, Germany
