
PASU: A personal area situation understanding system using wireless camera sensor networks

  • Original Article
  • Published in Personal and Ubiquitous Computing

Abstract

In this paper, we present a personal area situation understanding (PASU) system, a novel application of a smart device combined with a wireless camera sensor network. The portability of the PASU system makes it an attractive solution for monitoring and understanding the current situation of the personal area around a user. The system allows its user to construct a 3D scene of the environment and to view that scene from various vantage points for a better understanding of the surroundings. The paper describes the architecture and implementation of the PASU system, addressing limitations of wireless camera sensor networks such as low bandwidth and limited computational capability. The capabilities of PASU are validated through extensive experiments. The PASU system demonstrates the potential of a portable system that combines a smart device and a wireless camera sensor network for personal area monitoring and situation understanding.
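The published article does not include source code. As a purely illustrative sketch of the kind of two-view 3D reconstruction step that a camera-network system like PASU might build on (feature matching, RANSAC-based essential-matrix estimation, and triangulation), the snippet below uses Python and OpenCV. The image names, calibration matrix, and all details shown are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch (not the authors' code): recover sparse 3D structure
# from two images taken by different camera nodes in a wireless camera network.
import cv2
import numpy as np

# Assumed inputs: two node images and a shared intrinsic matrix K (illustrative values).
img1 = cv2.imread("node1.jpg", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("node2.jpg", cv2.IMREAD_GRAYSCALE)
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])

# 1. Detect and match local features (ORB here; SURF/SIFT could be substituted).
orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# 2. Estimate the essential matrix with RANSAC to reject outlier matches,
#    then recover the relative camera pose (rotation R, translation t).
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                               prob=0.999, threshold=1.0)
_, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

# 3. Triangulate the inlier correspondences into 3D points.
inliers = mask.ravel().astype(bool)
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])  # first camera at the origin
P2 = K @ np.hstack([R, t])                          # second camera pose
pts4d = cv2.triangulatePoints(P1, P2, pts1[inliers].T, pts2[inliers].T)
pts3d = (pts4d[:3] / pts4d[3]).T                    # homogeneous -> Euclidean

print(f"Reconstructed {len(pts3d)} sparse 3D points")
```

In a deployed system such as the one the abstract describes, a step like this would have to respect the bandwidth and computation limits of the camera nodes, for example by transmitting compact features rather than raw images.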




Notes

  1. The word pasu means to watch or to guard in Korean.


Acknowledgments

This work was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF), funded by the Ministry of Education, Science and Technology (No. 2010-0027155).

Author information


Corresponding author

Correspondence to Songhwai Oh.


About this article

Cite this article

Yoon, S., Oh, H., Lee, D. et al. PASU: A personal area situation understanding system using wireless camera sensor networks. Pers Ubiquit Comput 17, 713–727 (2013). https://doi.org/10.1007/s00779-012-0611-5
