
Machine Vision and Applications, Volume 21, Issue 4, pp 517–528

Automatic resource allocation in a distributed camera network

  • Deepak R. Karuppiah
  • Roderic A. Grupen
  • Zhigang Zhu
  • Allen R. Hanson
Original Paper

Abstract

In this paper, we present a hierarchical smart resource coordination and reconfiguration framework for distributed systems. We view the coordination problem as one of context-aware resource reconfiguration. The fundamental unit in this hierarchy is a Fault Containment Unit (FCU), which provides run-time fault tolerance by deciding on the best alternative course of action when a failure occurs. FCUs are composed hierarchically and are responsible for dynamically reconfiguring failing FCUs at lower levels. When such a reconfiguration is not possible, FCUs propagate the failure upward for resolution. We evaluate the effectiveness of our framework in a people-tracking application using a network of cameras. The task for our multi-camera network is to allocate pairs of cameras that localize a subject optimally given the current run-time context. The system automatically derives policies for switching between camera pairs that enable robust tracking while remaining attentive to certain performance measures. Our approach is unique in that we model the dynamics of both the scene and the camera network configuration, and this model steers the policies toward robust tracking.
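
To make the FCU mechanism described above concrete, the sketch below is a minimal, hypothetical Python illustration (not taken from the paper) of how a Fault Containment Unit might fall back across alternative camera pairs and propagate an unrecoverable failure to a higher-level unit. All class, function, and parameter names are assumptions introduced here for illustration.

```python
# Hypothetical sketch of the Fault Containment Unit (FCU) idea from the
# abstract: each unit ranks alternative ways of achieving its task (e.g.
# different camera pairs), falls back to the next alternative when one fails
# at run time, and propagates the failure to its parent FCU when every local
# alternative is exhausted. Names are illustrative, not the authors' code.

class FCUFailure(Exception):
    """Raised when an FCU exhausts all of its alternatives."""


class FaultContainmentUnit:
    def __init__(self, name, alternatives, parent=None):
        self.name = name                  # e.g. "localize-subject"
        self.alternatives = alternatives  # callables ordered by expected utility
        self.parent = parent              # higher-level FCU, if any

    def run(self, context):
        """Try each alternative under the current run-time context."""
        for alternative in self.alternatives:
            try:
                return alternative(context)   # e.g. triangulate with one camera pair
            except FCUFailure:
                continue                      # reconfigure: try the next alternative
        # Local reconfiguration is not possible: escalate to the parent FCU.
        if self.parent is not None:
            return self.parent.run(context)
        raise FCUFailure(f"{self.name}: no viable configuration")


def localize_with_pair(pair_id):
    """Illustrative alternative: succeed only if this camera pair sees the subject."""
    def attempt(context):
        if pair_id in context.get("occluded_pairs", set()):
            raise FCUFailure(f"pair {pair_id} cannot see the subject")
        return f"position estimate from pair {pair_id}"
    return attempt


# Usage: a tracking FCU with two camera-pair alternatives; the first is occluded
# in the current context, so the unit transparently switches to the second.
tracker = FaultContainmentUnit(
    "track-person",
    [localize_with_pair("cam1-cam2"), localize_with_pair("cam2-cam3")],
)
print(tracker.run({"occluded_pairs": {"cam1-cam2"}}))
```

In this reading, "reconfiguration" amounts to trying the next-best alternative for the current context, and "propagating upward" means letting a higher-level FCU re-plan when no local alternative remains.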

Keywords

Distributed sensor networks · Multi-camera tracking · Context-aware control

Copyright information

© Springer-Verlag 2008

Authors and Affiliations

  • Deepak R. Karuppiah¹
  • Roderic A. Grupen¹
  • Zhigang Zhu²
  • Allen R. Hanson¹
  1. Department of Computer Science, University of Massachusetts, Amherst, USA
  2. Department of Computer Science, City College of New York, New York, USA
