Real Time Tracking and Monitoring of Human Behavior in an Indoor Environment

  • Chapter in Sensors

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 21)

Abstract

This chapter reports the development of a real-time 3D sensor system built on a new concept: decomposing the operational space by encoding it with a limited number of laser spots. The approach retains the richness and strength of vision while reducing the data load and computational cost. The chapter presents the design and implementation of an intelligent 3D Fiber Grating (FG) based vision system that can monitor and track a person's status in real time, supporting a wide range of monitoring applications. The 3D visual sensor measures three-dimensional information about people, objects, and the surrounding environment. It consists of a CCD camera; a laser spot array generator comprising a laser diode and driver, a lens, fiber gratings, and a holder; and a processing unit with alarm facilities, interfacing capabilities to a higher-level controller and decision-making layer, and a user-friendly interface.

The system works by projecting a two-dimensional matrix of laser spots, generated by two perpendicularly overlaid layers of FGs, onto the scene within the active view of the CCD camera; the spots reflected from the scene are essential for detecting and tracking targets. The position of the laser spot generator can be adjusted or translated with respect to the CCD camera to obtain better, more balanced resolution, and the FG can be rotated to produce a spot pattern that eases processing and enhances accuracy. Furthermore, multiple laser spot generators can be configured with a single CCD camera to widen the sensor's operational coverage. The chapter introduces and illustrates the structure of the developed sensor system, its operational principles, performance analysis, and experimental results.
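The ranging idea behind the spot matrix can be sketched as standard structured-light triangulation: each reflected spot's displacement from its reference position (recorded for the empty scene) encodes the height of the surface that intercepted it, so the scene is decomposed into one range sample per spot. The sketch below is illustrative only — the function and parameter names are assumptions, not the chapter's implementation, and the formula is the common FG-sensor triangulation relation under a pinhole-camera model.

```python
def spot_height(delta, L, f, d):
    """Height of the surface that intercepted one laser spot.

    Common structured-light triangulation relation (pinhole model):
      delta : image-plane displacement of the spot from its
              empty-scene reference position (same unit as f)
      L     : distance from the sensor to the reference plane (e.g. floor)
      f     : camera focal length
      d     : baseline between the spot generator and the CCD camera
    """
    return (L * L * delta) / (f * d + L * delta)


def range_map(deltas, L, f, d):
    """One height sample per projected spot: the operational space is
    decomposed into as many range measurements as there are spots."""
    return [[spot_height(dl, L, f, d) for dl in row] for row in deltas]


# Example: 2 m reference distance, 8 mm focal length, 10 cm baseline.
shifts = [[0.0, 0.0001],
          [0.0002, 0.0]]          # spot shifts in metres on the image plane
heights = range_map(shifts, L=2.0, f=0.008, d=0.1)
```

Because processing reduces to locating and matching a few hundred bright spots rather than analysing every pixel, the computational cost stays bounded by the spot count — which is the data-load reduction the abstract refers to.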



Copyright information

© 2008 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Habib, M.K. (2008). Real Time Tracking and Monitoring of Human Behavior in an Indoor Environment. In: Mukhopadhyay, S., Huang, R. (eds) Sensors. Lecture Notes in Electrical Engineering, vol 21. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-69033-7_12

  • DOI: https://doi.org/10.1007/978-3-540-69033-7_12

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-69030-6

  • Online ISBN: 978-3-540-69033-7

  • eBook Packages: Engineering (R0)
