
Real-time smart lighting control using human motion tracking from depth camera

Journal of Real-Time Image Processing

Abstract

A smart lighting system automatically controls illumination level and color temperature, improving quality of life and saving energy in smart cities. Because comfortable lighting conditions depend on what a person is doing, smart lighting control requires real-time activity understanding with accurate human location estimation under varying illumination. This paper presents a real-time smart lighting control system that estimates human location by inverse-perspective mapping of depth-map images and infers activity from the location, heading direction, and height of the moving person, observed by multiple depth cameras. Lighting is controlled according to the person's estimated proximity to specific activity areas, distance to the target lighting area, and heading direction, providing an activity-dependent lighting environment as well as energy savings. We implemented several activity modes, such as study mode, dialog mode, and TV-watching mode, and applied the proposed lighting control system to a living room with known furniture, electronics, and lighting locations, using multiple Kinect depth cameras. The proposed model performs localized, proximity-based lighting control and can be extended to more general lighting control by combining it with global lighting control schemes.
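The location-estimation step described above rests on inverse-perspective mapping of depth images onto the floor plane. As a rough illustration only (the paper's calibration parameters and exact formulation are not given in this excerpt), a depth pixel can be back-projected through a pinhole camera model and rotated by the camera tilt to obtain ground-plane coordinates and the person's height above the floor. The function name, the intrinsics, and the mounting height below are all assumed values, not the authors':

```python
import numpy as np

# Hypothetical Kinect-like intrinsics and mounting pose (assumed, not from the paper).
FX, FY = 365.0, 365.0        # focal lengths in pixels
CX, CY = 256.0, 212.0        # principal point
CAM_HEIGHT = 2.5             # camera mounted 2.5 m above the floor

def depth_pixel_to_floor(u, v, depth_m, tilt_rad=0.0):
    """Back-project a depth pixel to a camera-frame 3D point, rotate by the
    camera tilt, and project onto the floor plane (inverse-perspective mapping).

    Returns (ground_x, ground_y) floor coordinates in meters and the
    point's height above the floor."""
    # Pinhole back-projection to camera coordinates.
    x = (u - CX) * depth_m / FX
    y = (v - CY) * depth_m / FY
    z = depth_m
    # Rotate about the camera's x-axis by the tilt angle to level the frame.
    c, s = np.cos(tilt_rad), np.sin(tilt_rad)
    y_level = c * y - s * z
    z_level = s * y + c * z
    # Floor-plane coordinates; height measured from the floor upward.
    ground_x, ground_y = x, z_level
    height = CAM_HEIGHT - y_level
    return ground_x, ground_y, height
```

Tracking the ground-plane coordinates of the detected person over successive frames then yields the heading direction, and the recovered height helps distinguish standing from sitting activities.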




Acknowledgments

This research was conducted under the industrial infrastructure program for fundamental technologies funded by the Ministry of Trade, Industry & Energy (MOTIE, Korea), and was supported by the Ministry of Education, Science and Technology (MEST) and the National Research Foundation of Korea (NRF) through the Human Resource Training Project for Regional Innovation.

Author information

Corresponding author

Correspondence to Chan-Su Lee.


About this article

Cite this article

Chun, S., Lee, CS. & Jang, JS. Real-time smart lighting control using human motion tracking from depth camera. J Real-Time Image Proc 10, 805–820 (2015). https://doi.org/10.1007/s11554-014-0414-1

