A Hybrid Motion Based-Object Tracking Algorithm for VTS

Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 309)


This paper presents a hybrid motion-path-estimation-based object tracking algorithm for a virtual touch screen (VTS) system that uses a projector, a camera, and LEDs as the cursor indicator. The position of the hand in the current frame is estimated from its positions in previous frames. A square region of interest (ROI) centred on the estimated position is then searched for the exact location of the hand indicator. We also devised a modified full search (FS) algorithm applied to the entire frame. The efficiency of the algorithm is evaluated by the hand-indicator detection rate with respect to frame rate and the distance between the camera and the projection surface. Performance comparisons with other algorithms, which apply FS to the entire frame or to a fixed-size ROI, show that the proposed algorithm significantly improves real-time performance by increasing accuracy and reducing computational time.
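The pipeline the abstract describes (predict the hand position from previous frames, search a square ROI around the prediction, and fall back to a full-frame search) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names are invented, the motion model is a simple constant-velocity (direction-vector) extrapolation, and the LED indicator is assumed to be the brightest pixel in a grayscale frame.

```python
import numpy as np

def predict_position(history):
    """Extrapolate the next hand position from the last two tracked
    positions using a direction vector (constant-velocity assumption)."""
    (x1, y1), (x2, y2) = history[-2], history[-1]
    return (2 * x2 - x1, 2 * y2 - y1)

def search_roi(frame, center, half=16):
    """Find the brightest pixel (assumed to be the LED cursor indicator)
    inside a square ROI centred on the predicted position."""
    h, w = frame.shape
    cx, cy = center
    x0, x1 = max(cx - half, 0), min(cx + half + 1, w)
    y0, y1 = max(cy - half, 0), min(cy + half + 1, h)
    roi = frame[y0:y1, x0:x1]
    dy, dx = np.unravel_index(np.argmax(roi), roi.shape)
    return (x0 + dx, y0 + dy)

def full_search(frame):
    """Fallback: scan the entire frame, e.g. when the indicator has
    left the predicted region or tracking is (re)initialised."""
    dy, dx = np.unravel_index(np.argmax(frame), frame.shape)
    return (dx, dy)
```

A tracker would call `predict_position` each frame, attempt `search_roi` at the predicted point, and revert to `full_search` when the ROI result is rejected (for instance, when the ROI's maximum intensity falls below a detection threshold).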


Keywords: Direction vector · Full frame search · Motion estimation · Object tracking · Region of interest · Virtual touch screen



Copyright information

© Springer India 2015

Authors and Affiliations

  1. Department of Electronics and Communication Engineering, National Institute of Technology, Agartala, India
