Bare Hand Interface for Interaction in the Video See-Through HMD Based Wearable AR Environment

  • Taejin Ha
  • Woontack Woo
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4161)

Abstract

In this paper, we propose a natural and intuitive bare-hand interface for wearable augmented reality environments using a video see-through HMD. The proposed method automatically learns the color distribution of the hand through template matching and tracks the hand with the Mean-shift algorithm, even under a dynamic background and a moving camera. Furthermore, although users wear no gloves, the hand can be separated from the arm by applying a distance transform and using the radius of the palm. Fingertip points are then extracted by convex-hull processing, with candidates constrained relative to the palm area, so users do not need to attach fiducial markers to their fingertips. We also implemented several applications to demonstrate the usefulness of the proposed algorithm. For example, “AR-Memo” helps the user take memos in the real environment with a virtual pen augmented on the user’s finger, and the saved memo can later be viewed augmented on the user’s palm while moving around. Finally, we measured performance and conducted usability studies.
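The segmentation step described above (a distance transform to find the palm, with fingertip candidates constrained relative to the palm radius) can be sketched as follows. This is a simplified illustration, not the authors' implementation: the function and parameter names (`palm_and_fingertips`, `finger_factor`) are hypothetical, a brute-force distance transform stands in for an optimized routine (e.g. OpenCV's `cv2.distanceTransform`), and a simple distance threshold stands in for full convex-hull processing.

```python
import numpy as np

def palm_and_fingertips(mask, finger_factor=1.6):
    """Locate the palm and fingertip candidates in a binary hand mask.

    The palm centre is the foreground pixel deepest inside the mask
    (maximum of the distance transform), the palm radius is that depth,
    and fingertip candidates are foreground pixels lying well outside
    the palm circle.
    """
    hand = np.argwhere(mask).astype(float)   # (row, col) of hand pixels
    bg = np.argwhere(~mask).astype(float)    # (row, col) of background
    # Brute-force distance transform: for every hand pixel, the distance
    # to the nearest background pixel (fine for a small demo image).
    d = np.sqrt(((hand[:, None, :] - bg[None, :, :]) ** 2).sum(-1)).min(axis=1)
    centre = hand[np.argmax(d)]              # deepest interior point
    radius = d.max()                         # palm radius
    dist_from_centre = np.sqrt(((hand - centre) ** 2).sum(axis=1))
    tips = hand[dist_from_centre > finger_factor * radius]
    return centre, radius, tips

# Synthetic hand: a disc (palm) with a thin rectangle (finger) attached.
yy, xx = np.mgrid[0:40, 0:40]
palm = (yy - 25) ** 2 + (xx - 20) ** 2 <= 8 ** 2
finger = (yy >= 5) & (yy <= 18) & (np.abs(xx - 20) <= 1)
centre, radius, tips = palm_and_fingertips(palm | finger)
```

On this synthetic mask the recovered palm radius is close to the disc radius of 8, and the fingertip candidates cluster at the far end of the thin rectangle, which is the behavior the palm-radius constraint is meant to produce for real hand silhouettes.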

Keywords

Search Window · Fiducial Marker · Color Distribution · Dynamic Background · Bare Hand



Copyright information

© IFIP International Federation for Information Processing 2006

Authors and Affiliations

  • Taejin Ha (1)
  • Woontack Woo (1)
  1. GIST U-VR Lab., Gwangju, South Korea
