VizWear: Toward Human-Centered Interaction through Wearable Vision and Visualization

  • Takeshi Kurata
  • Takashi Okuma
  • Masakatsu Kourogi
  • Takekazu Kato
  • Katsuhiko Sakaue
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2195)

Abstract

In this paper, we discuss the development of wearable systems that we collectively term VizWear. Vision plays an important role in both human and computer understanding of contextual information, and augmented reality (AR) techniques are a good way to present information intuitively. This is the basis of our research on wearable computer vision and visualization systems. Our wearable systems enable us to run various vision tasks in real time. We describe a novel approach not only to sensing the wearer’s position and direction, but also to displaying video frames overlaid with 2-D annotations related to the wearer’s view. We have also developed a method for 3-D graphical overlay that applies object recognition techniques, and the Hand Mouse, which enables the wearer to interact directly with an AR environment. Finally, we describe an efficient method of face registration using wearable active vision.
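As a rough illustration of the Hand Mouse idea mentioned above, the sketch below classifies pixel colors with a Gaussian mixture model and then localizes the densest "hand" region by mean shift. This is not the authors' implementation: the single GMM component, its parameters, the window radius, and the toy image are all invented for the example.

```python
import numpy as np

def gmm_likelihood(pixels, means, covs, weights):
    """Per-pixel hand-color likelihood under a diagonal-covariance GMM."""
    pixels = pixels.astype(float)
    total = np.zeros(pixels.shape[0])
    for mu, var, w in zip(means, covs, weights):
        diff = pixels - mu
        expo = -0.5 * np.sum(diff * diff / var, axis=1)
        norm = np.sqrt((2 * np.pi) ** len(mu) * np.prod(var))
        total += w * np.exp(expo) / norm
    return total

def mean_shift(weight_map, start, radius=8, iters=20):
    """Shift a square window toward the weighted centroid of weight_map."""
    h, w = weight_map.shape
    y, x = start
    for _ in range(iters):
        y0, y1 = max(0, y - radius), min(h, y + radius + 1)
        x0, x1 = max(0, x - radius), min(w, x + radius + 1)
        win = weight_map[y0:y1, x0:x1]
        if win.sum() == 0:
            break
        ys, xs = np.mgrid[y0:y1, x0:x1]
        ny = int(round((ys * win).sum() / win.sum()))
        nx = int(round((xs * win).sum() / win.sum()))
        if (ny, nx) == (y, x):
            break  # converged
        y, x = ny, nx
    return y, x

# Toy frame: a skin-colored blob centered at (20, 30) on a dark background.
img = np.zeros((64, 64, 3))
img[16:25, 26:35] = [200.0, 120.0, 100.0]

means = [np.array([200.0, 120.0, 100.0])]   # one skin-tone component (invented)
covs = [np.array([400.0, 400.0, 400.0])]
weights = [1.0]

lik = gmm_likelihood(img.reshape(-1, 3), means, covs, weights).reshape(64, 64)
print(mean_shift(lik, (16, 24)))  # window climbs onto the blob: (20, 30)
```

In a real system the GMM would be trained on labeled skin pixels (and possibly adapted online), and the tracker would run on each incoming camera frame, seeded with the previous frame's hand position.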

Keywords

Augmented Reality, Gaussian Mixture Model, Inertial Sensor, Panoramic Image, Wearable Computer

References

  1. S. Birchfield. Elliptical head tracking using intensity gradients and color histograms. In Proc. IEEE Comp. Soc. Conf. on Computer Vision and Pattern Recognition (CVPR98), pages 232–237, 1998.
  2. J. Farringdon and V. Oni. Visual augmented memory (VAM). In Proc. 4th Int’l Symp. on Wearable Computers (ISWC2000), pages 167–168, 2000.
  3. K. Fukunaga. Introduction to Statistical Pattern Recognition. Academic Press, Boston, 1990.
  4. T. Kato, Y. Mukaigawa, and T. Shakunaga. Cooperative distributed registration positioning and orientation in natural environment. In Proc. 4th Asian Conference on Computer Vision (ACCV2000), pages 729–735, 2000.
  5. M. Kourogi, T. Kurata, and K. Sakaue. A panorama-based method of personal positioning and orientation and its real-time applications for wearable computers. In Proc. 5th Int’l Symp. on Wearable Computers (ISWC2001), 2001.
  6. M. Kourogi, T. Kurata, K. Sakaue, and Y. Muraoka. Improvement of panorama-based annotation overlay using omnidirectional vision and inertial sensors. In Proc. 4th Int’l Symp. on Wearable Computers (ISWC2000), pages 183–184, 2000.
  7. M. Kourogi, T. Kurata, K. Sakaue, and Y. Muraoka. A panorama-based technique for annotation overlay and its real-time implementation. In Proc. Int’l Conf. on Multimedia and Expo (ICME2000), TA2.05, 2000.
  8. T. Kurata, J. Fujiki, M. Kourogi, and K. Sakaue. A fast and robust approach to recovering structure and motion from live video frames. In Proc. IEEE Comp. Soc. Conf. on Computer Vision and Pattern Recognition (CVPR2000), volume 2, pages 528–535, 2000.
  9. T. Kurata, T. Okuma, M. Kourogi, and K. Sakaue. The Hand Mouse: GMM hand color classification and mean shift tracking. In Proc. 2nd Int’l Workshop on Recognition, Analysis and Tracking of Faces and Gestures in Real-time Systems (RATFG-RTS2001) in conjunction with ICCV2001, 2001.
  10. M. Lamming and M. Flynn. “Forget-me-not”: Intimate computing in support of human memory. Technical Report EPC-1994-103, RXRC Cambridge Laboratory, 1994.
  11. S. Mann. Wearable computing: A first step toward personal imaging. Computer, 30(2):25–32, 1997.
  12. W. W. Mayol, B. Tordoff, and D. W. Murray. Wearable visual robots. In Proc. 4th Int’l Symp. on Wearable Computers (ISWC2000), pages 95–102, 2000.
  13. T. Okuma, T. Kurata, and K. Sakaue. Real-time camera parameter estimation from images for a wearable vision system. In Proc. IAPR Workshop on Machine Vision Applications (MVA2000), pages 83–86, 2000.
  14. T. Okuma, T. Kurata, and K. Sakaue. 3-D annotation of images captured from a wearer’s camera based on object recognition. In Proc. Int’l Symp. on Mixed Reality (ISMR2001), pages 184–185, 2001.
  15. T. Starner, S. Mann, B. Rhodes, J. Levine, J. Healey, D. Kirsch, R. W. Picard, and A. Pentland. Augmented reality through wearable computing. Technical Report 397, M.I.T. Media Lab, Perceptual Computing Section, 1997.
  16. M. Turk and A. Pentland. Eigenfaces for recognition. Journal of Cognitive Neuroscience, 3(1):71–86, 1991.
  17. X. Zhu, J. Yang, and A. Waibel. Segmenting hands of arbitrary color. In Proc. 4th Int’l Conf. on Automatic Face and Gesture Recognition (FG2000), pages 446–453, 2000.

Copyright information

© Springer-Verlag Berlin Heidelberg 2001

Authors and Affiliations

  • Takeshi Kurata (1)
  • Takashi Okuma (1)
  • Masakatsu Kourogi (1)
  • Takekazu Kato (1)
  • Katsuhiko Sakaue (1)

  1. National Institute of Advanced Industrial Science and Technology (AIST), Intelligent Systems Institute, Ibaraki, Japan
