Finger Tracking Methods Using EyesWeb

  • Anne-Marie Burns
  • Barbara Mazzarino
Conference paper

DOI: 10.1007/11678816_18

Part of the Lecture Notes in Computer Science book series (LNCS, volume 3881)
Cite this paper as:
Burns AM., Mazzarino B. (2006) Finger Tracking Methods Using EyesWeb. In: Gibet S., Courty N., Kamp JF. (eds) Gesture in Human-Computer Interaction and Simulation. GW 2005. Lecture Notes in Computer Science, vol 3881. Springer, Berlin, Heidelberg

Abstract

This paper compares different algorithms for tracking the position of fingers in a two-dimensional environment. Four algorithms have been implemented in EyesWeb, a platform developed by the DIST-InfoMus laboratory. The first three algorithms use projection signatures, the circular Hough transform, and geometric properties, and rely only on hand characteristics to locate the fingers. The fourth algorithm uses color markers and is employed as a reference system for the other three. All the algorithms have been evaluated using two-dimensional video images of a hand performing different finger movements on a flat surface. Results on the accuracy, precision, latency, and computer-resource usage of the different algorithms are provided. Applications of this research include human-computer interaction systems based on hand gestures, sign language recognition, hand posture recognition, and gestural control of music.
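To give a sense of the simplest of the three marker-free techniques, the following is a minimal sketch of projection-signature fingertip localization. It is an illustrative reconstruction in Python, not the authors' EyesWeb implementation: the synthetic binary mask, function names, and the "topmost pixel of the strongest column peak" heuristic are assumptions made for the example.

```python
# Sketch of projection-signature fingertip localization on a binary hand mask.
# Illustrative only; NOT the EyesWeb patch described in the paper.
import numpy as np


def projection_signatures(mask: np.ndarray):
    """Return the column-wise and row-wise sums (projection signatures) of a binary mask."""
    col_sig = mask.sum(axis=0)  # vertical projection: one value per image column
    row_sig = mask.sum(axis=1)  # horizontal projection: one value per image row
    return col_sig, row_sig


def fingertip_from_signatures(mask: np.ndarray):
    """Estimate a fingertip (x, y) as the topmost pixel of the strongest column peak."""
    col_sig, _ = projection_signatures(mask)
    x = int(np.argmax(col_sig))           # column containing the most hand pixels
    ys = np.flatnonzero(mask[:, x])       # rows occupied by the hand in that column
    y = int(ys.min()) if ys.size else -1  # topmost occupied row = fingertip candidate
    return x, y


if __name__ == "__main__":
    # Synthetic two-dimensional "hand": a palm block with one extended finger.
    mask = np.zeros((120, 160), dtype=np.uint8)
    mask[70:110, 40:120] = 1   # palm
    mask[20:70, 75:85] = 1     # extended finger
    print(fingertip_from_signatures(mask))  # prints a point near the fingertip, e.g. (75, 20)
```

In practice the binary mask would come from segmenting the hand in each video frame, and the other two marker-free methods in the paper (circular Hough transform and geometric properties) replace the peak-picking step above with different fingertip detectors.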


Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Anne-Marie Burns ¹
  • Barbara Mazzarino ²
  1. Input Devices and Music Interaction Lab, Schulich School of Music, McGill University, Montréal, Canada
  2. InfoMus Lab, DIST – University of Genova, Genova, Italy
