GART: The Gesture and Activity Recognition Toolkit

  • Kent Lyons
  • Helene Brashear
  • Tracy Westeyn
  • Jung Soo Kim
  • Thad Starner
Conference paper

DOI: 10.1007/978-3-540-73110-8_78

Part of the Lecture Notes in Computer Science book series (LNCS, volume 4552)
Cite this paper as:
Lyons K., Brashear H., Westeyn T., Kim J.S., Starner T. (2007) GART: The Gesture and Activity Recognition Toolkit. In: Jacko J.A. (eds) Human-Computer Interaction. HCI Intelligent Multimodal Interaction Environments. HCI 2007. Lecture Notes in Computer Science, vol 4552. Springer, Berlin, Heidelberg

Abstract

The Gesture and Activity Recognition Toolkit (GART) is a user interface toolkit designed to enable the development of gesture-based applications. GART provides an abstraction over machine learning algorithms suitable for modeling and recognizing different types of gestures. The toolkit also provides support for data collection and the training process. In this paper, we present GART and its machine learning abstractions. Furthermore, we detail the components of the toolkit and present two example gesture recognition applications.
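The workflow the abstract describes — collecting labeled sensor samples, training a model, then recognizing new gestures — can be illustrated with a minimal sketch. The class and method names below (`GestureLibrary`, `add_sample`, `train`, `recognize`) are hypothetical, not GART's actual API, and a simple nearest-centroid classifier stands in for the statistical models a toolkit like GART would wrap:

```python
# Hypothetical sketch of the collect/train/recognize workflow that a
# gesture toolkit abstracts. All names here are illustrative, not GART's
# real API; a nearest-centroid classifier stands in for the machine
# learning models the toolkit would provide.
import math


class GestureLibrary:
    def __init__(self):
        self.samples = {}  # label -> list of feature vectors
        self.models = {}   # label -> centroid "model" (built by train)

    def add_sample(self, label, features):
        # Data collection phase: store a labeled feature vector.
        self.samples.setdefault(label, []).append(features)

    def train(self):
        # Training phase: average each label's samples into a centroid.
        self.models = {
            label: [sum(col) / len(col) for col in zip(*vecs)]
            for label, vecs in self.samples.items()
        }

    def recognize(self, features):
        # Recognition phase: return the label whose centroid is closest
        # to the new sample (Euclidean distance).
        return min(
            self.models,
            key=lambda lbl: math.dist(self.models[lbl], features),
        )


# Collect a few labeled samples (values are made up for illustration).
lib = GestureLibrary()
lib.add_sample("circle", [1.0, 0.9, 1.1])
lib.add_sample("circle", [0.9, 1.0, 1.0])
lib.add_sample("swipe", [5.0, 4.8, 5.2])

# Train, then classify a new gesture.
lib.train()
print(lib.recognize([1.0, 1.0, 1.0]))  # closest to the "circle" centroid
```

The point of the abstraction is that an application developer only touches this collect/train/recognize surface; the underlying recognition algorithm can be swapped without changing application code.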

Keywords

Gesture recognition · User interface toolkit


Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Kent Lyons¹
  • Helene Brashear¹
  • Tracy Westeyn¹
  • Jung Soo Kim¹
  • Thad Starner¹

  1. College of Computing and GVU Center, Georgia Institute of Technology, Atlanta, GA 30332-0280, USA
