
Gesture recognition in the problem of contactless control of an unmanned aerial vehicle

  • V. E. Nahapetyan
  • V. M. Khachumov
Automation Systems in Scientific Research and Industry

Abstract

The problem of contactless control of an unmanned aerial vehicle by human gestures is considered. A system of commands is proposed for an AR.Drone 2.0 quadcopter equipped with a built-in computer, two color video cameras, the Linux operating system, and sensors for measuring the flight height, speed, and stability. An Asus Xtion Pro Live three-dimensional sensor based on triangulation and structured light is used to input data into the control system. Gesture recognition is performed by frame-by-frame processing of a video sequence consisting of depth images. The method does not require initial training, is insensitive to changes in lighting, and is invariant to the sizes of the human palm and body.
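
The abstract outlines a pipeline in which each depth frame from the Xtion Pro Live is classified into a gesture and translated into a quadcopter command. Below is a minimal Python sketch of such a loop, not taken from the paper: get_depth_frame() and classify_gesture() are hypothetical placeholders for the sensor readout and the recognition step, and the gesture labels and command map are illustrative; only the AT-command framing (UDP port 5556, AT*REF for take-off/landing, AT*PCMD for progressive motion, floats transmitted as their 32-bit integer bit pattern) follows the publicly documented AR.Drone 2.0 interface.

```python
# Minimal sketch (not the authors' code) of a gesture-to-command loop for the
# AR.Drone 2.0. get_depth_frame() and classify_gesture() are hypothetical
# placeholders for the Xtion Pro Live readout and the paper's recognition step;
# only the AT-command framing follows the public AR.Drone 2.0 developer interface.
import socket
import struct
import time

DRONE_ADDR = ("192.168.1.1", 5556)   # default drone IP and AT-command UDP port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
seq = 1                              # every AT command carries an increasing sequence number


def send_at(cmd, *args):
    """Frame one AT command (terminated by a carriage return) and send it over UDP."""
    global seq
    payload = cmd + "=" + ",".join(str(a) for a in (seq,) + args) + "\r"
    sock.sendto(payload.encode("ascii"), DRONE_ADDR)
    seq += 1


def f2i(x):
    """AT*PCMD transmits floats as the signed 32-bit integer with the same bit pattern."""
    return struct.unpack("<i", struct.pack("<f", x))[0]


def get_depth_frame():
    """Placeholder for grabbing one depth image from the Asus Xtion Pro Live (e.g. via OpenNI)."""
    return None


def classify_gesture(depth_frame):
    """Placeholder for the frame-by-frame palm-gesture recognizer; returns a command label."""
    return "hover"


def execute(gesture):
    """Translate a recognized gesture label into a drone command (labels are illustrative)."""
    if gesture == "takeoff":
        send_at("AT*REF", 290718208)               # take-off bit set
    elif gesture == "land":
        send_at("AT*REF", 290717696)               # take-off bit clear -> land
    elif gesture == "up":
        send_at("AT*PCMD", 1, 0, 0, f2i(0.2), 0)   # flag, roll, pitch, gaz (climb), yaw
    elif gesture == "forward":
        send_at("AT*PCMD", 1, 0, f2i(-0.2), 0, 0)  # negative pitch tilts the drone forward
    else:
        send_at("AT*PCMD", 1, 0, 0, 0, 0)          # hover in place


while True:
    frame = get_depth_frame()
    execute(classify_gesture(frame))
    time.sleep(0.03)                               # the drone expects a command roughly every 30 ms
```

Sending a command about every 30 ms keeps the drone's command watchdog from timing out; in the setting described in the abstract, the depth-image recognition step would take the place of classify_gesture().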

Keywords

unmanned aerial vehicle control, gesture recognition, image processing

Copyright information

© Allerton Press, Inc. 2015

Authors and Affiliations

  1. Peoples’ Friendship University, Moscow, Russia
  2. Institute for Systems Analysis, Moscow, Russia
