Gesture recognition in the problem of contactless control of an unmanned aerial vehicle

  • Automation Systems in Scientific Research and Industry
  • Optoelectronics, Instrumentation and Data Processing

Abstract

The problem of contactless control of an unmanned aerial vehicle by human gestures is considered. A system of commands is proposed for an AR.Drone 2.0 quadcopter equipped with a built-in computer, two color video cameras, the Linux operating system, and sensors for measuring the flight height, speed, and stability. An Asus Xtion Pro Live three-dimensional sensor based on triangulation and structured light is used to input data into the control system. Gesture recognition is performed by frame-by-frame processing of a video sequence of depth images. The method requires no initial training, is insensitive to changes in lighting, and is invariant to the sizes of the human palm and body.
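
The abstract describes the control pipeline only at a high level: depth frames from the 3D sensor are processed frame by frame, a hand gesture is recognized, and a command is issued to the quadcopter. The Python sketch below illustrates the overall shape of such a loop; it is not the authors' algorithm. The functions read_depth_frame and send_command are hypothetical placeholders (a synthetic frame generator and a print stub), and the nearest-surface thresholding with a centroid-based command rule is only a simple illustrative substitute for the recognition method, which the abstract does not detail.

    import numpy as np

    # Hypothetical placeholders: a real system would read frames from the
    # Asus Xtion Pro Live (e.g., through OpenNI) and send commands to the
    # AR.Drone 2.0 over its network interface.
    def read_depth_frame():
        """Return one depth image as a 2D array of millimetre distances (synthetic here)."""
        return np.random.randint(500, 4000, size=(240, 320)).astype(np.uint16)

    def send_command(cmd):
        print("command:", cmd)

    def hand_region(depth, band_mm=150):
        """Segment the closest surface (assumed to be the operator's hand) by
        keeping pixels within a fixed band behind the nearest valid point.
        Working with relative rather than absolute depth keeps the rule
        independent of hand size and operator distance."""
        nearest = depth[depth > 0].min()
        return (depth > 0) & (depth < nearest + band_mm)

    def gesture_to_command(depth):
        """Map the hand's position in the frame to a coarse quadcopter command."""
        mask = hand_region(depth)
        if mask.sum() < 200:             # too few pixels: no hand in view
            return "hover"
        ys, xs = np.nonzero(mask)
        cx = xs.mean() / depth.shape[1]  # normalised horizontal position
        cy = ys.mean() / depth.shape[0]  # normalised vertical position
        if cx < 0.35:
            return "move_left"
        if cx > 0.65:
            return "move_right"
        if cy < 0.35:
            return "ascend"
        if cy > 0.65:
            return "descend"
        return "hover"

    if __name__ == "__main__":
        for _ in range(5):               # frame-by-frame processing loop
            frame = read_depth_frame()
            send_command(gesture_to_command(frame))

In a real setup the depth frames would come from the Asus Xtion Pro Live through OpenNI [7], and the recognized gestures would be translated into the quadcopter's own command protocol; the relative-depth segmentation shown here is one simple way to approximate the scale invariance claimed in the abstract.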

References

  1. Parrot AR.Drone 2.0: Even more piloting possibilities! http://blog.parrot.com/2014/01/06/parrot-ar-drone-2-0-even-more-piloting-possibilities.

  2. AR.Drone + Kinect. http://ardrone.parrot.com/best-of-user-videos/2012/02/17/ardrone-kinect.

  3. V. Devyatkov and A. Alfimtsev, “Human-Computer Interaction in Games Using Computer Vision Techniques,” in Image Processing: Concepts, Methodologies, Tools, and Applications (IGI Global, Hershey, USA, 2011), Ch. 10, pp. 146–167.

  4. AR.Drone 2.0. http://ardrone2.parrot.com.

  5. Asus Xtion Pro Live. http://www.asus.com/Multimedia/Xtion PRO LIVE.

  6. J. Shotton, A. Fitzgibbon, M. Cook, et al., “Real-Time Human Pose Recognition in Parts from Single Depth Images,” in Proc. CVPR’11 (IEEE, 2011), pp. 1297–1304.

  7. OpenNI: The Standard Framework for 3D Sensing. http://www.openni.ru/openni-sdk/index.html.

  8. K. Palagyi, Skeletonization and its Applications. http://www.inf.u-szeged.hu/~palagyi/skel/skel.html.

  9. W. T. Freeman, D. B. Anderson, P. Beardsley, et al., “Computer Vision for Interactive Computer Graphics,” IEEE Comput. Graph. Appl. 18 (3), 42–53 (1998).

  10. A. A. Rozhentsov, K. V. Morozov, and A. A. Baev, “Modified Generalized Hough Transform for 3D Image Processing with Unknown Rotation and Scaling Parameters,” Avtometriya 49 (2), 30–41 (2013) [Optoelectron., Instrum. Data Process. 49 (2), 131–141 (2013)].

  11. S. A. Belokon, Yu. N. Zolotukhin, K. Yu. Kotov, et al., “Using the Kalman Filter in the Quadrotor Vehicle Trajectory Tracking System,” Avtometriya 49 (6), 14–24 (2013) [Optoelectron., Instrum. Data Process. 49 (6), 536–545 (2013)].

Author information

Corresponding author

Correspondence to V. E. Nahapetyan.

Additional information

Original Russian Text © V.E. Nahapetyan, V.M. Khachumov, 2015, published in Avtometriya, 2015, Vol. 51, No. 2, pp. 103–109.

About this article

Cite this article

Nahapetyan, V.E., Khachumov, V.M. Gesture recognition in the problem of contactless control of an unmanned aerial vehicle. Optoelectron. Instrum. Data Process. 51, 192–197 (2015). https://doi.org/10.3103/S8756699015020132
