Space Robot Teleoperation Based on Active Vision

  • Cheng Huang
  • Huaping Liu
  • Fuchun Sun
  • Yuming Sheng
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 279)


To increase the robustness and operational performance of a space-robot teleoperation system, this article designs a haptic feedback system based on a hand controller and data glove, combined with an active vision system built on a Pan-Tilt-Zoom (PTZ) camera and a Kinect camera, giving the operator an immersive sense of the remote scene. The system requires two cooperating operators: one handles active-vision tracking and image processing, while the other controls the robot's motion within the resulting well-perceived environment. A carefully designed human-machine interface reduces the operators' workload. To approximate a real space-teleoperation scenario, a software-configurable time delay is inserted into the control loop so that the operator can verify the system's performance under delay. Experiments show that the system completes its tasks at a high success rate both with and without delay.
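The paper does not give the delay-injection code; a minimal sketch of the idea, assuming a fixed one-way delay applied to the operator's command stream (the `DelayLine` class and its interface are illustrative, not from the paper), could look like:

```python
import collections
import time


class DelayLine:
    """Buffer that releases commands only after a fixed delay has
    elapsed, approximating a software-injected communication delay
    between the operator station and the remote robot."""

    def __init__(self, delay_s):
        self.delay_s = delay_s
        self._queue = collections.deque()  # (release_time, command)

    def send(self, command, now=None):
        """Enqueue a command, stamped with its future release time."""
        now = time.monotonic() if now is None else now
        self._queue.append((now + self.delay_s, command))

    def receive(self, now=None):
        """Return all commands whose delay has elapsed, in send order."""
        now = time.monotonic() if now is None else now
        released = []
        while self._queue and self._queue[0][0] <= now:
            released.append(self._queue.popleft()[1])
        return released
```

Passing `now` explicitly makes the buffer testable; in a live loop the robot side would simply poll `receive()` each control cycle, and a second `DelayLine` in the opposite direction would delay the video/haptic feedback symmetrically.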


Keywords: Teleoperation · Active vision system · Time delay · Perception environment



Copyright information

© Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

  • Cheng Huang 1,2
  • Huaping Liu 2
  • Fuchun Sun 2
  • Yuming Sheng 1
  1. School of Optical-Electrical Computer Engineering, University of Shanghai for Science and Technology, Shanghai, China
  2. State Key Laboratory of Intelligent Technology and Systems, Tsinghua National Laboratory for Information Science and Technology, Beijing, China
