Active Stabilization of Images Acquired on a Walking Robotic Platform

  • Xander Twombly
  • Richard Boyle
  • Silvano Colombano
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4292)


To increase the quality of scientific data collected from autonomous mobile agents such as rovers and walking robotic devices, biological methods can be mimicked to improve navigation and balance control of both the agent itself and the scientific instruments it manipulates. Drawing on the design of the neuro-vestibular control system, the EarBot controller is designed to stabilize a multi-axis camera system mounted atop a moving agent. An eight-legged robot called the SCORPION, designed to navigate and explore rough terrain considered inhospitable to wheeled rovers, is used as the testbed to analyze the EarBot’s functionality and behavior. Eventually, the EarBot will be used to control the balance of the robot itself through expanded modelling of the vestibulo-motor control loops used in postural control. This paper presents the theoretical concepts and initial controller implementations for stabilizing the camera during walking motion of the SCORPION.
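The vestibular strategy the abstract describes — fusing rotation-sensing (semicircular-canal-like) and gravity-sensing (otolith-like) signals to drive a compensatory camera rotation — can be illustrated with a minimal complementary filter. This is a hedged sketch, not the EarBot controller itself: the gains, sample rate, and single pitch axis are assumptions for illustration only.

```python
import math

# Complementary filter fusing a rate gyro (semicircular-canal analog) with a
# linear accelerometer (otolith analog) to estimate platform tilt, then
# counter-rotating one camera axis. Illustrative only; the paper's EarBot
# models the neuro-vestibular pathways in far more detail.

DT = 0.01     # control period [s], assumed
ALPHA = 0.98  # gyro weight: trust the gyro at high frequency, the accelerometer at low

def fuse_tilt(prev_tilt, gyro_rate, accel_x, accel_z):
    """Return a new pitch-tilt estimate [rad] from fused sensor readings."""
    gyro_tilt = prev_tilt + gyro_rate * DT      # integrate angular rate
    accel_tilt = math.atan2(accel_x, accel_z)   # gravity direction from accelerometer
    return ALPHA * gyro_tilt + (1.0 - ALPHA) * accel_tilt

def camera_command(tilt_estimate):
    """Counter-rotate the camera to cancel the estimated platform tilt."""
    return -tilt_estimate

# Example: platform pitching at a constant 0.5 rad/s, starting level.
tilt = 0.0
for _ in range(100):  # 1 s of simulated walking motion
    tilt = fuse_tilt(tilt, gyro_rate=0.5,
                     accel_x=math.sin(tilt) * 9.81,
                     accel_z=math.cos(tilt) * 9.81)
print(round(camera_command(tilt), 3))
```

The high-pass/low-pass split mirrors the biological division of labor: canals report fast rotations but drift, while otoliths give a drift-free but noisy gravity reference.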


Keywords: Vestibular System · Robotic Platform · Plant Dynamics · Inertial Motion · Linear Accelerometer





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Xander Twombly 1
  • Richard Boyle 1
  • Silvano Colombano 1

  1. NASA Ames Research Center, Moffett Field, USA
