Chapter

Intelligent Autonomous Systems 12

Volume 193 of the series Advances in Intelligent Systems and Computing, pp. 335-344

Visual Gyroscope for Omnidirectional Cameras

  • Nicola Carlon, Department of Information Engineering (DEI), Faculty of Engineering, University of Padua
  • Emanuele Menegatti, Department of Information Engineering (DEI), Faculty of Engineering, University of Padua


Abstract

At present, algorithms for attitude estimation with omnidirectional cameras are predominantly environment-dependent, which significantly limits their applicability. This study introduces an approach aimed at general mobile camera attitude estimation. The approach extracts features to directly estimate the three-dimensional movements of a humanoid robot from its head-mounted camera, thereby avoiding full Structure from Motion with epipolar geometry, which is not currently feasible in real time. The central idea is that movements between consecutive frames can be reliably estimated from the correspondence on the unit sphere between parallel lines in the environment and the great circles they project onto. After calibration, these projected lines are matched against optical-flow tracks, and the point at infinity of the parallel lines corresponds to the focus of expansion of the motion. Simulations and experiments validate the ability to distinguish between translation, pure rotation, and roto-translation.
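The abstract rests on the fact that, for a calibrated omnidirectional camera, image points can be lifted onto the unit sphere, where a straight 3-D line projects to a great circle and a bundle of parallel lines meets at a pair of antipodal vanishing points. The sketch below illustrates that geometry only; it is not the authors' implementation. It assumes a unified (Mei/Barreto) sphere camera model with mirror parameter xi and intrinsic matrix K obtained from a prior calibration, and all function names are hypothetical.

```python
import numpy as np

def pixel_to_sphere(u, v, K_inv, xi):
    """Lift a pixel onto the unit sphere using the unified sphere camera
    model (assumed calibration: inverse intrinsics K_inv, mirror parameter xi)."""
    m = K_inv @ np.array([u, v, 1.0])           # normalised image coordinates
    x, y = m[0], m[1]
    r2 = x * x + y * y
    # closed-form back-projection of the unified model
    eta = (xi + np.sqrt(1.0 + (1.0 - xi * xi) * r2)) / (r2 + 1.0)
    p = np.array([eta * x, eta * y, eta - xi])
    return p / np.linalg.norm(p)

def great_circle_normal(sphere_points):
    """Fit the unit normal of the great circle through the sphere points that
    belong to the image of one 3-D line (smallest right singular vector)."""
    pts = np.asarray(sphere_points)
    _, _, vt = np.linalg.svd(pts)
    n = vt[-1]
    return n / np.linalg.norm(n)

def vanishing_direction(normal_a, normal_b):
    """Two parallel 3-D lines project to great circles whose planes intersect
    along the common line direction: the antipodal vanishing points."""
    d = np.cross(normal_a, normal_b)
    return d / np.linalg.norm(d)
```

Under a pure translation, the vanishing direction of lines parallel to the motion coincides with the focus of expansion of the optical-flow field on the sphere, which is what allows the method to separate translation, pure rotation, and roto-translation.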