Abstract
We propose a sensor-fusion technique in which data sets from previous time steps are transformed into the current frame and fused with the current data sets, enabling accurate measurements such as the distance to an obstacle or the location of the service robot itself. In conventional fusion schemes, measurements depend only on the current data sets; as a result, more sensors are required to measure a given physical parameter or to improve measurement accuracy. In this approach, instead of adding sensors to the system, the temporal sequences of data sets are stored and reused to improve the measurements. The theoretical basis is illustrated by examples, and the effectiveness is demonstrated through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in an unstructured environment and a structured environment.
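The abstract's core idea can be sketched in code. The following is a minimal illustration, not the paper's actual algorithm: a measurement taken at the previous pose is re-expressed in the current robot frame using the odometry increment, then combined with the current measurement by inverse-variance fusion. All function names, the planar motion model, and the fusion rule are assumptions for illustration.

```python
import math

def transform_to_current_frame(point, dx, dy, dtheta):
    """Re-express a point observed at the previous robot pose in the
    current robot frame, given the planar odometry increment
    (dx, dy, dtheta). Hypothetical motion model for illustration."""
    # Undo the translation, then rotate into the new heading.
    x, y = point[0] - dx, point[1] - dy
    c, s = math.cos(-dtheta), math.sin(-dtheta)
    return (c * x - s * y, s * x + c * y)

def fuse(est_prev, var_prev, est_curr, var_curr):
    """Minimum-variance (inverse-variance weighted) fusion of the
    transformed previous estimate with the current one."""
    w = var_curr / (var_prev + var_curr)
    fused = w * est_prev + (1.0 - w) * est_curr
    fused_var = (var_prev * var_curr) / (var_prev + var_curr)
    return fused, fused_var
```

For example, an obstacle seen at (2.0, 0.0) before the robot drives 0.5 m forward transforms to (1.5, 0.0) in the current frame; fusing that predicted range with a fresh range reading yields an estimate whose variance is smaller than either input, which is the sense in which stored temporal data can substitute for an additional sensor.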
Additional information
This work was presented in part at the 8th International Symposium on Artificial Life and Robotics, Oita, Japan, January 24–26, 2003
Cite this article
Jin, T.S., Lee, K.S. & Lee, J.M. Space and time sensor fusion using an active camera for mobile robot navigation. Artif Life Robotics 8, 95–100 (2004). https://doi.org/10.1007/s10015-004-0295-7