Multi-robot Cooperative Object Localization

Abstract

We introduce a multi-robot/sensor cooperative object detection and tracking method based on a decentralized Bayesian approach that uses particle filters to avoid simplifying assumptions about the object motion and the sensors’ observation models. Our method is composed of a local filter and a team filter. The local filter receives a reduced-dimension representation of its teammates’ sample-based belief about the object location, i.e., the parameters of a Gaussian Mixture Model (GMM) approximating the other sensors’ particles, and mixes the particles representing its own belief about the object location with particles sampled from the received GMM. All particles are weighted by the local observation model, and the best ones are re-sampled for the next local iteration. The team filter receives GMM representations of the object location in the world frame from the sensor teammates and fuses them by performing Covariance Intersection among GMM components. When a sensor sees the object, the local estimate is used, improved by the teammates’ estimates; when the sensor does not see the object, it relies on the team estimate alone. To prevent the fusion of incorrect estimates, the disagreement between estimates is measured by a divergence measure for GMMs. Results of applying the method to real RoboCup MSL robots are presented.
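The team filter's fusion step rests on Covariance Intersection, which combines two estimates with unknown cross-correlation by weighting their information matrices. The sketch below illustrates that operation for a single pair of Gaussian components; the function name, the grid search over the weight, and the trace-minimization criterion are illustrative assumptions, not necessarily the paper's exact choices.

```python
import numpy as np

def covariance_intersection(xa, Pa, xb, Pb, n_grid=101):
    """Fuse two Gaussian estimates (mean, covariance) via Covariance
    Intersection:  P^-1 = w*Pa^-1 + (1-w)*Pb^-1, with the mean combined
    through the same weighted information vectors.  The weight w is
    chosen here by a simple grid search minimizing trace(P)."""
    Pa_inv = np.linalg.inv(Pa)
    Pb_inv = np.linalg.inv(Pb)
    best = None  # (trace, fused mean, fused covariance)
    for w in np.linspace(0.0, 1.0, n_grid):
        P_inv = w * Pa_inv + (1.0 - w) * Pb_inv
        P = np.linalg.inv(P_inv)
        if best is None or np.trace(P) < best[0]:
            # Weighted combination of the information vectors.
            x = P @ (w * Pa_inv @ xa + (1.0 - w) * Pb_inv @ xb)
            best = (np.trace(P), x, P)
    return best[1], best[2]
```

For example, fusing two components with complementary covariances diag(1, 4) and diag(4, 1) yields a fused covariance with smaller trace than either input alone, without assuming the estimates are independent. In the full team filter this operation would be applied per pair of associated GMM components.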