Sensor Measurements and Image Registration Fusion to Retrieve Variations of Satellite Attitude
Observation satellites use pushbroom sensors to capture images of the Earth. These linear cameras acquire 1-D images over time and exploit the satellite's straight motion to sweep out a region of space and build 2-D images. The stability of the imaging platform is crucial during acquisition to guarantee distortion-free images. Positioning sensors are used to control and rectify the attitude variations of the satellite, but their sampling rate is too low to provide an accurate estimate of the motion. In this paper, we describe a way to fuse star tracker measurements with image registration in order to retrieve the attitude variations of the satellite. We first introduce a simplified motion model in which the pushbroom camera rotates during the acquisition of an image. We then present the fusion model, which combines the low-frequency information of the star tracker with the high-frequency information of the images; this is embedded in a Bayesian setting. Lastly, we illustrate the performance of our algorithm on three satellite datasets.
Keywords: Image Fusion · Polynomial Model · Attitude Variation · Star Tracker · Ground Truth Image
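The core idea of the abstract can be illustrated with a toy numerical sketch: a star tracker samples the attitude accurately but slowly, while line-to-line image registration yields dense but drift-prone relative motion, and the two are blended by frequency content. The complementary filter below is a simplified stand-in for the paper's Bayesian fusion; all signals, rates, and noise levels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

fs = 1000.0                      # assumed line rate of the pushbroom sensor (Hz)
t = np.arange(0, 1.0, 1.0 / fs)  # one second of acquisition

# Synthetic "true" attitude variation: slow drift plus a micro-vibration.
true_att = 1e-3 * t + 5e-5 * np.sin(2 * np.pi * 60 * t)

# Star tracker: accurate but sampled at only 10 Hz, so the 60 Hz
# vibration is invisible to it; interpolate to the line rate.
st_t = t[::100]
st_meas = true_att[::100] + rng.normal(0.0, 1e-6, st_t.size)
st_interp = np.interp(t, st_t, st_meas)

# Image registration: dense line-to-line shifts behave like the derivative
# of the attitude with an unknown bias, so integrating them drifts.
reg_rate = np.gradient(true_att, t) + 2e-4   # biased derivative (rad/s)
reg_integrated = np.cumsum(reg_rate) / fs    # drifts over time

# First-order complementary filter (cutoff fc): low-pass the star
# tracker, high-pass the integrated registration signal.
fc = 5.0
alpha = 1.0 / (1.0 + 2 * np.pi * fc / fs)
fused = np.empty_like(t)
fused[0] = st_interp[0]
for i in range(1, t.size):
    d_reg = reg_integrated[i] - reg_integrated[i - 1]
    fused[i] = alpha * (fused[i - 1] + d_reg) + (1 - alpha) * st_interp[i]

rms = np.sqrt(np.mean((fused - true_att) ** 2))
print(f"fused RMS error: {rms:.2e} rad")
```

In this setup the fused estimate recovers both the slow drift (from the star tracker) and the micro-vibration (from registration), with a smaller RMS error than either source alone; the paper's actual formulation replaces this fixed-gain filter with a Bayesian model.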