Abstract
Although the event-based camera (EBC) is an emerging approach to space situational awareness and space imaging, its practical use for precise source analysis is still in its infancy. The nature of event-based space imaging and data collection must be explored further to develop more effective event-based space imaging systems and to advance event-based tracking with improved target measurement models. Moreover, for event measurements to be meaningful, a framework must be investigated for EBC calibration that projects events from pixel-array coordinates in the image plane to coordinates in the reference frame of a target resident space object. In this paper, the traditional techniques of conventional astronomy are reconsidered to properly utilise the EBC for space imaging and space situational awareness. We present the techniques and systems used to calibrate an EBC for reliable and accurate measurement acquisition; these techniques are vital to building event-based space imaging systems capable of real-world space situational awareness tasks. By calibrating sources detected using the EBC, the spatiotemporal characteristics of detected sources, or “event sources”, can be related to the photometric characteristics of the underlying astrophysical objects. Finally, these characteristics are analysed to establish a foundation for principled processing and observing techniques that appropriately exploit the capabilities of the EBC.
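The calibration step described above — projecting events from pixel-array coordinates to sky coordinates — can be illustrated with a minimal sketch. This is not the paper's method: it assumes a simple affine plate model followed by an inverse gnomonic (tangent-plane) projection, the most basic form of astrometric calibration; the function name, plate scale, and matrix values are illustrative only.

```python
import numpy as np

def events_to_sky(events_xy, affine, ra0_deg, dec0_deg):
    """Project event pixel coordinates to (RA, Dec) in degrees.

    events_xy : (N, 2) array of event pixel coordinates
    affine    : (2, 3) matrix [A | b] mapping pixels to tangent-plane
                offsets (xi, eta) in degrees
    ra0_deg, dec0_deg : tangent point (plate centre) in degrees
    """
    # Homogeneous pixel coordinates -> tangent-plane offsets (degrees)
    xy1 = np.hstack([events_xy, np.ones((len(events_xy), 1))])
    xi, eta = (xy1 @ affine.T).T
    # Inverse gnomonic projection from tangent plane to the sphere
    xi_r, eta_r = np.radians(xi), np.radians(eta)
    ra0, dec0 = np.radians(ra0_deg), np.radians(dec0_deg)
    den = np.cos(dec0) - eta_r * np.sin(dec0)
    ra = ra0 + np.arctan2(xi_r, den)
    dec = np.arctan((np.sin(dec0) + eta_r * np.cos(dec0)) /
                    np.sqrt(xi_r**2 + den**2))
    return np.degrees(ra), np.degrees(dec)

# Illustrative plate: 1.5 arcsec/pixel, tangent point at pixel (640, 360)
scale = 1.5 / 3600.0  # degrees per pixel
affine = np.array([[scale, 0.0, -640 * scale],
                   [0.0, scale, -360 * scale]])
events = np.array([[640.0, 360.0], [700.0, 360.0]])
ra, dec = events_to_sky(events, affine, ra0_deg=150.0, dec0_deg=-30.0)
```

In practice the affine parameters would be solved from matched star positions (e.g. against a reference catalogue), and a full treatment would include lens distortion terms; the sketch only shows where each event's pixel coordinates enter the projection.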
Funding
Open Access funding enabled and organized by CAUL and its Member Institutions.
Author information
Contributions
NOR conducted the investigation, developed the methodology and software, performed the validation, and wrote the original draft. AM collected the data used in this study and reviewed the final manuscript. SA, NT, GC, and AvS reviewed and edited the original draft, supervised the project, and contributed to the methodology.
Ethics declarations
The authors have no competing interests to declare that are relevant to the content of this article.
Additional information
Nicholas Owen Ralph received his B.Eng. (Hons.) degree in mechatronics in 2017, his M.Res. degree in engineering in 2019, and has submitted his Ph.D. thesis in neuromorphic engineering. He is currently a postdoctoral research fellow in neuromorphic space imaging at the International Center for Neuromorphic Systems, MARCS Institute for Brain, Behavior, and Development at Western Sydney University, Australia. He is focusing on the development of neuromorphic approaches to space-domain awareness, astronomy, and machine learning. E-mail: N.Ralph@westernsydney.edu.au
Alexandre Marcireau received his M.Sc. (Eng.) degree in computer science from École Centrale, France, in 2015 and his Ph.D. degree in neuromorphic engineering from Sorbonne Université, France, in 2019. He is currently a postdoctoral research fellow in neuromorphic engineering and space applications at the International Center for Neuromorphic Systems (ICNS), Western Sydney University, Australia. His research focuses on bioinspired computer vision, event vision sensors, event-based processing, and software development. E-mail: A.Marcireau@westernsydney.edu.au
Saeed Afshar completed his B.Sc. (Eng.) degree in 2014 and M.Sc. (Eng.) degree in 2016 at the University of New South Wales, Australia, and his Ph.D. degree in neuromorphic engineering at Western Sydney University, Australia, in 2020. His research focuses on the investigation of computational architectures and algorithms in the fields of neuroscience, machine learning, signal processing, and circuit design for developing novel vision, memory, and auditory sensing and processing systems with superior performance in dynamic noisy environments compared with state-of-the-art conventional computing approaches. E-mail: S.Afshar@westernsydney.edu.au
Nickolas Tothill majored in natural sciences at Cambridge University. He received his M.Sc. degree in radioastronomy from the University of Manchester, UK, in 1995 and his Ph.D. degree in astrophysics from the University of London, UK, in 1999. He has held research positions in Antarctica, Australia, Canada, Germany, USA, and UK. He is currently a senior lecturer in the School of Science, Western Sydney University, Australia, where he teaches physics and astronomy. His research interests include radio-astronomical surveys of the interstellar medium of the Milky Way galaxy and novel optical instruments for small telescopes, including polarimetry and event-based sensing. E-mail: N.Tothill@westernsydney.edu.au
André van Schaik received his M.Sc. degree in electrical engineering from the University of Twente, Enschede, the Netherlands, in 1990 and his Ph.D. degree in electrical engineering from the Swiss Federal Institute of Technology (EPFL), Lausanne, Switzerland, in 1998. He has authored over 300 publications, obtained over 35 patents, and is the founder of three start-up companies (VAST Audio, Personal Audio, and Heard Systems). E-mail: A.vanSchaik@westernsydney.edu.au
Gregory Cohen received his B.Sc. (Eng.) degree in electrical and computer engineering, his M.Sc. (Eng.) degree, and his B.Com. (Hons.) degree in finance and portfolio management from the University of Cape Town, South Africa, in 2007, 2008, and 2010, respectively. He received a joint Ph.D. degree in signal processing and neuromorphic engineering from Western Sydney University, Australia, and the University of Pierre and Marie Curie in Paris, France. Before returning to research, he worked in several start-ups and established engineering and consulting firms, including as a consulting engineer in the field of large-scale HVAC from 2007 to 2009, an electronic design engineer from 2009 to 2011, and an expert consultant for the Kaiser Economic Development Practice in 2012. He is currently an associate professor in neuromorphic systems at the International Centre for Neuromorphic Systems at Western Sydney University and a program lead for neuromorphic algorithms and space applications. E-mail: G.Cohen@westernsydney.edu.au
Rights and permissions
This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made.
The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Ralph, N.O., Marcireau, A., Afshar, S. et al. Astrometric calibration and source characterisation of the latest generation neuromorphic event-based cameras for space imaging. Astrodyn 7, 415–443 (2023). https://doi.org/10.1007/s42064-023-0168-2