
Event-based Sensing for Space Situational Awareness

Published in The Journal of the Astronautical Sciences.

Abstract

A revolutionary type of imaging device, known as a silicon retina or event-based sensor, has recently been developed and is gaining popularity in the field of artificial vision systems. These devices are inspired by the biological retina and operate in a significantly different way from traditional CCD-based imaging sensors. While a CCD produces frames of pixel intensities, an event-based sensor produces a continuous stream of events, each generated when a pixel detects a change in log light intensity. The pixels operate asynchronously and independently, producing an event-based output with high temporal resolution. There are also no fixed exposure times, allowing these devices to offer a very high dynamic range independently for each pixel. Additionally, these devices offer high-speed, low-power operation and a sparse spatio-temporal output. As a consequence, the data from these sensors must be interpreted in a significantly different way from that of traditional imaging sensors, and this paper explores the advantages this technology provides for space imaging. The applicability and capabilities of event-based sensors for space situational awareness (SSA) applications are demonstrated through telescope field trials. The trial results confirm that the devices are capable of observing resident space objects (RSOs) from LEO through to GEO orbital regimes. Significantly, observations of RSOs were made during both day-time and night-time (terminator) conditions without modification to the camera or optics. The event-based sensor's ability to image stars and satellites during day-time hours offers a dramatic capability increase for terrestrial optical sensors. This paper presents the field testing and validation of two different architectures of event-based imaging sensors. An event-based sensor's asynchronous output has an intrinsically low data rate. In addition to the low-bandwidth communication requirements, the low weight, low power consumption, and high speed of these devices make them ideally suited to meeting the demanding challenges of space-based SSA systems. Results from these experiments and the systems developed highlight the applicability of event-based sensors to ground- and space-based SSA tasks.
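The per-pixel change-detection principle described above can be illustrated with a minimal sketch. This is not the authors' implementation or any real sensor's API; `generate_events` and the `threshold` value are illustrative assumptions. It models a single pixel that emits a signed event each time the log intensity drifts a fixed threshold away from the level at the last event, which is why a static scene produces no output at all.

```python
import math

def generate_events(samples, threshold=0.2):
    """Emit (timestamp, polarity) events for a single pixel whenever the
    change in log intensity since the last event exceeds `threshold`,
    mimicking the change-detection behaviour of an event-based pixel."""
    events = []
    log_ref = math.log(samples[0])  # reference level at start
    for t, intensity in enumerate(samples[1:], start=1):
        delta = math.log(intensity) - log_ref
        # Each threshold crossing produces one event; a large step in
        # brightness produces a burst of events of the same polarity.
        while abs(delta) >= threshold:
            polarity = 1 if delta > 0 else -1
            events.append((t, polarity))
            log_ref += polarity * threshold
            delta = math.log(intensity) - log_ref
    return events

# A pixel watching constant light emits nothing, which is the source of
# the sparse output and low data rate; a brightening pixel emits only
# ON (+1) events, a dimming pixel only OFF (-1) events.
constant = generate_events([100.0] * 5)            # → []
brighten = generate_events([100.0, 100.0, 200.0])  # burst of ON events
```

Because the reference level is logarithmic, the same threshold applies across a very wide range of absolute intensities, which is one intuition for the sensor's high per-pixel dynamic range.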




Notes

  1. Chronocam’s website is available at http://www.chronocam.com/.

  2. IniLabs website is available at http://inilabs.com/.



Author information

Correspondence to Gregory Cohen.



Cite this article

Cohen, G., Afshar, S., Morreale, B. et al. Event-based Sensing for Space Situational Awareness. J Astronaut Sci 66, 125–141 (2019). https://doi.org/10.1007/s40295-018-00140-5
