All-optical spiking neurosynaptic networks with self-learning capabilities


Abstract

Software implementations of brain-inspired computing underlie many important computational tasks, from image processing to speech recognition, artificial intelligence and deep learning applications. Yet, unlike real neural tissue, traditional computing architectures physically separate the core computing functions of memory and processing, making fast, efficient and low-energy computing difficult to achieve. To overcome such limitations, an attractive alternative is to design hardware that mimics neurons and synapses. Such hardware, when connected in networks or neuromorphic systems, processes information in a way more analogous to brains. Here we present an all-optical version of such a neurosynaptic system, capable of supervised and unsupervised learning. We exploit wavelength division multiplexing techniques to implement a scalable circuit architecture for photonic neural networks, successfully demonstrating pattern recognition directly in the optical domain. Such photonic neurosynaptic networks promise access to the high speed and high bandwidth inherent to optical systems, thus enabling the direct processing of optical telecommunication and visual data.
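
To make the operating principle described above concrete, the following is a minimal numerical sketch, assuming a simple weighted-sum-and-threshold neuron model with a Hebbian-style weight increase standing in for the phase-change dynamics. The function names, parameter values and update rule are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Minimal conceptual sketch (not the authors' implementation) of the spiking
# photonic neuron described in the abstract: input pulses on separate
# wavelength channels are attenuated by phase-change "synapses", combined by
# wavelength division multiplexing, and produce an output spike when the
# summed power exceeds the firing threshold of the neuron. All names, values
# and the Hebbian-style update below are illustrative assumptions.

rng = np.random.default_rng(seed=1)

n_inputs = 4                                  # one wavelength channel per input
weights = rng.uniform(0.2, 0.8, n_inputs)     # synaptic transmission states in [0, 1]
threshold = 1.2                               # normalized firing threshold
learning_rate = 0.05                          # step size for the toy update rule

def neuron_step(inputs, weights, threshold):
    """Weighted sum of input pulse powers; spike if the sum crosses threshold."""
    total = float(np.dot(weights, inputs))
    return total > threshold, total

def hebbian_update(inputs, weights, fired, lr):
    """Toy unsupervised rule: strengthen synapses whose inputs coincide with an output spike."""
    if fired:
        weights = np.clip(weights + lr * inputs, 0.0, 1.0)
    return weights

# Present the same binary pattern repeatedly; synapses carrying the pattern strengthen.
pattern = np.array([1.0, 0.0, 1.0, 1.0])
for epoch in range(5):
    fired, total = neuron_step(pattern, weights, threshold)
    weights = hebbian_update(pattern, weights, fired, learning_rate)
    print(f"epoch {epoch}: summed power = {total:.2f}, spike = {fired}")
```

In the device itself, the weights correspond to the optical transmission of integrated phase-change cells on waveguides and the thresholding nonlinearity is realized optically rather than in software; the sketch only mirrors the weighted-sum, spike and weight-update structure.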

Figures

Fig. 1: All-optical spiking neuronal circuits.
Fig. 2: Spike generation and operation of the artificial neuron.
Fig. 3: Supervised and unsupervised learning with phase-change all-optical neurons.
Fig. 4: Scaling architecture for all-optical neural networks.
Fig. 5: Experimental realization of a single-layer spiking neural network.

Data availability

All data used in this study are available from the corresponding author upon reasonable request.

Acknowledgements

This research was supported by EPSRC via grants EP/J018694/1, EP/M015173/1 and EP/M015130/1 in the UK and the Deutsche Forschungsgemeinschaft (DFG) grant PE 1832/5-1 in Germany. W.H.P.P. acknowledges support by the European Research Council through grant 724707. We acknowledge funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement number 780848 (Fun-COMP).

Reviewer information

Nature thanks Geoffrey Burr, Ingo Fischer and the other anonymous reviewer(s) for their contribution to the peer review of this work.

Author information

Contributions

W.H.P.P., H.B. and C.D.W. conceived the experiment. J.F. fabricated the devices (with assistance from N.Y.). N.Y. performed the deposition of the GST material. J.F. implemented the measurement setup and carried out the measurements (with help from N.Y.). All authors discussed the data and wrote the manuscript together.

Corresponding author

Correspondence to W. H. P. Pernice.

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note: Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary information

Supplementary Information: This file contains Supplementary Tables 2-4 and Supplementary Figures 1-2.

About this article

Cite this article

Feldmann, J., Youngblood, N., Wright, C.D. et al. All-optical spiking neurosynaptic networks with self-learning capabilities. Nature 569, 208–214 (2019). https://doi.org/10.1038/s41586-019-1157-8
