Abstract
In this chapter we will address three questions: (1) What is reservoir computing? (2) What does it have to do with optics and electronics? (3) What are FPGAs?
Notes
- 1.
This is obviously a debatable point. But it worked for me: my real understanding of reservoir computing, how and why it works, came when I saw what was around me. So I am going to stick to this plan.
- 2.
Note that the example in Fig. 1.2 does contain several loops.
- 3.
Although I cannot guarantee the completeness of this list, I did my best to cite all experimental setups known at the moment of writing these lines.
- 4.
Photonics is quite a tricky term. I have yet to find an established and precise definition and, in my experience, various scientists interpret this concept differently. In the present work, for simplicity, I make no distinction between these three terms.
- 5.
Technically, it is not zero: the SLD is emitting light, hence a DC voltage \(V_\text{DC}\sim I_0/2\) is present. But we can ignore it, since it is filtered out by the amplifier.
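To make the origin of this offset concrete, here is a minimal sketch, assuming the standard transfer function of an intensity modulator driven around its operating point (the specific setup parameters are not reproduced here):
\[
I(\phi) = \frac{I_0}{2}\bigl(1 + \cos\phi\bigr),
\qquad
\overline{I} = \frac{I_0}{2},
\]
so the photodiode sees a mean intensity of \(I_0/2\), which translates into the constant voltage \(V_\text{DC}\); the amplifier's AC coupling removes it.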
- 6.
Note that the delay T is the total propagation time from the MZ optical output back to its electrical input, that is, around the full loop. In other words, fibre patch cords and electrical cables also contribute to the delay, but their contribution is relatively small.
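As a rough illustration of why the spool dominates, one can estimate each contribution from length and propagation speed. The lengths, group index, and velocity factor below are illustrative assumptions, not the values of the actual setup:

```python
# Rough estimate of the loop delay T for a fibre-based delay loop.
# All lengths and material constants here are illustrative only.
C = 299_792_458.0  # speed of light in vacuum, m/s

def fibre_delay(length_m: float, n_group: float = 1.468) -> float:
    """Propagation delay of an optical fibre (light travels at c / n_group)."""
    return length_m * n_group / C

def coax_delay(length_m: float, velocity_factor: float = 0.66) -> float:
    """Propagation delay of a coaxial cable (signal travels at vf * c)."""
    return length_m / (velocity_factor * C)

# Example: a 1.6 km fibre spool plus short patch cords and one cable.
T = fibre_delay(1600.0) + fibre_delay(2.0) + coax_delay(1.0)
print(f"T = {T * 1e6:.2f} microseconds")  # dominated by the long spool
```

The patch cords and cable contribute only a few nanoseconds here, three orders of magnitude below the spool, which is the point of the note.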
- 7.
It seems that the engineers ran out of inspiration when they named their devices! Do not worry if you get lost in all these acronyms, though—we will not use them past this section.
- 8.
An Artix evaluation board can be purchased for as low as $100.
- 9.
Implementation times of my designs never exceeded an hour, though.
- 10.
From a few seconds up to a minute, in my experience.
© 2018 Springer International Publishing AG, part of Springer Nature
Antonik, P. (2018). Introduction. In: Application of FPGA to Real‐Time Machine Learning. Springer Theses. Springer, Cham. https://doi.org/10.1007/978-3-319-91053-6_1
Print ISBN: 978-3-319-91052-9
Online ISBN: 978-3-319-91053-6